Explicit Planning Helps Language Models in Logical Reasoning

Hongyu Zhao, Kangrui Wang, Mo Yu, Hongyuan Mei


Abstract
Language models have been shown to perform remarkably well on a wide range of natural language processing tasks. In this paper, we propose LEAP, a novel system that uses language models to perform multi-step logical reasoning and incorporates explicit planning into the inference procedure. Explicit planning enables the system to make more informed reasoning decisions at each step by looking ahead into their future effects. Moreover, we propose a training strategy that safeguards the planning process from being led astray by spurious features. Our full system significantly outperforms other competing methods on multiple standard datasets. When using small T5 models as its core selection and deduction components, our system performs competitively compared to GPT-3 despite having only about 1B parameters (i.e., 175 times smaller than GPT-3). When using GPT-3.5, it significantly outperforms chain-of-thought prompting on the challenging PrOntoQA dataset. We have conducted extensive empirical studies to demonstrate that explicit planning plays a crucial role in the system’s performance.
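The core idea the abstract describes — choosing each reasoning step by looking ahead into its future effects rather than by its immediate appeal — can be illustrated with a toy sketch. This is a purely hypothetical numeric stand-in (the "goal" is reaching the value 10 by summing candidate steps), not the paper's actual selection/deduction components or scoring model; all names here are illustrative assumptions.

```python
def final_gap(state):
    """Toy terminal score: how close the state is to the goal value 10."""
    return -abs(state - 10)

def best_reachable(state, candidates, depth):
    """Best terminal score reachable from `state` within `depth` more steps."""
    if depth == 0:
        return final_gap(state)
    return max(best_reachable(state + c, candidates, depth - 1)
               for c in candidates)

def plan_step(state, candidates, depth=1):
    """Pick the candidate whose lookahead value is highest.

    depth=0 reduces to greedy selection (no planning); depth>=1 looks
    ahead into the consequences of each candidate before committing.
    """
    return max(candidates,
               key=lambda c: best_reachable(state + c, candidates, depth))

# Greedy picks 6 (closer to 10 immediately), but one step of lookahead
# reveals that 5 enables reaching the goal exactly on the next step.
greedy_choice = plan_step(0, [6, 5], depth=0)   # -> 6
planned_choice = plan_step(0, [6, 5], depth=1)  # -> 5
```

The divergence between the two choices is the point: a locally best deduction can dead-end, and explicit lookahead is what lets the system avoid it.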
Anthology ID:
2023.emnlp-main.688
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
11155–11173
URL:
https://aclanthology.org/2023.emnlp-main.688
DOI:
10.18653/v1/2023.emnlp-main.688
Cite (ACL):
Hongyu Zhao, Kangrui Wang, Mo Yu, and Hongyuan Mei. 2023. Explicit Planning Helps Language Models in Logical Reasoning. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 11155–11173, Singapore. Association for Computational Linguistics.
Cite (Informal):
Explicit Planning Helps Language Models in Logical Reasoning (Zhao et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.688.pdf
Video:
https://aclanthology.org/2023.emnlp-main.688.mp4