Chain-of-Thought Reasoning in Tabular Language Models

Mingyu Zheng, Hao Yang, Wenbin Jiang, Zheng Lin, Yajuan Lyu, Qiaoqiao She, Weiping Wang

Abstract
The tabular mathematical reasoning task requires models to perform multi-step operations, including information look-up and numerical calculation, on heterogeneous data from tables and questions. Existing solutions tend to extend chain-of-thought (CoT) reasoning to powerful large language models (LLMs) to promote multi-hop mathematical reasoning. However, such LLM-based approaches are not viable in scenarios involving privatized deployment or limited resources. To address this problem, we revisit small-scale tabular language models (TaLMs) and, for the first time, extend chain-of-thought reasoning to TaLMs. Specifically, we propose a novel framework, TaCo, which coordinates two TaLMs responsible for CoT generation and answer inference, respectively. In addition, our framework can be combined with an external calculator to enhance the accuracy of numerical calculation. On the TABMWP dataset, TaCo outperforms the state-of-the-art ChatGPT by 9.55% (82.60% → 92.15% in accuracy) with far fewer parameters (0.8B). The code will be released along with the paper.
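To make the coordination described in the abstract concrete, the Python sketch below illustrates one plausible two-stage pipeline: a first TaLM generates a reasoning chain, an external calculator replaces each arithmetic expression with its exact result, and a second TaLM infers the final answer. The model interfaces, prompt formats, and the `<<expr>>` calculator-marker convention are illustrative assumptions, not the paper's actual implementation.

```python
# A minimal sketch of a TaCo-style two-stage pipeline (hypothetical,
# not the authors' released code).
import re


def safe_calculate(expression: str) -> str:
    """Evaluate a plain arithmetic expression with an external calculator,
    rather than trusting the language model's own computation."""
    # Restrict input to digits, whitespace, and arithmetic operators so
    # that eval() below can only perform arithmetic.
    if not re.fullmatch(r"[\d\s\.\+\-\*/\(\)]+", expression):
        raise ValueError(f"unsupported expression: {expression!r}")
    return str(eval(expression))


def taco_pipeline(table: str, question: str, cot_model, answer_model) -> str:
    # Stage 1: one TaLM generates a chain of thought from the linearized
    # table and the question. (Prompt format is an assumption.)
    cot = cot_model.generate(f"table: {table} question: {question}")

    # External calculator: replace each marked arithmetic expression in the
    # CoT (a hypothetical <<expr>> convention) with its exact value.
    cot = re.sub(r"<<(.+?)>>", lambda m: safe_calculate(m.group(1)), cot)

    # Stage 2: a second TaLM infers the final answer conditioned on the
    # table, the question, and the calculator-corrected reasoning chain.
    return answer_model.generate(
        f"table: {table} question: {question} reasoning: {cot}"
    )
```

Any object exposing a `generate(prompt) -> str` method can stand in for the two TaLMs here; the point of the split is that CoT generation and answer inference are handled by separate models, while exact arithmetic is delegated to the calculator.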
Anthology ID:
2023.findings-emnlp.734
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11006–11019
URL:
https://aclanthology.org/2023.findings-emnlp.734
DOI:
10.18653/v1/2023.findings-emnlp.734
Cite (ACL):
Mingyu Zheng, Hao Yang, Wenbin Jiang, Zheng Lin, Yajuan Lyu, Qiaoqiao She, and Weiping Wang. 2023. Chain-of-Thought Reasoning in Tabular Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11006–11019, Singapore. Association for Computational Linguistics.
Cite (Informal):
Chain-of-Thought Reasoning in Tabular Language Models (Zheng et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.734.pdf