On the Representational Capacity of Recurrent Neural Language Models

Franz Nowak, Anej Svete, Li Du, Ryan Cotterell


Abstract
This work investigates the computational expressivity of language models (LMs) based on recurrent neural networks (RNNs). Siegelmann and Sontag (1992) famously showed that RNNs with rational weights and hidden states and unbounded computation time are Turing complete. However, LMs define weightings over strings in addition to just (unweighted) language membership, and the analysis of the computational power of RNN LMs (RLMs) should reflect this. We extend the Turing completeness result to the probabilistic case, showing how a rationally weighted RLM with unbounded computation time can simulate any deterministic probabilistic Turing machine (PTM) with rationally weighted transitions. Since, in practice, RLMs operate in real time, processing a symbol at every time step, we treat the above result as an upper bound on the expressivity of RLMs. We also provide a lower bound by showing that, under the restriction to real-time computation, such models can simulate deterministic real-time rational PTMs.
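For context, the "weighting over strings" that an RLM defines can be read as the standard autoregressive factorization of a string's probability, with an explicit end-of-string (EOS) term; this is background notation rather than a formula quoted from the paper:

p(\boldsymbol{w}) \;=\; p(\mathrm{EOS} \mid \boldsymbol{w}) \, \prod_{t=1}^{|\boldsymbol{w}|} p(w_t \mid \boldsymbol{w}_{<t})

Each conditional next-symbol distribution is computed from the RNN's hidden state, which is where the rational-weight constructions that simulate (real-time) probabilistic Turing machines enter the analysis.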
Anthology ID:
2023.emnlp-main.434
Original:
2023.emnlp-main.434v1
Version 2:
2023.emnlp-main.434v2
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7011–7034
URL:
https://aclanthology.org/2023.emnlp-main.434
DOI:
10.18653/v1/2023.emnlp-main.434
Cite (ACL):
Franz Nowak, Anej Svete, Li Du, and Ryan Cotterell. 2023. On the Representational Capacity of Recurrent Neural Language Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 7011–7034, Singapore. Association for Computational Linguistics.
Cite (Informal):
On the Representational Capacity of Recurrent Neural Language Models (Nowak et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.434.pdf
Video:
https://aclanthology.org/2023.emnlp-main.434.mp4