Human Inspired Progressive Alignment and Comparative Learning for Grounded Word Acquisition

Yuwei Bao, Barrett Lattimer, Joyce Chai


Abstract
Human language acquisition is an efficient, supervised, and continual process. In this work, we took inspiration from how human babies acquire their first language and developed a computational process for word acquisition through comparative learning. Motivated by cognitive findings, we generated a small dataset that enables computational models to compare the similarities and differences of various attributes, and to learn to filter out and extract the common information for each shared linguistic label. We frame word acquisition not only as an information-filtration process but also as a representation-symbol mapping. This procedure involves neither a fixed vocabulary size nor a discriminative objective, allowing models to continually learn more concepts efficiently. Our results in controlled experiments have shown the potential of this approach for efficient continual learning of grounded words.
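The abstract describes comparative learning only at a high level, so the sketch below is a rough, hypothetical illustration of the core idea, not the authors' implementation: examples sharing a linguistic label are compared, attribute dimensions that vary across them are filtered out, and the surviving common information is stored as the word's grounded representation in an open lexicon. The class name `ComparativeWordLearner`, the `agreement_threshold` parameter, and the variance-based filtering heuristic are all assumptions introduced for illustration.

```python
import numpy as np

class ComparativeWordLearner:
    """Hypothetical sketch of comparative word acquisition: compare the
    attribute vectors of examples sharing a label, keep only the dimensions
    they agree on, and map that filtered representation to the word symbol."""

    def __init__(self, agreement_threshold: float = 0.1):
        self.lexicon: dict[str, np.ndarray] = {}  # word -> grounded prototype
        self.threshold = agreement_threshold      # max variance to count as "shared"

    def acquire(self, word: str, examples: list[np.ndarray]) -> None:
        """Filter out attribute dimensions that differ across the examples,
        extracting the common information for this shared linguistic label."""
        feats = np.stack(examples)                    # (n_examples, n_attributes)
        shared = feats.var(axis=0) < self.threshold   # dims where examples agree
        prototype = feats.mean(axis=0) * shared       # zero out non-shared dims
        self.lexicon[word] = prototype                # representation-symbol mapping

    def ground(self, observation: np.ndarray) -> str:
        """Retrieve the closest acquired word for a new observation."""
        return min(self.lexicon,
                   key=lambda w: np.linalg.norm(self.lexicon[w] - observation))
```

Because the lexicon is an open dictionary rather than a fixed-size output layer, `acquire` can be called with a new word at any time, which mirrors the abstract's claim of continual learning without a fixed vocabulary size or a discriminative objective.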
Anthology ID:
2023.acl-long.863
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
15475–15493
URL:
https://aclanthology.org/2023.acl-long.863
DOI:
10.18653/v1/2023.acl-long.863
Bibkey:
Cite (ACL):
Yuwei Bao, Barrett Lattimer, and Joyce Chai. 2023. Human Inspired Progressive Alignment and Comparative Learning for Grounded Word Acquisition. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 15475–15493, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Human Inspired Progressive Alignment and Comparative Learning for Grounded Word Acquisition (Bao et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.863.pdf
Video:
https://aclanthology.org/2023.acl-long.863.mp4