Multi-Dialectal Representation Learning of Sinitic Phonology

Zhibai Jia


Abstract
Machine learning techniques have shown their competence for representing and reasoning in symbolic systems such as language and phonology. In Sinitic Historical Phonology, notable tasks that could benefit from machine learning include the comparison of dialects and the reconstruction of proto-language systems. Motivated by this, this paper presents an approach for obtaining multi-dialectal representations of Sinitic syllables: a knowledge graph is constructed from structured phonological data, and the BoxE technique from knowledge base learning is applied to it. We applied unsupervised clustering techniques to the resulting representations and observed that they capture phonemic contrasts from the input dialects. Furthermore, we trained classifiers to infer unobserved Middle Chinese labels, demonstrating the representations’ potential for indicating archaic, proto-language features. The representations can be used to complete fragmented Sinitic phonological knowledge bases, to estimate divergences between characters, or to aid the exploration and reconstruction of archaic features.
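
The following is a minimal sketch of the kind of pipeline the abstract describes, assuming PyKEEN's BoxE implementation and scikit-learn for clustering. It is not the paper's released code: the characters, dialects, relation names, triples, and hyperparameters below are illustrative placeholders only.

# Sketch: embed a toy dialect-reading knowledge graph with BoxE, then cluster
# the character embeddings. Data and hyperparameters are hypothetical.
import numpy as np
from pykeen.triples import TriplesFactory
from pykeen.pipeline import pipeline
from sklearn.cluster import KMeans

# Toy knowledge graph: each triple links a character to one phonological
# value observed in one dialect (illustrative data, not from the paper).
triples = np.array([
    ["char:東", "initial_in:Beijing", "t"],
    ["char:東", "rime_in:Beijing", "ʊŋ"],
    ["char:東", "tone_in:Beijing", "yinping"],
    ["char:東", "initial_in:Guangzhou", "t"],
    ["char:同", "initial_in:Beijing", "tʰ"],
    ["char:同", "rime_in:Beijing", "ʊŋ"],
    ["char:同", "tone_in:Beijing", "yangping"],
    ["char:同", "initial_in:Guangzhou", "tʰ"],
    ["char:冬", "initial_in:Beijing", "t"],
    ["char:冬", "rime_in:Beijing", "ʊŋ"],
    ["char:冬", "tone_in:Beijing", "yinping"],
], dtype=str)

tf = TriplesFactory.from_labeled_triples(triples)
training, testing = tf.split([0.8, 0.2])

# Train a BoxE knowledge-graph embedding model (placeholder hyperparameters).
result = pipeline(
    training=training,
    testing=testing,
    model="BoxE",
    model_kwargs=dict(embedding_dim=32),
    training_kwargs=dict(num_epochs=50),
    random_seed=0,
)

# Pull out the entity embeddings (base point representation) and cluster the
# character entities, mirroring the unsupervised analysis in the abstract.
emb = result.model.entity_representations[0](indices=None).detach().cpu().numpy()
chars = sorted(e for e in tf.entity_to_id if e.startswith("char:"))
char_ids = [tf.entity_to_id[c] for c in chars]
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(emb[char_ids])
print(dict(zip(chars, labels)))

The same character embeddings could, in principle, also feed a supervised classifier for Middle Chinese labels, as in the paper's second experiment.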
Anthology ID:
2023.acl-srw.2
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Vishakh Padmakumar, Gisela Vallejo, Yao Fu
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
19–29
URL:
https://aclanthology.org/2023.acl-srw.2
DOI:
10.18653/v1/2023.acl-srw.2
Cite (ACL):
Zhibai Jia. 2023. Multi-Dialectal Representation Learning of Sinitic Phonology. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop), pages 19–29, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Multi-Dialectal Representation Learning of Sinitic Phonology (Jia, ACL 2023)
PDF:
https://aclanthology.org/2023.acl-srw.2.pdf
Video:
https://aclanthology.org/2023.acl-srw.2.mp4