Abed at KSAA-RD Shared Task: Enhancing Arabic Word Embedding with Modified BERT Multilingual

Abdelrahim Qaddoumi


Abstract
This paper presents a novel approach to the Arabic Reverse Dictionary Shared Task at WANLP 2023, leveraging the multilingual BERT model and introducing two modifications: augmentation and a multi-head attention mechanism. The proposed method aims to enhance the model's ability to understand Arabic definitions and generate word embeddings from them, in both monolingual and cross-lingual settings. It achieved strong results compared to the benchmark and other models in Subtasks 1 and 2 of the shared task.
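The abstract describes pooling multilingual BERT representations of a definition through multi-head attention to produce a word embedding. The PyTorch sketch below illustrates one plausible form of such a head: a learned query attends over the token states of a definition (e.g., from `bert-base-multilingual-cased`) and the pooled vector is projected into the target embedding space. All class names, dimensions, and design choices here are illustrative assumptions, not the author's actual implementation.

```python
import torch
import torch.nn as nn

class DefinitionEncoder(nn.Module):
    """Hypothetical reverse-dictionary head (assumption, not the paper's code):
    pools BERT token states into one vector via multi-head attention, then
    projects it into the target word-embedding space."""

    def __init__(self, hidden=768, heads=8, target_dim=300):
        super().__init__()
        # Learnable query vector that attends over the definition's tokens.
        self.query = nn.Parameter(torch.randn(1, 1, hidden))
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.proj = nn.Linear(hidden, target_dim)

    def forward(self, token_states):            # (batch, seq_len, hidden)
        q = self.query.expand(token_states.size(0), -1, -1)
        pooled, _ = self.attn(q, token_states, token_states)
        return self.proj(pooled.squeeze(1))     # (batch, target_dim)

# Random tensors stand in for real BERT encoder outputs.
enc = DefinitionEncoder()
states = torch.randn(4, 16, 768)                # 4 definitions, 16 tokens each
emb = enc(states)
print(emb.shape)                                # torch.Size([4, 300])
```

In this sketch, a single learned query performs attention pooling instead of using the `[CLS]` token; the shared task's evaluation compares the projected vector against gold target embeddings (e.g., by cosine similarity or MSE).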
Anthology ID:
2023.arabicnlp-1.42
Volume:
Proceedings of ArabicNLP 2023
Month:
December
Year:
2023
Address:
Singapore (Hybrid)
Editors:
Hassan Sawaf, Samhaa El-Beltagy, Wajdi Zaghouani, Walid Magdy, Ahmed Abdelali, Nadi Tomeh, Ibrahim Abu Farha, Nizar Habash, Salam Khalifa, Amr Keleg, Hatem Haddad, Imed Zitouni, Khalil Mrini, Rawan Almatham
Venues:
ArabicNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
472–476
URL:
https://aclanthology.org/2023.arabicnlp-1.42
DOI:
10.18653/v1/2023.arabicnlp-1.42
Cite (ACL):
Abdelrahim Qaddoumi. 2023. Abed at KSAA-RD Shared Task: Enhancing Arabic Word Embedding with Modified BERT Multilingual. In Proceedings of ArabicNLP 2023, pages 472–476, Singapore (Hybrid). Association for Computational Linguistics.
Cite (Informal):
Abed at KSAA-RD Shared Task: Enhancing Arabic Word Embedding with Modified BERT Multilingual (Qaddoumi, ArabicNLP-WS 2023)
PDF:
https://aclanthology.org/2023.arabicnlp-1.42.pdf