Che Wanxiang


2023

FinBART: A Pre-trained Seq2seq Language Model for Chinese Financial Tasks
Dong Hongyuan | Che Wanxiang | He Xiaoyu | Zheng Guidong | Wen Junjie
Proceedings of the 22nd Chinese National Conference on Computational Linguistics

“Pretrained language models are making a more profound impact on our lives than ever before. They exhibit promising performance on a variety of general domain Natural Language Processing (NLP) tasks. However, little work focuses on Chinese financial NLP tasks, which comprise a significant portion of social communication. To this end, we propose FinBART, a pretrained seq2seq language model for Chinese financial communication tasks. Experiments show that FinBART outperforms baseline models on a series of downstream tasks including text classification, sequence labeling and text generation. We further pretrain the model on customer service corpora, and results show that our model outperforms baseline models and achieves promising performance on various real-world customer service text mining tasks.”