|
|
Chinese-Vietnamese Cross-Language Event Retrieval Incorporating Event Knowledge |
HUANG Yuxin1,2, DENG Tongjie1,2, YU Zhengtao1,2, XIAN Yantuan1,2 |
1. Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650504; 2. Key Laboratory of Artificial Intelligence in Yunnan Province, Kunming University of Science and Technology, Kunming 650504 |
|
|
Abstract Chinese-Vietnamese cross-language event retrieval aims to retrieve Vietnamese documents describing the same event as an input Chinese query. Existing cross-language retrieval models align Chinese and Vietnamese poorly in this low-resource setting, and simple semantic matching struggles to capture the event semantics of complex queries. To address these issues, a Chinese-Vietnamese cross-language event retrieval method incorporating event knowledge is proposed. A Chinese-Vietnamese cross-language event pre-training module is built for continued pre-training, improving the model's representations of the low-resource Chinese and Vietnamese languages. Contrastive learning is then used to discriminate between the model's predictions at masked event positions and the true event values, encouraging the model to better understand and capture event knowledge features. Experiments on cross-language event retrieval and cross-language question answering demonstrate the performance gains of the proposed method.
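The contrastive event-knowledge objective described in the abstract can be made concrete with a short sketch. This is a minimal reconstruction under stated assumptions, not the paper's implementation: the function name, the in-batch InfoNCE-style formulation, and the temperature value are illustrative choices; the abstract only specifies that predictions at masked event positions are discriminated from the true event values via contrastive learning.

import torch
import torch.nn.functional as F

def event_mask_contrastive_loss(pred_states, true_states, temperature=0.1):
    # pred_states: (n, d) encoder hidden states at masked event positions
    # true_states: (n, d) encoder hidden states of the ground-truth event tokens
    # Each prediction is pulled toward its own true event state (diagonal of
    # the similarity matrix) and pushed away from the other in-batch event
    # states (off-diagonal entries).
    pred = F.normalize(pred_states, dim=-1)
    true = F.normalize(true_states, dim=-1)
    logits = pred @ true.t() / temperature              # (n, n) cosine similarities
    targets = torch.arange(pred.size(0), device=pred.device)
    return F.cross_entropy(logits, targets)

# Illustrative usage with random features standing in for encoder outputs
# (e.g., from a multilingual encoder being continually pre-trained):
pred = torch.randn(8, 768, requires_grad=True)
true = torch.randn(8, 768)
loss = event_mask_contrastive_loss(pred, true)
loss.backward()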
|
Received: 05 September 2023
|
|
Fund: National Natural Science Foundation of China (No. U21B2027, 61972186, 6226028), Science and Technology Major Special Projects of Yunnan Province (No. 202302AD080003, 202103AA080015, 202202202AD080003), Key Basic Research Projects of Yunnan Province (No. 202203AP140100), Fundamental Research Projects of Yunnan Province (No. 202301AS070047, 202301AT070471), Kunming University of Science and Technology "Double First-Class" Joint Special Project (No. 202201BE070001-021)
Corresponding Author:
YU Zhengtao, Ph.D., professor. His research interests include natural language processing, information retrieval and machine translation.
|
About authors: HUANG Yuxin, Ph.D., associate professor. His research interests include natural language processing and text summarization. DENG Tongjie, master's student. His research interests include natural language processing and information retrieval. XIAN Yantuan, Ph.D., associate professor. His research interests include natural language processing, information retrieval and machine translation.
|
|
|
|
|
|