Pattern Recognition and Artificial Intelligence  2023, Vol. 36 Issue (5): 471-482    DOI: 10.16451/j.cnki.issn1003-6059.202305007
Researches and Applications
Dual View Contrastive Learning Networks for Multi-hop Reading Comprehension
CHEN Jinwen1,2, CHEN Yuzhong1,2
1. College of Computer and Data Science, Fuzhou University, Fuzhou 350108;
2. Fujian Key Laboratory of Network Computing and Intelligent Information Processing, Fuzhou University, Fuzhou 350108

Abstract  Multi-hop reading comprehension is an important task in machine reading comprehension. It aims to construct a multi-hop reasoning chain across multiple documents and answer questions that require combining evidence from those documents. Graph neural networks are widely applied to multi-hop reading comprehension tasks, but two shortcomings remain: insufficient acquisition of contextual mutual information along the multi-document reasoning chain, and noise introduced when candidate answers are mistakenly judged correct merely because they are similar to the question. To address these issues, dual view contrastive learning networks (DVCGN) for multi-hop reading comprehension are proposed. Firstly, a heterogeneous graph-based node-level contrastive learning method is employed. Positive and negative sample pairs are generated at the node level, and both node-level and feature-level corruptions are applied to the heterogeneous graph to construct dual views. The two corrupted views are updated iteratively through a graph attention network. DVCGN maximizes the similarity of node representations across the dual views to learn node representations, obtain rich contextual semantic information, and accurately model each node and its relationships with the remaining nodes in the reasoning chain. Consequently, multi-granularity contextual information is effectively distinguished from interference information, and richer mutual information is constructed for the reasoning chain. Furthermore, a question-guided graph node pruning method is proposed. It leverages question information to filter answer entity nodes, narrowing the range of candidate answers and mitigating noise caused by similar expressions in evidence sentences. Finally, experimental results on the HotpotQA dataset demonstrate the superior performance of DVCGN.
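The dual-view construction and node-level contrastive objective described in the abstract can be sketched as follows. This is an illustrative NumPy sketch under stated assumptions, not the authors' implementation: a simple mean-aggregation message-passing layer stands in for the graph attention network encoder, and all function names (`corrupt`, `encode`, `nce_loss`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def corrupt(features, adj, feat_drop=0.3, node_drop=0.2):
    """Create one corrupted view of the graph.

    Feature-level corruption: zero out a random subset of feature dimensions.
    Node-level corruption: isolate a random subset of nodes by removing
    their edges. Two independent calls yield the two views.
    """
    f = features.copy()
    col_mask = rng.random(f.shape[1]) < feat_drop
    f[:, col_mask] = 0.0                      # feature-level corruption
    a = adj.copy()
    dropped = rng.random(adj.shape[0]) < node_drop
    a[dropped, :] = 0.0                       # node-level corruption
    a[:, dropped] = 0.0
    return f, a

def encode(features, adj, weight):
    """One mean-aggregation layer (stand-in for the GAT encoder)."""
    deg = adj.sum(axis=1, keepdims=True) + 1e-8
    h = (adj @ features) / deg + features     # aggregate neighbours + self
    return np.tanh(h @ weight)

def nce_loss(z1, z2, tau=0.5):
    """InfoNCE-style loss: the same node in the two views is the positive
    pair; all other cross-view nodes serve as negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = np.exp(z1 @ z2.T / tau)             # pairwise cross-view similarity
    return float(np.mean(-np.log(np.diag(sim) / sim.sum(axis=1))))

# Toy example: 6 nodes, 8-dim features, a random graph.
X = rng.random((6, 8))
A = (rng.random((6, 6)) < 0.4).astype(float)
np.fill_diagonal(A, 0.0)
W = rng.random((8, 4))

f1, a1 = corrupt(X, A)
f2, a2 = corrupt(X, A)
loss = nce_loss(encode(f1, a1, W), encode(f2, a2, W))
```

Minimizing this loss pulls the two corrupted representations of the same node together while pushing apart representations of different nodes, which is the mechanism the abstract credits with separating multi-granularity context from interference.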
Key words: Machine Reading Comprehension, Multi-hop Reading Comprehension, Heterogeneous Graph, Graph Attention Networks, Contrastive Learning
Received: 20 January 2023     
ZTFLH: TP391  
Fund:National Natural Science Foundation of China(No.61672158), Natural Science Foundation of Fujian Province(No.2020J01494), Industry University Cooperation Project of Fujian Province(No.2021H6022)
Corresponding Author: CHEN Yuzhong, Ph.D., professor. His research interests include computational intelligence, natural language processing and data mining.
About Author: CHEN Jinwen, master student. Her research interests include natural language processing and machine reading comprehension.
Cite this article:   
CHEN Jinwen, CHEN Yuzhong. Dual View Contrastive Learning Networks for Multi-hop Reading Comprehension[J]. Pattern Recognition and Artificial Intelligence, 2023, 36(5): 471-482.
URL:  
http://manu46.magtech.com.cn/Jweb_prai/EN/10.16451/j.cnki.issn1003-6059.202305007      OR     http://manu46.magtech.com.cn/Jweb_prai/EN/Y2023/V36/I5/471