Pattern Recognition and Artificial Intelligence
  2018, Vol. 31 Issue (12): 1127-1133    DOI: 10.16451/j.cnki.issn1003-6059.201812008
Researches and Applications
Attention Mechanism Based Question Entity Linking
REN Chaogan1,2, YANG Yan1,2, JIA Zhen1,2, TANG Huijia1, YU Xiuying1
1.School of Information Science and Technology, Southwest Jiaotong University, Chengdu 611756
2.Key Laboratory of Cloud Computing and Intelligent Technology of Sichuan Province, Southwest Jiaotong University, Chengdu 611756

Abstract  Question entity linking typically requires extensive data processing and feature selection, which easily introduces cumulative errors and degrades linking performance. To address these issues, an attention mechanism based encoder-decoder model for entity linking (AMEDEL) is proposed. In this model, a long short-term memory (LSTM) network encodes the question, and the decoder then generates entity mentions and disambiguation information through an attention mechanism. Finally, these outputs are linked to entities in a knowledge base. Experiments are conducted on a dataset of questions and entities about products in the automotive field. The results show that the proposed model achieves satisfactory performance using only scarce contextual information.
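The decoder step the abstract describes weights the encoded question states by their relevance to the current decoding state. A minimal sketch of this attention step in NumPy follows; it is a generic dot-product attention for illustration, not the authors' exact AMEDEL formulation, and all names (`attention_context`, the toy dimensions) are assumptions:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_context(encoder_states, decoder_state):
    """Dot-product attention: score each encoder state against the
    current decoder state, normalize the scores into weights, and
    return the weighted average (context vector) plus the weights."""
    scores = encoder_states @ decoder_state   # (T,) one score per token
    weights = softmax(scores)                 # (T,) non-negative, sums to 1
    context = weights @ encoder_states        # (d,) attended summary
    return context, weights

# Toy example: 5 encoded question tokens with hidden size 8.
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))   # stand-in for LSTM encoder outputs
s = rng.standard_normal(8)        # stand-in for a decoder hidden state
ctx, w = attention_context(H, s)
```

In the full model, `ctx` would be fed into the decoder to help generate the entity mention and its disambiguation information at each step.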
Key words: Question Entity Linking; Attention Mechanism; Encoder-Decoder; Long Short-Term Memory Network; Generative Model
Received: 21 September 2018     
ZTFLH: TP 391  
Fund:Supported by National Natural Science Foundation of China(No.61572407), National Key Technology Research and Development Program (2015BAH19F02)
About author: REN Chaogan, master student. His research interests include natural language processing and deep learning.
YANG Yan (Corresponding author), Ph.D., professor. Her research interests include artificial intelligence, big data analysis and mining, and ensemble learning.
JIA Zhen, Ph.D., lecturer. Her research interests include knowledge graph and natural language processing.
TANG Huijia, master, professor. Her research interests include cloud service system and technology.
YU Xiuying, Ph.D., lecturer. Her research interests include information security.
Cite this article:   
REN Chaogan, YANG Yan, JIA Zhen, et al. Attention Mechanism Based Question Entity Linking[J]. Pattern Recognition and Artificial Intelligence, 2018, 31(12): 1127-1133.
URL:  
http://manu46.magtech.com.cn/Jweb_prai/EN/10.16451/j.cnki.issn1003-6059.201812008      OR     http://manu46.magtech.com.cn/Jweb_prai/EN/Y2018/V31/I12/1127
Copyright © 2010 Editorial Office of Pattern Recognition and Artificial Intelligence
Address: No.350 Shushanhu Road, Hefei, Anhui Province, P.R. China  Tel: 0551-65591176  Fax: 0551-65591176  Email: bjb@iim.ac.cn