Aspect Level Sentiment Analysis Based on Recurrent Neural Network with Auxiliary Memory

LIAO Xiangwen1,2,3, LIN Wei1,2,3, WU Yunbing1,2,3, WEI Jingjing4, CHEN Guolong4

1. College of Mathematics and Computer Science, Fuzhou University, Fuzhou 350116; 2. Fujian Provincial Key Laboratory of Network Computing and Intelligent Information Processing, Fuzhou University, Fuzhou 350116; 3. Digital Fujian Institute of Financial Big Data, Fuzhou 350116; 4. College of Electronics and Information Science, Fujian Jiangxia University, Fuzhou 350108

Abstract: Existing aspect-level sentiment analysis models extract sentence features from term information alone and cannot exploit aspect and term information simultaneously, which limits their performance. To address this problem, an aspect-level sentiment analysis method based on a recurrent neural network with auxiliary memory is proposed. A deep bidirectional long short-term memory (DBLSTM) network and the positional information of words are exploited to build a position-weighted memory. An attention mechanism is combined with the aspect terms to build an aspect memory. The position-weighted memory and the aspect memory are then fed into a multi-layer gated recurrent unit to obtain the sentiment features of each aspect. Finally, the sentiment polarity is identified by a normalization function. Experimental results show that the proposed method achieves better results on three public datasets, demonstrating its effectiveness.
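The pipeline described in the abstract can be illustrated with a simplified sketch. The snippet below is an illustrative approximation, not the authors' implementation: random vectors stand in for DBLSTM hidden states, the relative-distance position-weighting formula is an assumption, and a single attention-pooling step replaces the paper's multi-layer GRU; the dimensions and the projection `W_out` are likewise illustrative choices.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax (the 'normalized function' of the abstract)."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
n, d, polarities = 5, 4, 3          # toy sizes: 5 words, 4-dim states, 3 classes
H = rng.normal(size=(n, d))         # stand-in for DBLSTM hidden states
aspect = rng.normal(size=(d,))      # stand-in for the aspect-term vector

# Position-weighted memory: down-weight words far from the aspect term (index t).
t = 2
pos_w = 1.0 - np.abs(np.arange(n) - t) / n   # assumed weighting scheme
M = H * pos_w[:, None]

# Aspect attention: score each memory slot against the aspect vector and pool.
alpha = softmax(M @ aspect)          # attention weights over the sentence
feature = alpha @ M                  # attended sentence representation

# Classification: project onto the polarity classes and normalize.
W_out = rng.normal(size=(d, polarities))
probs = softmax(feature @ W_out)
print(probs)
```

In the full model, the position-weighted memory and the aspect memory would instead update the hidden state of a multi-layer GRU over several recurrent steps before classification.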
Received: 24 June 2019

Fund: Supported by National Natural Science Foundation of China (No. 61772135, U1605251), Natural Science Foundation of Fujian Province (No. 2017J01755), Open Project of Key Laboratory of Network Data Science & Technology of Chinese Academy of Sciences (No. CASNDST201708, CASNDST201606), Open Project of National Laboratory of Pattern Recognition in China (No. 201900041), CERNET Innovation Project (No. NGII20160501), Director's Project Fund of Key Laboratory of Trustworthy Distributed Computing and Service of Ministry of Education (No. 2017KF01)
Corresponding Author:
LIAO Xiangwen, Ph.D., professor. His research interests include opinion mining and sentiment analysis.

About the authors:
LIN Wei, master student. His research interests include opinion mining and sentiment analysis.
WU Yunbing, master, associate professor. His research interests include knowledge representation and knowledge discovery.
WEI Jingjing, Ph.D., lecturer. Her research interests include opinion mining.
[1] PANG B, LEE L. Opinion Mining and Sentiment Analysis. Foundations and Trends in Information Retrieval, 2008, 2(1/2): 1-135.
[2] PONTIKI M, GALANIS D, PAVLOPOULOS J, et al. SemEval-2014 Task 4: Aspect Based Sentiment Analysis // Proc of the 8th International Workshop on Semantic Evaluation. Berlin, Germany: Springer, 2014: 27-35.
[3] WAGNER J, ARORA P, CORTES S, et al. DCU: Aspect-Based Polarity Classification for SemEval Task 4 // Proc of the 8th International Workshop on Semantic Evaluation. Berlin, Germany: Springer, 2014: 223-229.
[4] KIRITCHENKO S, ZHU X D, CHERRY C, et al. NRC-Canada-2014: Detecting Aspects and Sentiment in Customer Reviews // Proc of the 8th International Workshop on Semantic Evaluation. Berlin, Germany: Springer, 2014: 437-442.
[5] DONG L, WEI F R, TAN C Q, et al. Adaptive Recursive Neural Network for Target-Dependent Twitter Sentiment Classification // Proc of the 52nd Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: ACL, 2014, II: 49-54.
[6] TANG D Y, QIN B, FENG X C, et al. Effective LSTMs for Target-Dependent Sentiment Classification // Proc of the 26th International Conference on Computational Linguistics. Berlin, Germany: Springer, 2016: 3298-3307.
[7] ZHANG M S, ZHANG Y, VO D T. Gated Neural Networks for Targeted Sentiment Analysis // Proc of the 30th AAAI Conference on Artificial Intelligence. Palo Alto, USA: AAAI Press, 2016: 3087-3093.
[8] VO D T, ZHANG Y. Target-Dependent Twitter Sentiment Classification with Rich Automatic Features // Proc of the 24th International Conference on Artificial Intelligence. Palo Alto, USA: AAAI Press, 2015: 1347-1353.
[9] TANG D Y, QIN B, LIU T. Aspect Level Sentiment Classification with Deep Memory Network // Proc of the Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: ACL, 2016: 214-224.
[10] KIM Y, DENTON C, HOANG L, et al. Structured Attention Networks[C/OL]. [2019-05-25]. https://arxiv.org/pdf/1702.00887.pdf.
[11] BAHDANAU D, CHO K, BENGIO Y. Neural Machine Translation by Jointly Learning to Align and Translate[C/OL]. [2019-05-25]. https://arxiv.org/pdf/1409.0473.pdf.
[12] LUONG M T, PHAM H, MANNING C D. Effective Approaches to Attention-Based Neural Machine Translation[C/OL]. [2019-05-25]. https://arxiv.org/pdf/1508.04025v3.pdf.
[13] KUMAR A, IRSOY O, ONDRUSKA P, et al. Ask Me Anything: Dynamic Memory Networks for Natural Language Processing // Proc of the 33rd International Conference on Machine Learning. Berlin, Germany: Springer, 2016: 1378-1387.
[14] SUKHBAATAR S, SZLAM A, WESTON J, et al. End-to-End Memory Networks // CORTES C, LAWRENCE N D, LEE D D, et al., eds. Advances in Neural Information Processing Systems 28. Cambridge, USA: The MIT Press, 2015: 2440-2448.
[15] ROCKTÄSCHEL T, GREFENSTETTE E, HERMANN K M, et al. Reasoning about Entailment with Neural Attention[C/OL]. [2019-05-25]. https://arxiv.org/pdf/1509.06664.pdf.
[16] RUSH A M, CHOPRA S, WESTON J. A Neural Attention Model for Abstractive Sentence Summarization[C/OL]. [2019-05-25]. https://arxiv.org/pdf/1509.00685.pdf.
[17] HERMANN K M, KOČISKÝ T, GREFENSTETTE E, et al. Teaching Machines to Read and Comprehend // CORTES C, LAWRENCE N D, LEE D D, et al., eds. Advances in Neural Information Processing Systems 28. Cambridge, USA: The MIT Press, 2015: 1693-1701.
[18] SEO M, KEMBHAVI A, FARHADI A, et al. Bidirectional Attention Flow for Machine Comprehension[C/OL]. [2019-05-25]. https://arxiv.org/pdf/1611.01603.pdf.
[19] WANG Y Q, HUANG M L, ZHAO L, et al. Attention-Based LSTM for Aspect-Level Sentiment Classification // Proc of the Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: ACL, 2016: 606-615.
[20] MA D H, LI S J, ZHANG X D, et al. Interactive Attention Networks for Aspect-Level Sentiment Classification // Proc of the 26th International Joint Conference on Artificial Intelligence. Berlin, Germany: Springer, 2017: 4068-4074.
[21] CHEN P, SUN Z Q, BING L D, et al. Recurrent Attention Network on Memory for Aspect Sentiment Analysis // Proc of the Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: ACL, 2017: 452-461.
[22] LIU Q, ZHANG H B, ZENG Y F, et al. Content Attention Model for Aspect Based Sentiment Analysis // Proc of the World Wide Web Conference. Washington, USA: IEEE, 2018: 1023-1032.
[23] ZHU P S, QIAN T Y. Enhanced Aspect Level Sentiment Classification with Auxiliary Memory // Proc of the 27th International Conference on Computational Linguistics. Stroudsburg, USA: ACL, 2018: 1077-1087.
[24] WANG X Y, XU G L, ZHANG J Y, et al. Syntax-Directed Hybrid Attention Network for Aspect-Level Sentiment Analysis. IEEE Access, 2019, 7: 5014-5025.
[25] HE R D, LEE W S, NG H T, et al. Effective Attention Modeling for Aspect-Level Sentiment Classification // Proc of the 27th International Conference on Computational Linguistics. Stroudsburg, USA: ACL, 2018: 1121-1131.
[26] HE R D, LEE W S, NG H T, et al. Exploiting Document Knowledge for Aspect-Level Sentiment Classification // Proc of the 56th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: ACL, 2018: 579-585.
[27] MAJUMDER N, PORIA S, GELBUKH A F, et al. IARM: Inter-Aspect Relation Modeling with Memory Networks in Aspect-Based Sentiment Analysis // Proc of the Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: ACL, 2018: 3402-3411.
[28] PENNINGTON J, SOCHER R, MANNING C D. GloVe: Global Vectors for Word Representation // Proc of the Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: ACL, 2014: 1532-1543.
[29] MIKOLOV T, CHEN K, CORRADO G S, et al. Efficient Estimation of Word Representations in Vector Space[C/OL]. [2019-05-25]. https://arxiv.org/pdf/1301.3781.pdf.
[30] KARPATHY A, JOHNSON J, LI F F. Visualizing and Understanding Recurrent Networks[C/OL]. [2019-05-25]. https://openreview.net/pdf?id=71BmK0m6qfAE8VvKUQWB.