Abstract: In aspect-level sentiment classification, existing methods are weak at enhancing aspect-term information and at exploiting local feature information. To address this problem, a feature fusion learning network (FFLN) is proposed. First, each review is processed into three inputs: the text, the aspect, and the text-aspect pair. After the bidirectional encoder representations from transformers (BERT) model produces vector representations of these inputs, an attentional encoder obtains the hidden states of the context and the aspect terms and extracts their semantic information. Then, based on these hidden-state features, an aspect-specific transformation component generates an aspect-specific text representation, integrating aspect-term information into the context representation. Finally, a context position weighting module extracts local features from the aspect-specific text representation. The final feature representation is obtained by fusion learning of the global and local features, and sentiment classification is then performed. Experiments on classical English datasets and Chinese review datasets show that FFLN improves classification performance.
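The pipeline described above (BERT encoding, attentional encoding of context and aspect, aspect-specific transformation, position-weighted local features, and global-local fusion) can be summarized in code. The following is a minimal PyTorch sketch under stated assumptions: BERT embeddings are supplied as pre-computed tensors, and the class name FFLNSketch, the linear aspect-specific transformation, and the linear distance-based position weighting are illustrative stand-ins rather than the paper's exact components.

```python
# Illustrative sketch of the FFLN pipeline described in the abstract.
# Module names, dimensions and the position-weighting scheme are assumptions
# for illustration; the paper's exact formulation may differ.
import torch
import torch.nn as nn


class FFLNSketch(nn.Module):
    def __init__(self, hidden=768, heads=8, num_classes=3, window=5):
        super().__init__()
        # Attentional encoders for the context and aspect hidden states
        # (stand-ins for the paper's attention encoder).
        self.ctx_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.asp_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        # Aspect-specific transformation: fuses aspect info into the context.
        self.asp_transform = nn.Linear(2 * hidden, hidden)
        self.window = window                      # local window around the aspect term
        self.classifier = nn.Linear(2 * hidden, num_classes)

    def forward(self, ctx_emb, asp_emb, asp_pos):
        # ctx_emb: (B, Lc, H) BERT embeddings of the text (assumed precomputed)
        # asp_emb: (B, La, H) BERT embeddings of the aspect term
        # asp_pos: (B,) token index of the aspect term within the text
        ctx, _ = self.ctx_attn(ctx_emb, ctx_emb, ctx_emb)   # context hidden states
        asp, _ = self.asp_attn(asp_emb, asp_emb, asp_emb)   # aspect hidden states
        asp_vec = asp.mean(dim=1, keepdim=True)             # pooled aspect vector
        # Aspect-specific text representation: inject aspect info into each token.
        fused = torch.tanh(self.asp_transform(
            torch.cat([ctx, asp_vec.expand_as(ctx)], dim=-1)))
        # Context position weighting: down-weight tokens far from the aspect.
        B, Lc, _ = fused.shape
        idx = torch.arange(Lc).unsqueeze(0).expand(B, Lc)
        dist = (idx - asp_pos.unsqueeze(1)).abs().float()
        weights = torch.clamp(1.0 - dist / self.window, min=0.0).unsqueeze(-1)
        local = (fused * weights).mean(dim=1)                # local feature
        global_ = fused.mean(dim=1)                          # global feature
        # Fusion of global and local features, followed by classification.
        return self.classifier(torch.cat([global_, local], dim=-1))


# Usage with random tensors standing in for BERT outputs.
model = FFLNSketch()
ctx = torch.randn(2, 20, 768)          # batch of 2 texts, 20 tokens each
asp = torch.randn(2, 3, 768)           # aspect terms of 3 tokens
pos = torch.tensor([4, 11])            # aspect positions within the texts
print(model(ctx, asp, pos).shape)      # torch.Size([2, 3])
```

The window parameter governs how quickly tokens far from the aspect term are down-weighted when forming the local feature; the paper's actual weighting function may differ from this linear decay.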