Automatic Short Text Summarization Based on Part-of-Speech Soft Template Attention Mechanism

ZHANG Yafei 1,2, ZUO Yixi 1,2, YU Zhengtao 1,2, GUO Junjun 1,2, GAO Shengxiang 1,2

1. Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650504
2. Yunnan Key Laboratory of Artificial Intelligence, Kunming University of Science and Technology, Kunming 650504

Abstract In the short text summarization task, a summary with an intuitive subject-predicate-object structure exhibits strong semantic integrity, and part-of-speech combinations impose constraints on this structure. Aiming at this problem, an automatic short text summarization method based on a part-of-speech soft template attention mechanism is proposed. Firstly, the text is tagged with part-of-speech tags, and the tagged part-of-speech sequence is regarded as the part-of-speech soft template of the text, guiding the method to construct the structural specification of a summary. The part-of-speech soft template is encoded at the encoder. Then, a part-of-speech soft template attention mechanism is introduced to enhance the attention to core part-of-speech information in the text, such as nouns and verbs. Finally, the part-of-speech soft template attention and the traditional attention are combined at the decoder to generate summaries. Experimental results on short text summarization datasets verify the effectiveness of the proposed method.
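A minimal sketch of the combined attention described in the abstract, written in PyTorch. It assumes a sequence-to-sequence model that produces separate encoder states for the word sequence and for the part-of-speech soft template; the module name, the Luong-style general scoring, and the scalar gate used to merge the two contexts are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PosSoftTemplateAttention(nn.Module):
    """Sketch: combine traditional attention over word encoder states with
    attention over an encoded part-of-speech (POS) soft template.
    Dimensions and the gating scheme are assumptions for illustration."""

    def __init__(self, hidden_size):
        super().__init__()
        self.word_attn = nn.Linear(hidden_size, hidden_size, bias=False)
        self.pos_attn = nn.Linear(hidden_size, hidden_size, bias=False)
        self.gate = nn.Linear(2 * hidden_size, 1)  # hypothetical merge gate

    def forward(self, dec_state, enc_word, enc_pos):
        # dec_state: (batch, hidden); enc_word, enc_pos: (batch, src_len, hidden)
        # Traditional attention over the word encoder states.
        word_scores = torch.bmm(self.word_attn(enc_word),
                                dec_state.unsqueeze(2)).squeeze(2)
        word_ctx = torch.bmm(F.softmax(word_scores, dim=1).unsqueeze(1),
                             enc_word).squeeze(1)
        # POS soft-template attention over the encoded POS-tag sequence,
        # letting the decoder focus on core tags such as nouns and verbs.
        pos_scores = torch.bmm(self.pos_attn(enc_pos),
                               dec_state.unsqueeze(2)).squeeze(2)
        pos_ctx = torch.bmm(F.softmax(pos_scores, dim=1).unsqueeze(1),
                            enc_pos).squeeze(1)
        # Merge the two contexts with a learned scalar gate (an assumption).
        g = torch.sigmoid(self.gate(torch.cat([word_ctx, pos_ctx], dim=1)))
        return g * word_ctx + (1 - g) * pos_ctx

# Example: batch of 2, source length 5, hidden size 8.
attn = PosSoftTemplateAttention(8)
ctx = attn(torch.randn(2, 8), torch.randn(2, 5, 8), torch.randn(2, 5, 8))
print(ctx.shape)  # torch.Size([2, 8])
```

At each decoding step, the gated combination lets the decoder weigh structural cues from the part-of-speech soft template against the usual content-based attention over source words before generating the next summary token.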
Received: 14 March 2020
Fund: National Key Research and Development Program of China (No. 2018YFC0830105, 2018YFC0830101, 2018YFC0830100), National Natural Science Foundation of China (No. 61762056, 61866020, 61761026, 61972186), Natural Science Foundation of Yunnan Province (No. 2018FB104)
Corresponding Author:
YU Zhengtao, Ph.D., professor. His research interests include natural language processing, information retrieval and machine translation.

About the authors:
ZHANG Yafei, Ph.D., lecturer. Her research interests include natural language processing and pattern recognition.
ZUO Yixi, master's student. Her research interests include natural language processing.
GUO Junjun, Ph.D., lecturer. His research interests include natural language processing.
GAO Shengxiang, Ph.D., associate professor. Her research interests include natural language processing.