Attention-based RNN with question-aware loss and multi-level copying mechanism for natural answer generation

Saved in:
Bibliographic Details
Published in: Complex & Intelligent Systems vol. 10, no. 5 (Oct 2024), p. 7249
Main Author: Zhao, Fen
Other Authors: Shao, Huishuang, Li, Shuo, Wang, Yintong, Yu, Yan
Published:
Springer Nature B.V.
Subjects:
Online Access: Citation/Abstract
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3104652876
003 UK-CbPIL
022 |a 2199-4536 
022 |a 2198-6053 
024 7 |a 10.1007/s40747-024-01538-5  |2 doi 
035 |a 3104652876 
045 2 |b d20241001  |b d20241031 
100 1 |a Zhao, Fen  |u Nanjing Xiaozhuang University, School of Information Engineering, Nanjing, China (GRID:grid.440845.9) (ISNI:0000 0004 1798 0981) 
245 1 |a Attention-based RNN with question-aware loss and multi-level copying mechanism for natural answer generation 
260 |b Springer Nature B.V.  |c Oct 2024 
513 |a Journal Article 
520 3 |a Natural answer generation has clear practical significance and a strong application background, and can be widely used in knowledge services such as community question answering and intelligent customer service. Traditional knowledge question answering provides precise answer entities but neglects a key shortcoming: users hope to receive a complete natural answer. In this research, we propose a novel attention-based recurrent neural network for natural answer generation, enhanced with multi-level copying mechanisms and a question-aware loss. To generate natural answers that conform to grammar, we leverage multi-level copying mechanisms, which copy semantic units, together with a prediction mechanism that predicts common words. Moreover, to address the problem that a generated natural answer may not match the user's question, a question-aware loss is introduced so that the generated target answer sequences correspond to the question. Experiments on three response generation tasks show our model to be superior in quality while being more parallelizable and requiring significantly less time to train. Our model achieves 0.727 BLEU on the SimpleQuestions response generation task, improving over the existing best results by over 0.007 BLEU. Our model also achieves a significant improvement in naturalness, up to 0.05 higher than the best-performing baseline. The results show that our method can generate grammatical, contextually appropriate natural answers according to user needs. (An illustrative sketch of the copy-and-predict step follows this record.)
653 |a Recurrent neural networks 
653 |a Attention 
653 |a Questions 
653 |a Copying 
653 |a Customer services 
653 |a Language 
653 |a User needs 
653 |a Knowledge 
653 |a Intelligent systems 
653 |a Neural networks 
653 |a Semantics 
653 |a Natural language 
700 1 |a Shao, Huishuang  |u Chongqing University of Posts and Telecommunications, School of Computer Science and Technology, Chongqing, China (GRID:grid.411587.e) (ISNI:0000 0001 0381 4112) 
700 1 |a Li, Shuo  |u Nanjing Xiaozhuang University, School of Information Engineering, Nanjing, China (GRID:grid.440845.9) (ISNI:0000 0004 1798 0981); De Montfort University, Faculty of Computing, Engineering and Media, Leicester, UK (GRID:grid.48815.30) (ISNI:0000 0001 2153 2936) 
700 1 |a Wang, Yintong  |u Nanjing Xiaozhuang University, School of Information Engineering, Nanjing, China (GRID:grid.440845.9) (ISNI:0000 0004 1798 0981) 
700 1 |a Yu, Yan  |u Chengdu University of Information Technology, School of Cybersecurity, Chengdu, China (GRID:grid.411307.0) (ISNI:0000 0004 1790 5236) 
773 0 |t Complex & Intelligent Systems  |g vol. 10, no. 5 (Oct 2024), p. 7249 
786 0 |d ProQuest  |t Advanced Technologies & Aerospace Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3104652876/abstract/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3104652876/fulltextPDF/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch
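
Note on the abstract above: it names a copying mechanism over source semantic units, a prediction mechanism over common words, and a question-aware loss, but the record gives no formulas. The following is a minimal sketch assuming a pointer-generator-style mixture of copy and generation distributions and a simple similarity-based question-aware term; all function names, tensor shapes, the bilinear attention form, and the cosine-similarity loss are illustrative assumptions, not the paper's exact method.

    # Illustrative sketch only; everything here is an assumption, not the
    # authors' formulation, which is not reproduced in this record.
    import torch
    import torch.nn.functional as F

    def copy_predict_step(dec_state, enc_outputs, src_token_ids, vocab_logits,
                          W_attn, w_gate):
        """One decoder step mixing generation with copying (pointer-generator style).

        dec_state     (batch, hidden):          current decoder hidden state
        enc_outputs   (batch, src_len, hidden): encoder states over question/facts
        src_token_ids (batch, src_len), int64:  vocabulary ids of the source tokens
        vocab_logits  (batch, vocab):           generation scores over the vocabulary
        """
        # Bilinear attention over encoder states; the attention weights double
        # as the copy distribution over source positions.
        scores = torch.einsum('bh,bsh->bs', dec_state @ W_attn, enc_outputs)
        copy_dist = F.softmax(scores, dim=-1)

        # A gate p_gen in (0, 1) decides between predicting a common word and
        # copying a semantic unit from the source.
        context = torch.einsum('bs,bsh->bh', copy_dist, enc_outputs)
        p_gen = torch.sigmoid(torch.cat([dec_state, context], dim=-1) @ w_gate)

        # Scatter copy probabilities onto vocabulary ids, then mix the two
        # distributions into one distribution over the output vocabulary.
        p_vocab = F.softmax(vocab_logits, dim=-1)
        p_copy = torch.zeros_like(p_vocab).scatter_add_(1, src_token_ids, copy_dist)
        return p_gen * p_vocab + (1.0 - p_gen) * p_copy

    def question_aware_loss(step_probs, target_ids, answer_repr, question_repr,
                            alpha=0.5):
        """Negative log-likelihood plus a term tying the answer to the question.

        One plausible reading of "question-aware loss": penalize answer
        representations that drift away from the question representation.
        """
        nll = F.nll_loss(torch.log(step_probs + 1e-9), target_ids)
        mismatch = 1.0 - F.cosine_similarity(answer_repr, question_repr, dim=-1).mean()
        return nll + alpha * mismatch

The gate-and-scatter pattern is the standard way to let one softmax cover both copied source tokens and generated vocabulary words; the question-aware term here simply adds a weighted mismatch penalty to the usual token-level loss.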