Chinese Mathematical Knowledge Entity Recognition Based on Linguistically Motivated Bidirectional Encoder Representation from Transformers

Bibliographic information
Published in: Information, vol. 16, no. 1 (2025), p. 42
Main author: Song, Wei
Other authors: He, Zheng; Ma, Shuaiqi; Zhang, Mingze; Guo, Wei; Ning, Keqing
Publisher: MDPI AG
Description
Abstract: Whether constructing a mathematical knowledge graph for a knowledge question-answering system or a course recommendation system, Named Entity Recognition (NER) is indispensable, and the accuracy of its recognition directly affects the performance of these downstream tasks. In order to improve the accuracy of mathematical knowledge entity recognition and provide effective support for subsequent functionalities, this paper adopts the latest pre-trained language model, LERT, combined with a Bidirectional Gated Recurrent Unit (BiGRU), Iterated Dilated Convolutional Neural Networks (IDCNNs), and Conditional Random Fields (CRFs), to construct the LERT-BiGRU-IDCNN-CRF model. First, LERT provides context-dependent word vectors; the BiGRU then captures both long- and short-distance dependencies; the IDCNN extracts local features; and finally the CRF decodes the sequence and outputs the corresponding labels. Experimental results show that the accuracy of the model when recognizing mathematical concept and theorem entities is 97.22%, the recall is 97.47%, and the F1 score is 97.34%. The model accurately recognizes the required entities, and comparative experiments show that it outperforms current state-of-the-art entity recognition models.
ISSN:2078-2489
DOI:10.3390/info16010042
Source: Advanced Technologies & Aerospace Database
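
As a concrete illustration of the pipeline described in the abstract, below is a minimal PyTorch sketch of an LERT-BiGRU-IDCNN-CRF tagger. It is not the authors' implementation: the checkpoint name hfl/chinese-lert-base, the layer sizes, the dilation schedule, and the third-party pytorch-crf package (torchcrf.CRF) are assumptions chosen for illustration, and any BERT-style Chinese encoder could stand in for LERT.

import torch.nn as nn
from transformers import AutoModel
from torchcrf import CRF  # third-party "pytorch-crf" package


class LertBiGruIdcnnCrf(nn.Module):
    def __init__(self, num_tags, encoder_name="hfl/chinese-lert-base",
                 gru_hidden=256, conv_channels=256):
        super().__init__()
        # LERT (or any BERT-style encoder) provides context-dependent token vectors.
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # BiGRU models long- and short-distance dependencies in both directions.
        self.bigru = nn.GRU(hidden, gru_hidden, batch_first=True, bidirectional=True)
        # Iterated dilated convolutions widen the receptive field over local features.
        layers, in_ch = [], 2 * gru_hidden
        for dilation in (1, 1, 2):  # dilation schedule is an illustrative choice
            layers += [nn.Conv1d(in_ch, conv_channels, kernel_size=3,
                                 dilation=dilation, padding=dilation),
                       nn.ReLU()]
            in_ch = conv_channels
        self.idcnn = nn.Sequential(*layers)
        self.emissions = nn.Linear(conv_channels, num_tags)  # per-token tag scores
        self.crf = CRF(num_tags, batch_first=True)           # structured label decoding

    def forward(self, input_ids, attention_mask, tags=None):
        x = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        x, _ = self.bigru(x)
        x = self.idcnn(x.transpose(1, 2)).transpose(1, 2)    # Conv1d expects (B, C, L)
        emissions = self.emissions(x)
        mask = attention_mask.bool()
        if tags is not None:
            # Training: return the negative CRF log-likelihood as the loss.
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        # Inference: Viterbi-decode the best label sequence for each sentence.
        return self.crf.decode(emissions, mask=mask)

In this sketch the forward pass returns a scalar loss when gold tags are supplied and a list of decoded label sequences otherwise; reported hyperparameters and training details should be taken from the paper itself.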