Lacking the embedding of a word? Look it up into a traditional dictionary

Bibliographic Details
Published in: arXiv.org (Sep 24, 2021)
Main Author: Ruzzetti, Elena Sofia
Other Authors: Ranaldi, Leonardo, Mastromattei, Michele, Fallucchi, Francesca, Zanzotto, Fabio Massimo
Published: Cornell University Library, arXiv.org
Subjects:
Online Access: Citation/Abstract; Full text outside of ProQuest
Description
Abstract: Word embeddings are powerful dictionaries, which may easily capture language variations. However, these dictionaries fail to give sense to rare words, which are surprisingly often covered by traditional dictionaries. In this paper, we propose to use definitions retrieved from traditional dictionaries to produce word embeddings for rare words. For this purpose, we introduce two methods: Definition Neural Network (DefiNNet) and Define BERT (DefBERT). In our experiments, DefiNNet and DefBERT significantly outperform state-of-the-art as well as baseline methods devised for producing embeddings of unknown words. In fact, DefiNNet significantly outperforms FastText, which implements a method for the same task based on n-grams, and DefBERT significantly outperforms the BERT method for OOV words. Hence, definitions in traditional dictionaries are useful to build word embeddings for rare words.
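The general idea described in the abstract, deriving an embedding for an out-of-vocabulary word by encoding its dictionary definition with a pretrained language model, can be illustrated with a minimal sketch. This is not the authors' DefiNNet or DefBERT implementation: the choice of bert-base-uncased, the mean-pooling strategy, and the example definition are assumptions made purely for illustration.

    # Illustrative sketch: approximate a vector for a rare word by encoding
    # its dictionary definition with a pretrained BERT and mean-pooling the
    # token representations. Model name, pooling, and example are assumptions,
    # not the paper's DefBERT method.
    import torch
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")
    model.eval()

    def embed_definition(definition: str) -> torch.Tensor:
        """Return a single vector computed from a word's dictionary gloss."""
        inputs = tokenizer(definition, return_tensors="pt", truncation=True)
        with torch.no_grad():
            hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden_size)
        # Mean-pool over the token dimension; for a single unpadded sentence
        # the attention mask is all ones, so a plain mean is sufficient here.
        return hidden.mean(dim=1).squeeze(0)

    # Hypothetical rare word with a dictionary-style definition.
    vector = embed_definition(
        "petrichor: the pleasant smell that accompanies the first rain "
        "after a period of dry weather"
    )
    print(vector.shape)  # torch.Size([768]) for bert-base-uncased

The resulting vector lives in the same space as the model's contextual representations, so it can stand in for the missing word embedding in downstream similarity or classification tasks, which is the role the paper assigns to definition-derived embeddings.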
ISSN:2331-8422
DOI:10.18653/v1/2022.findings-acl.208
Source: Engineering Database