A BERT–LSTM–Attention Framework for Robust Multi-Class Sentiment Analysis on Twitter Data
| Published in: | Systems, vol. 13, no. 11 (2025), pp. 964–983 |
|---|---|
| Main Author: | |
| Other Authors: | |
| Publisher: | MDPI AG |
| Subjects: | |
| Online Access: | Citation/Abstract · Full Text + Graphics · Full Text - PDF |
| Abstract: | This paper proposes a hybrid deep learning model for robust and interpretable sentiment classification of Twitter data. The model integrates Bidirectional Encoder Representations from Transformers (BERT)-based contextual embeddings, a Bidirectional Long Short-Term Memory (BiLSTM) network, and a custom attention mechanism to classify tweets into four sentiment categories: Positive, Negative, Neutral, and Irrelevant. Addressing the challenges of noisy and multilingual social media content, the model incorporates a comprehensive preprocessing pipeline and data augmentation strategies, including back-translation and synonym replacement. An ablation study demonstrates that combining BERT with BiLSTM improves the model's sensitivity to sequence dependencies, while the attention mechanism enhances both classification accuracy and interpretability. Empirical results show that the proposed model outperforms BERT-only and BERT+BiLSTM baselines, achieving F1-scores above 0.94 across all sentiment classes. Attention weight visualizations further reveal the model's ability to focus on sentiment-bearing tokens, providing transparency in decision-making. The proposed framework is well suited for deployment in real-time sentiment monitoring systems and offers a scalable solution for multilingual and multi-class sentiment analysis in dynamic social media environments. We also include a focused characterization of the dataset via an exploratory data analysis in the Methods section. |
| ISSN: | 2079-8954 |
| DOI: | 10.3390/systems13110964 |
| Source: | Advanced Technologies & Aerospace Database |
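
For readers who want a concrete picture of the architecture the abstract describes, the following is a minimal PyTorch sketch of a BERT–BiLSTM–attention classifier. It is not the authors' published code: the `bert-base-multilingual-cased` checkpoint, the additive attention layer, the hidden sizes, and the four-label head are all illustrative assumptions.

```python
# Minimal sketch of a BERT + BiLSTM + attention classifier, assuming PyTorch
# and Hugging Face transformers. All hyperparameters are illustrative, not
# the paper's published configuration.
import torch
import torch.nn as nn
from transformers import AutoModel


class BertBiLstmAttention(nn.Module):
    def __init__(self, bert_name="bert-base-multilingual-cased",
                 lstm_hidden=256, num_labels=4):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)
        self.lstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                            batch_first=True, bidirectional=True)
        # Additive attention: score each BiLSTM state, softmax over tokens.
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        self.classifier = nn.Linear(2 * lstm_hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        # Contextual token embeddings from BERT: (batch, tokens, hidden).
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # Sequence modeling over the embeddings: (batch, tokens, 2 * lstm_hidden).
        states, _ = self.lstm(hidden)
        scores = self.attn(states).squeeze(-1)            # (batch, tokens)
        scores = scores.masked_fill(attention_mask == 0, -1e9)  # ignore padding
        weights = torch.softmax(scores, dim=-1)           # attention over tokens
        context = (weights.unsqueeze(-1) * states).sum(1)  # weighted sum pooling
        # Returning the weights allows the attention visualizations the
        # abstract mentions.
        return self.classifier(context), weights


if __name__ == "__main__":
    from transformers import AutoTokenizer
    tok = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    model = BertBiLstmAttention()
    batch = tok(["what a great match today!"], return_tensors="pt", padding=True)
    logits, weights = model(batch["input_ids"], batch["attention_mask"])
    print(logits.shape, weights.shape)  # (1, 4) logits, (1, tokens) weights
```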
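
The abstract also names two augmentation strategies, back-translation and synonym replacement. The sketch below illustrates only the synonym-replacement half, using NLTK's WordNet; the replacement probability and the whitespace tokenization are assumptions, and the paper's actual augmentation procedure may differ.

```python
# Minimal sketch of WordNet-based synonym replacement (back-translation, the
# other strategy named in the abstract, is not shown). The replacement rate
# and tokenization are illustrative assumptions.
import random

import nltk
from nltk.corpus import wordnet

nltk.download("wordnet", quiet=True)


def synonym_replace(tokens, p=0.15, seed=None):
    """Replace each token with a random WordNet synonym with probability p."""
    rng = random.Random(seed)
    out = []
    for tok in tokens:
        # Collect candidate synonyms across all senses of the token.
        lemmas = {lemma.name().replace("_", " ")
                  for synset in wordnet.synsets(tok)
                  for lemma in synset.lemmas()}
        lemmas.discard(tok)  # never "replace" a word with itself
        if lemmas and rng.random() < p:
            out.append(rng.choice(sorted(lemmas)))
        else:
            out.append(tok)
    return out


print(synonym_replace("this movie was absolutely great".split(), p=0.5, seed=0))
```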