Beyond BERT: Exploring the Efficacy of RoBERTa and ALBERT in Supervised Multiclass Text Classification
| Published in: | International Journal of Advanced Computer Science and Applications, vol. 15, no. 3 (2024) |
|---|---|
| Publisher: | Science and Information (SAI) Organization Limited |
| Abstract: | This study investigates the performance of transformer-based machine learning models, specifically BERT, RoBERTa, and ALBERT, in multiclass text classification within the context of the Universal Access to Quality Tertiary Education (UAQTE) program. The aim is to systematically categorize and analyze qualitative responses to uncover domain-specific patterns in students' experiences. Through rigorous evaluation of various hyperparameter configurations, consistent improvements in model performance are observed with smaller batch sizes and increased epochs, while optimal learning rates further boost accuracy. However, achieving an optimal balance between sequence length and model efficacy presents nuanced challenges, with instances of overfitting emerging after a certain number of epochs. Notably, the findings underscore the effectiveness of the UAQTE program in addressing student needs, particularly evident in categories such as "Family Support" and "Financial Support," with RoBERTa emerging as a standout choice due to its stable performance during training. Future research should focus on fine-tuning hyperparameter values and adopting continuous monitoring mechanisms to reduce overfitting. Furthermore, ongoing review and modification of educational efforts, informed by evidence-based decision-making and stakeholder feedback, are critical to fulfilling students' changing needs effectively. |
|---|---|
| ISSN: | 2158-107X; 2156-5570 |
| DOI: | 10.14569/IJACSA.2024.0150323 |
| Source: | Advanced Technologies & Aerospace Database |
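The abstract reports sweeping batch size, epoch count, learning rate, and maximum sequence length when fine-tuning the three models. A minimal sketch of how such a hyperparameter grid might be enumerated before training is shown below; the specific values are illustrative assumptions, not taken from the paper.

```python
from itertools import product

# Hypothetical hyperparameter grid mirroring the factors the study varied:
# batch size, epoch count, learning rate, and maximum sequence length.
# These particular values are assumptions for illustration only.
batch_sizes = [8, 16, 32]
epoch_counts = [3, 5, 10]
learning_rates = [2e-5, 3e-5, 5e-5]
max_seq_lengths = [128, 256]

# One training run would be launched per configuration, and validation
# accuracy tracked per epoch to catch the overfitting the abstract notes.
configs = [
    {"batch_size": b, "epochs": e, "lr": lr, "max_seq_len": s}
    for b, e, lr, s in product(batch_sizes, epoch_counts,
                               learning_rates, max_seq_lengths)
]
print(len(configs))  # 3 * 3 * 3 * 2 = 54 configurations
```

Each configuration dictionary could then be passed to a fine-tuning routine for BERT, RoBERTa, or ALBERT; grid enumeration like this keeps the search reproducible and makes it easy to log which setting produced each result.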