Staged transfer learning for multi-label half-face emotion recognition

Bibliographic Details
Published in: Journal of Engineering and Applied Science vol. 72, no. 1 (Dec 2025), p. 57
Published: Springer Nature B.V.
Subjects:
Online Access: Citation/Abstract
Full Text
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3205584644
003 UK-CbPIL
022 |a 1110-1903 
022 |a 1110-1393 
024 7 |a 10.1186/s44147-025-00615-x  |2 doi 
035 |a 3205584644 
045 2 |b d20251201  |b d20251231 
245 1 |a Staged transfer learning for multi-label half-face emotion recognition 
260 |b Springer Nature B.V.  |c Dec 2025 
513 |a Journal Article 
520 3 |a As fundamental drivers of human behavior, emotions can be expressed through various modalities, including facial expressions. Facial emotion recognition (FER) has emerged as a pivotal area of affective computing, enabling accurate detection of human emotions from visual cues. To enhance efficiency while maintaining accuracy, we propose a novel approach that leverages deep learning and transfer learning techniques to classify emotions based on only half of the human face. We introduce EMOFACE, a comprehensive half-facial imagery dataset annotated with 25 distinct emotion labels, providing a diverse and inclusive resource for multi-label half-facial emotion classification. By combining this dataset with the established FER2013 dataset, we employ a staged transfer learning framework that effectively addresses the challenges of multi-label half-facial emotion classification. Our proposed approach, which utilizes a custom convolutional neural network (ConvNet) and five pre-trained deep learning models (VGG16, VGG19, DenseNet, MobileNet, and ResNet), achieves strong results: an average binary accuracy of 0.9244 for training, 0.9152 for validation, and 0.9138 for testing, demonstrating the efficacy of our method. The potential applications of this research extend to various domains, including affective computing, healthcare, robotics, human–computer interaction, and self-driving cars. By advancing the field of half-facial multi-label emotion recognition, our work contributes to the development of more intuitive and empathetic human–machine interactions.
653 |a Physiology 
653 |a Autonomous cars 
653 |a Behavior 
653 |a Accuracy 
653 |a Psychology 
653 |a Happiness 
653 |a Labels 
653 |a Deep learning 
653 |a Datasets 
653 |a Classification 
653 |a Affective computing 
653 |a Communication 
653 |a Artificial neural networks 
653 |a Machine learning 
653 |a Emotions 
653 |a Robotics 
653 |a Emotion recognition 
653 |a Computer vision 
653 |a Decision making 
653 |a Neural networks 
653 |a Algorithms 
773 0 |t Journal of Engineering and Applied Science  |g vol. 72, no. 1 (Dec 2025), p. 57 
786 0 |d ProQuest  |t Engineering Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3205584644/abstract/embedded/75I98GEZK8WCJMPQ?source=fedsrch 
856 4 0 |3 Full Text  |u https://www.proquest.com/docview/3205584644/fulltext/embedded/75I98GEZK8WCJMPQ?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3205584644/fulltextPDF/embedded/75I98GEZK8WCJMPQ?source=fedsrch
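
Editor's note: the abstract (field 520 above) describes the staged transfer-learning pipeline only at a high level, and this record contains no code. The following is a minimal, hypothetical Keras sketch of what such a two-stage setup could look like, assuming a 224x224 RGB input, a MobileNet backbone (one of the five pre-trained models the abstract names), a seven-class FER2013 pre-training stage, and a 25-label sigmoid head for EMOFACE. Every identifier, dataset variable, and hyperparameter below is an illustrative assumption, not the authors' implementation.

    import tensorflow as tf

    IMG_SHAPE = (224, 224, 3)   # assumed; FER2013's 48x48 grayscale images would need resizing and RGB conversion
    NUM_FER2013_CLASSES = 7     # FER2013's seven basic emotion categories
    NUM_EMOFACE_LABELS = 25     # EMOFACE's 25 emotion labels, per the abstract

    # Shared backbone: MobileNet, one of the five pre-trained models named in the abstract.
    base = tf.keras.applications.MobileNet(
        weights="imagenet", include_top=False, input_shape=IMG_SHAPE, pooling="avg"
    )

    # Stage 1: single-label pre-training on FER2013 with a frozen backbone and softmax head.
    base.trainable = False
    stage1 = tf.keras.Sequential([
        base,
        tf.keras.layers.Dense(NUM_FER2013_CLASSES, activation="softmax"),
    ])
    stage1.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
    # stage1.fit(fer2013_train, validation_data=fer2013_val, epochs=...)  # hypothetical datasets

    # Stage 2: multi-label fine-tuning on EMOFACE half-face images. A sigmoid head
    # makes each of the 25 labels an independent binary decision.
    base.trainable = True  # unfreeze for fine-tuning at a low learning rate
    stage2 = tf.keras.Sequential([
        base,
        tf.keras.layers.Dense(NUM_EMOFACE_LABELS, activation="sigmoid"),
    ])
    stage2.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
        loss="binary_crossentropy",
        metrics=["binary_accuracy"],
    )
    # stage2.fit(emoface_train, validation_data=emoface_val, epochs=...)  # hypothetical datasets

Treating the 25 labels as independent sigmoid outputs is consistent with the abstract's choice of metric: binary accuracy (averaged per label) rather than the categorical accuracy a softmax classifier would report.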