Multi-HM: A Chinese Multimodal Dataset and Fusion Framework for Emotion Recognition in Human–Machine Dialogue Systems

Bibliographic Information
Published in: Applied Sciences vol. 15, no. 8 (2025), p. 4509
Main Author: Fu, Yao
Other Authors: Liu, Qiong; Song, Qing; Zhang, Pengzhou; Liao, Gongdong
Published: MDPI AG
Subjects:
Online Access: Citation/Abstract
Full Text + Graphics
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3194491923
003 UK-CbPIL
022 |a 2076-3417 
024 7 |a 10.3390/app15084509  |2 doi 
035 |a 3194491923 
045 2 |b d20250101  |b d20251231 
084 |a 231338  |2 nlm 
100 1 |a Fu, Yao 
245 1 |a Multi-HM: A Chinese Multimodal Dataset and Fusion Framework for Emotion Recognition in Human–Machine Dialogue Systems 
260 |b MDPI AG  |c 2025 
513 |a Journal Article 
520 3 |a Sentiment analysis is pivotal in advancing human–computer interaction (HCI) systems as it enables emotionally intelligent responses. While existing models show potential for HCI applications, current conversational datasets exhibit critical limitations in real-world deployment, particularly in capturing domain-specific emotional dynamics and context-sensitive behavioral patterns—constraints that hinder semantic comprehension and adaptive capabilities in task-driven HCI scenarios. To address these gaps, we present Multi-HM, the first multimodal emotion recognition dataset explicitly designed for human–machine consultation systems. It contains 2000 professionally annotated dialogues across 10 major HCI domains. Our dataset employs a five-dimensional annotation framework that systematically integrates textual, vocal, and visual modalities while simulating authentic HCI workflows to encode pragmatic behavioral cues and mission-critical emotional trajectories. Experiments demonstrate that Multi-HM-trained models achieve state-of-the-art performance in recognizing task-oriented affective states. This resource establishes a crucial foundation for developing human-centric AI systems that dynamically adapt to users’ evolving emotional needs. 
653 |a Affect (Psychology) 
653 |a Television programs 
653 |a Datasets 
653 |a Sentiment analysis 
653 |a Emotions 
653 |a Feedback 
653 |a Interactive computer systems 
653 |a Business metrics 
653 |a Benchmarks 
700 1 |a Liu, Qiong 
700 1 |a Song, Qing 
700 1 |a Zhang, Pengzhou 
700 1 |a Liao, Gongdong 
773 0 |t Applied Sciences  |g vol. 15, no. 8 (2025), p. 4509 
786 0 |d ProQuest  |t Publicly Available Content Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3194491923/abstract/embedded/6A8EOT78XXH2IG52?source=fedsrch 
856 4 0 |3 Full Text + Graphics  |u https://www.proquest.com/docview/3194491923/fulltextwithgraphics/embedded/6A8EOT78XXH2IG52?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3194491923/fulltextPDF/embedded/6A8EOT78XXH2IG52?source=fedsrch