Enhancing Human-Robot Interaction through Ensemble Intention Recognition and Trajectory Tracking

Bibliographic details
Published in: IISE Annual Conference. Proceedings (2025), p. 1-7
Main author: Alinezhad, Elnaz
Other authors: Mohammadzadeh, Ali Kamali; Masoud, Sara
Published by: Institute of Industrial and Systems Engineers (IISE)
Online access: Citation/Abstract
Full Text
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3243713085
003 UK-CbPIL
024 7 |a 10.21872/2025IISE_6188  |2 doi 
035 |a 3243713085 
045 2 |b d20250101  |b d20251231 
084 |a 102209  |2 nlm 
100 1 |a Alinezhad, Elnaz 
245 1 |a Enhancing Human-Robot Interaction through Ensemble Intention Recognition and Trajectory Tracking 
260 |b Institute of Industrial and Systems Engineers (IISE)  |c 2025 
513 |a Conference Proceedings 
520 3 |a Effective intention recognition and trajectory tracking are critical for enabling collaborative robots (cobots) to anticipate and support human actions in Human-Robot Interaction (HRI). This study investigates the application of ensemble deep learning to classify human intentions and track movement trajectories using data collected from Virtual Reality (VR) environments. VR provides a controlled, immersive setting for precise monitoring of human behavior, facilitating robust model training. We develop and evaluate ensemble models combining CNNs, LSTMs, and Transformers, leveraging their complementary strengths. While CNN and CNN-LSTM models achieved high accuracy, they exhibited limitations in distinguishing specific intentions under certain conditions. In contrast, the CNN-Transformer model demonstrated superior precision, recall, and F1-scores in intention classification and exhibited robust trajectory tracking. By integrating multiple architectures, the ensemble approach enhanced predictive performance, improving adaptability to complex human behaviors. These findings highlight the potential of ensemble learning in advancing real-time human intention understanding and motion prediction, fostering more intuitive and effective HRI. The proposed framework contributes to developing intelligent cobots capable of dynamically adapting to human actions, paving the way for safer and more efficient collaborative workspaces. 
610 4 |a Leap Motion 
653 |a Behavior 
653 |a Accuracy 
653 |a Collaboration 
653 |a Deep learning 
653 |a Adaptability 
653 |a Trends 
653 |a Artificial neural networks 
653 |a Robots 
653 |a Data processing 
653 |a Manufacturing 
653 |a Machine learning 
653 |a Tracking 
653 |a Time series 
653 |a Virtual reality 
653 |a Robustness 
653 |a Artificial intelligence 
653 |a Trajectories 
653 |a Recognition 
653 |a Neural networks 
653 |a Decision making 
653 |a Human motion 
653 |a Algorithms 
653 |a Human engineering 
653 |a Real time 
653 |a Ensemble learning 
653 |a Industry 5.0 
653 |a Human behavior 
700 1 |a Mohammadzadeh, Ali Kamali 
700 1 |a Masoud, Sara 
773 0 |t IISE Annual Conference. Proceedings  |g (2025), p. 1-7 
786 0 |d ProQuest  |t Science Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3243713085/abstract/embedded/6A8EOT78XXH2IG52?source=fedsrch 
856 4 0 |3 Full Text  |u https://www.proquest.com/docview/3243713085/fulltext/embedded/6A8EOT78XXH2IG52?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3243713085/fulltextPDF/embedded/6A8EOT78XXH2IG52?source=fedsrch
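
Illustrative sketch (not part of the catalog record): the abstract above describes ensemble models that combine CNN, LSTM, and Transformer components for intention classification and trajectory tracking from VR motion data. The paper's actual architectures, inputs, and hyperparameters are not given in this record, so the following PyTorch code is only a minimal, hypothetical example of one such hybrid (a 1-D CNN front-end feeding a Transformer encoder); every layer size, channel count, and the number of intention classes are assumptions made for illustration.

# Hypothetical CNN-Transformer hybrid for intention classification from
# motion time series. Shapes, sizes, and class count are illustrative only;
# they are not taken from the paper.
import torch
import torch.nn as nn

class CNNTransformerClassifier(nn.Module):
    def __init__(self, in_channels=3, d_model=64, n_heads=4,
                 n_layers=2, n_intentions=5):
        super().__init__()
        # 1-D convolutions extract local motion features along the time axis.
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, d_model, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # Transformer encoder models longer-range temporal dependencies.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Linear head over the pooled sequence representation -> intention logits.
        self.head = nn.Linear(d_model, n_intentions)

    def forward(self, x):
        # x: (batch, time, channels), e.g. hand positions sampled in VR
        feats = self.cnn(x.transpose(1, 2)).transpose(1, 2)  # (batch, time, d_model)
        encoded = self.encoder(feats)
        pooled = encoded.mean(dim=1)  # temporal average pooling
        return self.head(pooled)

if __name__ == "__main__":
    model = CNNTransformerClassifier()
    dummy = torch.randn(8, 120, 3)   # 8 sequences, 120 time steps, x/y/z
    print(model(dummy).shape)        # torch.Size([8, 5])

An ensemble in the sense described by the abstract could then be formed, for example, by averaging the softmax outputs of several such members (CNN, CNN-LSTM, CNN-Transformer); this combination scheme is likewise an assumption, not a detail stated in the record.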