Self-MCKD: Enhancing the Effectiveness and Efficiency of Knowledge Transfer in Malware Classification

Saved in:
Bibliographic Details
Published in: Electronics vol. 14, no. 6 (2025), p. 1077
Main Author: Hyeon-Jin, Jeong
Other Authors: Han-Jin, Lee, Gwang-Nam, Kim, Choi, Seok-Hwan
Published:
MDPI AG
Subjects:
Online Access: Citation/Abstract
Full Text + Graphics
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3181457752
003 UK-CbPIL
022 |a 2079-9292 
024 7 |a 10.3390/electronics14061077  |2 doi 
035 |a 3181457752 
045 2 |b d20250101  |b d20251231 
084 |a 231458  |2 nlm 
100 1 |a Hyeon-Jin, Jeong 
245 1 |a Self-MCKD: Enhancing the Effectiveness and Efficiency of Knowledge Transfer in Malware Classification 
260 |b MDPI AG  |c 2025 
513 |a Journal Article 
520 3 |a As malware continues to evolve, AI-based malware classification methods have shown significant promise in improving malware classification performance. However, these methods substantially increase computational complexity and parameter counts, raising the computational cost of training. Their maintenance cost also grows, as frequent retraining and transfer learning are required to keep pace with evolving malware variants. In this paper, we propose an efficient knowledge distillation technique for AI-based malware classification methods called Self-MCKD. Self-MCKD transfers output logits that are separated into the target class and non-target classes. With this separation, Self-MCKD enables efficient knowledge transfer by assigning weighted importance to the target class and the non-target classes. In addition, Self-MCKD uses small, shallow AI-based malware classifiers as both the teacher and student models, avoiding the need for a large, deep teacher model. Experimental results on various malware datasets show that Self-MCKD outperforms traditional knowledge distillation techniques in both the effectiveness and efficiency of malware classification. 
653 |a Maintenance costs 
653 |a Classification 
653 |a Optimization techniques 
653 |a Knowledge 
653 |a Malware 
653 |a Neural networks 
653 |a Teachers 
653 |a Effectiveness 
653 |a Computing costs 
653 |a Distillation 
653 |a Methods 
653 |a Machine learning 
653 |a Learning 
653 |a Efficiency 
700 1 |a Han-Jin, Lee 
700 1 |a Gwang-Nam, Kim 
700 1 |a Choi, Seok-Hwan 
773 0 |t Electronics  |g vol. 14, no. 6 (2025), p. 1077 
786 0 |d ProQuest  |t Advanced Technologies & Aerospace Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3181457752/abstract/embedded/L8HZQI7Z43R0LA5T?source=fedsrch 
856 4 0 |3 Full Text + Graphics  |u https://www.proquest.com/docview/3181457752/fulltextwithgraphics/embedded/L8HZQI7Z43R0LA5T?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3181457752/fulltextPDF/embedded/L8HZQI7Z43R0LA5T?source=fedsrch
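The abstract describes distillation over output logits split into a target-class part and a non-target-class part, each given a weighted importance. The record does not include the paper's exact loss, but that description matches a decoupled, DKD-style formulation, sketched below for a single sample. The function name `decoupled_kd_loss` and the hyperparameters `T`, `alpha`, and `beta` are hypothetical, not taken from the paper.

```python
import math

def softmax(logits, T=1.0):
    # temperature-scaled softmax, shifted by the max for numerical stability
    m = max(z / T for z in logits)
    exps = [math.exp(z / T - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl(p, q, eps=1e-12):
    # KL divergence between two discrete probability distributions
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def decoupled_kd_loss(teacher_logits, student_logits, target,
                      T=4.0, alpha=1.0, beta=2.0):
    """Decoupled logit distillation for one sample: a weighted sum of a
    target-class term and a non-target-class term (alpha, beta, T are
    illustrative hyperparameters, not values from the paper)."""
    pt = softmax(teacher_logits, T)
    ps = softmax(student_logits, T)
    # target-class term: binary split into p(target) vs. p(non-target)
    tckd = kl([pt[target], 1.0 - pt[target]],
              [ps[target], 1.0 - ps[target]])
    # non-target-class term: renormalized distribution over the other classes
    nt = [p for i, p in enumerate(pt) if i != target]
    ns = [p for i, p in enumerate(ps) if i != target]
    snt, sns = sum(nt), sum(ns)
    nckd = kl([p / snt for p in nt], [p / sns for p in ns])
    return alpha * tckd + beta * nckd
```

Per the abstract, in Self-MCKD the teacher producing `teacher_logits` would itself be a small, shallow classifier of the same kind as the student, rather than a large, deep model.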