MS-PreTE: A Multi-Scale Pre-Training Encoder for Mobile Encrypted Traffic Classification

Bibliographic Details
Published in: Big Data and Cognitive Computing vol. 9, no. 8 (2025), p. 216-238
Main Author: Wang, Ziqi
Other Authors: Qiu, Yufan; Liu, Yaping; Zhang, Shuo; Liu, Xinyi
Published: MDPI AG
Topics: Traffic flow; Machine learning; Datasets; Deep learning; Classification; Applications programs; Security systems; Mobile communications networks; Neural networks; Mobile computing; Architecture; Flow characteristics; Semantics
Online Access: Citation/Abstract
Full Text + Graphics
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3243981581
003 UK-CbPIL
022 |a 2504-2289 
024 7 |a 10.3390/bdcc9080216  |2 doi 
035 |a 3243981581 
045 2 |b d20250101  |b d20251231 
100 1 |a Wang, Ziqi  |u Cyberspace Institute of Advanced Technology, Guangzhou University, Guangzhou 510006, China; ziqiwang00@foxmail.com (Z.W.); liuxinyi@e.gzhu.edu.cn (X.L.) 
245 1 |a MS-PreTE: A Multi-Scale Pre-Training Encoder for Mobile Encrypted Traffic Classification 
260 |b MDPI AG  |c 2025 
513 |a Journal Article 
520 3 |a Mobile traffic classification serves as a fundamental component of network security systems. In recent years, pre-training methods have significantly advanced this field. However, because mobile traffic is typically interleaved with third-party services, the deep integration of such shared services produces highly similar TCP flow characteristics across different applications, making it difficult for existing traffic classification methods to identify mobile traffic effectively. To address this challenge, we propose MS-PreTE, a two-phase pre-training framework for mobile traffic classification. MS-PreTE introduces a novel multi-level representation model to preserve traffic information from diverse perspectives and hierarchical levels. Furthermore, MS-PreTE incorporates a focal-attention mechanism to enhance the model’s capability to discern subtle differences among similar traffic flows. Evaluations demonstrate that MS-PreTE achieves state-of-the-art performance on three mobile application datasets, boosting the F1 score on Cross-platform (iOS) to 99.34% (up by 2.1%), Cross-platform (Android) to 98.61% (up by 1.6%), and NUDT-Mobile-Traffic to 87.70% (up by 2.47%). Moreover, MS-PreTE exhibits strong generalization capabilities across four real-world traffic datasets. 
653 |a Traffic flow 
653 |a Machine learning 
653 |a Datasets 
653 |a Deep learning 
653 |a Classification 
653 |a Applications programs 
653 |a Security systems 
653 |a Mobile communications networks 
653 |a Neural networks 
653 |a Mobile computing 
653 |a Architecture 
653 |a Flow characteristics 
653 |a Semantics 
700 1 |a Qiu, Yufan  |u Cyberspace Institute of Advanced Technology, Guangzhou University, Guangzhou 510006, China; ziqiwang00@foxmail.com (Z.W.); liuxinyi@e.gzhu.edu.cn (X.L.) 
700 1 |a Liu, Yaping  |u Cyberspace Institute of Advanced Technology, Guangzhou University, Guangzhou 510006, China; ziqiwang00@foxmail.com (Z.W.); liuxinyi@e.gzhu.edu.cn (X.L.) 
700 1 |a Zhang, Shuo  |u Cyberspace Institute of Advanced Technology, Guangzhou University, Guangzhou 510006, China; ziqiwang00@foxmail.com (Z.W.); liuxinyi@e.gzhu.edu.cn (X.L.) 
700 1 |a Liu, Xinyi  |u Cyberspace Institute of Advanced Technology, Guangzhou University, Guangzhou 510006, China; ziqiwang00@foxmail.com (Z.W.); liuxinyi@e.gzhu.edu.cn (X.L.) 
773 0 |t Big Data and Cognitive Computing  |g vol. 9, no. 8 (2025), p. 216-238 
786 0 |d ProQuest  |t Advanced Technologies & Aerospace Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3243981581/abstract/embedded/H09TXR3UUZB2ISDL?source=fedsrch 
856 4 0 |3 Full Text + Graphics  |u https://www.proquest.com/docview/3243981581/fulltextwithgraphics/embedded/H09TXR3UUZB2ISDL?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3243981581/fulltextPDF/embedded/H09TXR3UUZB2ISDL?source=fedsrch
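
Note: the abstract (field 520 above) names a "focal-attention mechanism" but the record does not describe how it works. The short NumPy sketch below is purely illustrative and is not the authors' implementation; it shows one plausible reading, in which standard attention weights are modulated by a focal-loss-style factor so that already-dominant weights are damped and subtle differences between similar flows get relatively more influence. The function name focal_attention and the gamma parameter are assumptions introduced here for illustration.

# Hypothetical sketch (not the paper's code): focal-style reweighting of
# scaled dot-product attention. gamma = 0 recovers standard attention.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def focal_attention(q, k, v, gamma=2.0):
    """q, k, v: arrays of shape (seq_len, d_model); returns (seq_len, d_model)."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                       # pairwise similarity
    p = softmax(scores, axis=-1)                        # standard attention weights
    focal = (1.0 - p) ** gamma * p                      # damp already-dominant weights
    focal = focal / focal.sum(axis=-1, keepdims=True)   # renormalise each row
    return focal @ v                                    # weighted sum of values

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(8, 16))    # 8 tokens, 16-dimensional embeddings
    print(focal_attention(x, x, x).shape)   # (8, 16)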