A Novel Multimodal Deep Learning Framework for Encrypted Traffic Classification
Published in: IEEE/ACM Transactions on Networking, 2023-06, Vol. 31 (3), pp. 1369-1384
Format: Article
Language: English
Summary: Traffic classification is essential for cybersecurity maintenance and network management, and has been widely used in QoS (Quality of Service) guarantees, intrusion detection, and other tasks. Recently, with the spread of SSL/TLS encryption protocols across the modern Internet, traditional payload-based classification methods are no longer effective. Some researchers have used machine learning methods to model the flow features of encrypted traffic (e.g., message type, length sequence, statistical features) and achieved good results in some cases. However, these high-level hand-designed features cannot be used for more fine-grained operations and may lose important information, thus reducing classification accuracy. To overcome this limitation, this paper designs a novel multimodal deep learning framework for encrypted traffic classification called PEAN. PEAN takes the raw bytes and the length sequence as input and uses the self-attention mechanism to learn the deep relationships among the network packets in a biflow. Furthermore, unsupervised pre-training is introduced to enhance PEAN's ability to characterize network packets. Experiments on a real trace set captured in a large data center demonstrate the effectiveness of PEAN, which achieves better results than state-of-the-art methods.
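The summary describes PEAN's architecture only at a high level. The toy Python sketch below illustrates the general idea of fusing two modalities (raw payload bytes and packet length) into per-packet vectors and relating the packets of a biflow with scaled dot-product self-attention. All function names, dimensions, the fusion scheme, and the identity Q/K/V projections are illustrative assumptions, not the published PEAN implementation.

```python
# Toy sketch of the multimodal idea: per-packet byte + length features,
# self-attention across the packets of a biflow. Illustrative only --
# the real PEAN model learns projections and uses unsupervised pre-training.
import math

def packet_features(raw_bytes, length, n_bytes=8):
    """Fuse two modalities into one vector: the first n_bytes of the
    payload (zero-padded, scaled to [0, 1]) plus the packet length
    (scaled by 1500, a typical Ethernet MTU)."""
    head = list(raw_bytes[:n_bytes]) + [0] * max(0, n_bytes - len(raw_bytes))
    return [b / 255.0 for b in head] + [length / 1500.0]

def self_attention(X):
    """Plain scaled dot-product self-attention over the packet sequence X,
    with identity Q/K/V projections to keep the sketch dependency-free."""
    d = len(X[0])
    out = []
    for q in X:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in X]
        m = max(scores)                      # subtract max for numerical stability
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]  # softmax attention weights
        out.append([sum(w * v[j] for w, v in zip(weights, X)) for j in range(d)])
    return out

# A toy biflow: (payload bytes, packet length) pairs. The byte values are
# merely TLS-record-like placeholders, not real captured traffic.
biflow = [(b"\x16\x03\x01\x02\x00", 517),
          (b"\x16\x03\x03\x00\x7a", 1400),
          (b"\x17\x03\x03\x01\x00", 300)]

X = [packet_features(p, ln) for p, ln in biflow]
Z = self_attention(X)                                 # contextualized packets
flow_repr = [sum(col) / len(Z) for col in zip(*Z)]    # mean-pool into one flow vector
print(len(flow_repr))                                 # -> 9 (8 byte features + 1 length)
```

A real classifier would feed `flow_repr` (or the full sequence `Z`) into a learned classification head; here mean-pooling simply shows how per-packet representations collapse into a single flow-level vector.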
ISSN: 1063-6692, 1558-2566
DOI: 10.1109/TNET.2022.3215507