On Assessing Vulnerabilities of the 5G Networks to Adversarial Examples
Published in: IEEE Access, 2022, Vol. 10, pp. 126285-126303
Format: Article
Language: English
Summary: The use of artificial intelligence and machine learning is recognized as a key enabler for 5G mobile networks, allowing service providers to tackle network complexity and to ensure security, reliability, and allocation of the necessary resources to their customers in a dynamic, robust, and trustworthy way. The dependence of future-generation networks on the accurate and timely performance of their artificial intelligence components means that a disturbance in the functionality of these components may have a negative impact on the entire network. As a result, there is increasing concern about the vulnerability of intelligent machine-learning-driven frameworks to adversarial effects. In this study, we evaluate various adversarial example generation attacks against multiple artificial intelligence and machine learning models which could potentially be deployed in future 5G networks. First, we describe multiple use cases for which attacks on machine learning components are conceivable, including the models employed and the data used for their training. After that, the attack algorithms, their implementations, and their adjustments to the target models are summarised. Finally, the attacks implemented for the aforementioned use cases are evaluated based on the deterioration of the objective functions optimised by the target models.
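Note: as an illustration of the kind of attack the summary describes, the sketch below shows the Fast Gradient Sign Method (FGSM), a standard adversarial-example generation technique, applied to a toy NumPy logistic-regression classifier. This is not code from the paper; the model, weights, and data are all made up for demonstration, and the "deterioration of the objective function" is simply the increase in the classifier's loss after the perturbation.

```python
import numpy as np

# Hypothetical toy model: logistic regression on a synthetic feature
# vector, standing in for an ML component of a 5G network.
rng = np.random.default_rng(0)
w = rng.normal(size=8)          # assumed fixed model weights
b = 0.1                         # assumed bias term
x = rng.normal(size=8)          # a clean input sample
y = 1.0                         # its true label

def loss(x):
    """Binary cross-entropy of the toy classifier on (x, y)."""
    p = 1.0 / (1.0 + np.exp(-(w @ x + b)))
    eps = 1e-12                 # numerical guard for log()
    return -(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def grad_x(x):
    """Analytic gradient of the loss with respect to the input x."""
    p = 1.0 / (1.0 + np.exp(-(w @ x + b)))
    return (p - y) * w

# FGSM: perturb the input by one signed-gradient step of size epsilon,
# which pushes the loss (the model's objective) upward.
epsilon = 0.5
x_adv = x + epsilon * np.sign(grad_x(x))

print(f"clean loss:       {loss(x):.4f}")
print(f"adversarial loss: {loss(x_adv):.4f}")
```

Because the step follows the sign of the input gradient, the loss on the perturbed sample is at least as large as on the clean one; the paper's evaluations measure exactly this kind of objective-function deterioration, but against realistic 5G models and datasets rather than this toy setup.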
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3225921