Causality-based adversarial attacks for robust GNN modelling with application in fault detection
| Published in: | Reliability Engineering & System Safety, 2024-12, Vol. 252, p. 110464, Article 110464 |
|---|---|
| Main Authors: | , , |
| Format: | Article |
| Language: | English |
| Summary: | Fault detection techniques based on graph neural networks (GNNs) have become a trending topic. Because these models lack robustness, their accuracy depends heavily on the quality of the monitoring data. Numerous robust GNN models have been proposed, yet their accuracy remains low on tasks such as graph-level fault detection. In this work, the authors propose several causality-based adversarial attacks, designed around the principles of causal discovery algorithms, for generating causal graph models and their associated errors. The attacks amplify all possible types of raw errors present in the data so that, together with the proposed adversarial elimination regularization, the trained model remains robust and accurate enough to maintain high fault detection accuracy. A real dataset from a high-speed train braking system serves as the case study. Three typical GNN models, the classical GCN, robust GCN, and median GCN, are used as base models to verify the validity of the modelling framework. The results show that the proposed causality-based adversarial attacks effectively improve the robustness of all base models under low-quality monitoring data. |
| ISSN: | 0951-8320 |
| DOI: | 10.1016/j.ress.2024.110464 |
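
The full paper sits behind the DOI above, so the sketch below is only a generic illustration of the adversarial-training idea the summary describes: perturb the input in the direction that increases the loss, then fit the model on the perturbed graph. The FGSM-style feature attack is a stand-in for the paper's causality-based attacks (which are not reproduced here), and `TinyGCN`, `fgsm_features`, and the step size `eps` are all illustrative assumptions, not the authors' implementation.

```python
# Illustrative only: a generic adversarial-training step for a graph-level GCN
# classifier. The FGSM-style feature perturbation is a stand-in for the paper's
# causality-based attacks; all names and hyperparameters are assumptions.
import torch
import torch.nn.functional as F


class TinyGCN(torch.nn.Module):
    """Two-layer GCN on a dense, symmetrically normalized adjacency matrix."""

    def __init__(self, in_dim: int, hid_dim: int, n_classes: int):
        super().__init__()
        self.lin1 = torch.nn.Linear(in_dim, hid_dim)
        self.lin2 = torch.nn.Linear(hid_dim, n_classes)

    def forward(self, a_norm: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        h = F.relu(a_norm @ self.lin1(x))  # propagate, then nonlinearity
        h = a_norm @ self.lin2(h)
        return h.mean(dim=0)               # mean readout -> graph-level logits


def fgsm_features(model, a_norm, x, y, eps=0.05):
    """One-step feature attack: move x along the sign of the loss gradient."""
    x_adv = x.clone().requires_grad_(True)
    loss = F.cross_entropy(model(a_norm, x_adv).unsqueeze(0), y.unsqueeze(0))
    loss.backward()
    return (x_adv + eps * x_adv.grad.sign()).detach()


# Toy data: one random undirected graph with self-loops.
n, d, c = 8, 16, 2
x = torch.randn(n, d)
a = (torch.rand(n, n) > 0.7).float()
a = ((a + a.t() + torch.eye(n)) > 0).float()  # symmetrize, add self-loops
deg = a.sum(dim=1)
a_norm = a / deg.sqrt().unsqueeze(1) / deg.sqrt().unsqueeze(0)  # D^-1/2 A D^-1/2
y = torch.tensor(1)

model = TinyGCN(d, 32, c)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Adversarial training step: attack first, then fit on the perturbed features.
x_adv = fgsm_features(model, a_norm, x, y)
opt.zero_grad()
loss = F.cross_entropy(model(a_norm, x_adv).unsqueeze(0), y.unsqueeze(0))
loss.backward()
opt.step()
```

Note that the paper additionally pairs its attacks with an adversarial elimination regularization term; the sketch above omits any such regularizer and shows only the plain attack-then-train loop.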