A Lightweight and Adaptive Knowledge Distillation Framework for Remaining Useful Life Prediction
Published in: IEEE Transactions on Industrial Informatics, 2023-08, Vol. 19 (8), pp. 1-11
Format: Article
Language: English
Summary: For prognostics and health management of industrial systems, machine remaining useful life (RUL) prediction is an essential task. While deep learning-based methods have achieved great success in RUL prediction, large-scale neural networks remain difficult to deploy on edge devices owing to constraints on memory capacity and computing power. In this paper, we propose a lightweight and adaptive knowledge distillation (KD) framework to alleviate this problem. First, multiple teacher models are compressed into a student model through KD to improve industrial prediction accuracy. Second, a dynamic exiting method is studied to enable adaptive inference on the distilled student model. Finally, we develop a reparameterization scheme to further compress the student network. Experiments on two turbofan engine degradation datasets and a bearing degradation dataset demonstrate that our method significantly outperforms state-of-the-art KD methods and equips the distilled model with adaptive inference ability.
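The summary describes compressing multiple teacher models into one student via knowledge distillation. As a rough illustration only (this is not the paper's actual loss — the equal teacher weighting, the squared-error terms, and the balancing factor `alpha` are all assumptions for a regression task like RUL), a multi-teacher distillation objective for a single prediction might be sketched as:

```python
def multi_teacher_kd_loss(student_pred, teacher_preds, target, alpha=0.5):
    """Hypothetical multi-teacher KD loss for RUL regression.

    Combines a supervised term against the ground-truth RUL with a
    distillation term pulling the student toward the teachers' mean
    prediction; alpha balances the two terms.
    """
    supervised = (student_pred - target) ** 2
    mean_teacher = sum(teacher_preds) / len(teacher_preds)  # equal weighting assumed
    distill = (student_pred - mean_teacher) ** 2
    return alpha * supervised + (1 - alpha) * distill


# Example: student predicts 50 cycles, two teachers predict 48 and 52,
# true RUL is 55; only the supervised term contributes here since the
# student already matches the teachers' mean.
loss = multi_teacher_kd_loss(50.0, [48.0, 52.0], 55.0, alpha=0.5)
```

In practice a framework like the one summarized above would apply such a loss over batches of degradation sequences, and the paper's dynamic exiting and reparameterization components are separate mechanisms not shown here.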
ISSN: 1551-3203, 1941-0050
DOI: 10.1109/TII.2022.3224969