Bearing fault diagnosis method using a spatio-temporal neural network based on feature transfer learning
Published in: Measurement Science & Technology, 2023-01, Vol. 34 (1), p. 15119
Main Authors:
Format: Article
Language: English
Summary: An intelligent bearing fault diagnosis method requires a large quantity of labeled data. However, in an actual engineering environment, only a tiny amount of unlabeled data can be collected. To solve this problem, we construct a spatio-temporal neural network (STN) model by multi-layer fusion of convolutional neural network (CNN) and long short-term memory (LSTM) network features. Then, a model based on feature transfer is constructed and the STN is applied as the feature extractor of the network. Finally, the Case Western Reserve University bearing dataset is employed to verify the performance of our proposed model, and the influence of different neural network feature extractors (CNN, recurrent neural network, LSTM network, STN) and several feature transfer measures [correlation alignment, multiple kernel maximum mean discrepancy, joint maximum mean discrepancy, and discriminative joint probability maximum mean discrepancy (DJP-MMD)] on the accuracy of the model is compared. The results show that the diagnostic accuracy of the proposed method is over 98%, and the diagnostic accuracy can be maintained at around 99% in most cases when the signal-to-noise ratio (SNR) is 10 dB. When the SNR is lower than 2 dB, the accuracy of the STN-DJPMMD model is still over 88%.
ISSN: 0957-0233, 1361-6501
DOI: 10.1088/1361-6501/ac9078
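
The summary above describes two ingredients: a spatio-temporal feature extractor that fuses CNN and LSTM features, and an MMD-style discrepancy penalty that aligns source-domain (labeled) and target-domain (unlabeled) features. The sketch below illustrates that general idea in PyTorch; all layer sizes, the `STNFeatureExtractor` name, and the single-kernel Gaussian MMD are illustrative assumptions, not the authors' implementation (the paper compares richer measures such as MK-MMD, JMMD, and DJP-MMD).

```python
# Hypothetical sketch: CNN + LSTM feature fusion with an MMD-style transfer loss.
import torch
import torch.nn as nn


class STNFeatureExtractor(nn.Module):
    """CNN branch for local (spatial) features of the vibration signal, LSTM
    branch for temporal dependencies; the two are fused by concatenation."""

    def __init__(self, in_channels=1, cnn_channels=16, lstm_hidden=32, feat_dim=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, cnn_channels, kernel_size=16, stride=2, padding=7),
            nn.BatchNorm1d(cnn_channels),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(64),            # fixed-length spatial summary
        )
        self.lstm = nn.LSTM(cnn_channels, lstm_hidden, batch_first=True)
        self.fuse = nn.Linear(cnn_channels * 64 + lstm_hidden, feat_dim)

    def forward(self, x):                         # x: (batch, 1, signal_length)
        c = self.cnn(x)                           # (batch, cnn_channels, 64)
        _, (h, _) = self.lstm(c.transpose(1, 2))  # temporal features over CNN maps
        fused = torch.cat([c.flatten(1), h[-1]], dim=1)
        return self.fuse(fused)                   # shared spatio-temporal feature


def mmd_loss(source_feat, target_feat, sigma=1.0):
    """Single-kernel Gaussian MMD between source and target feature batches
    (a simplified stand-in for the MK-MMD / JMMD / DJP-MMD measures)."""
    def gaussian_kernel(a, b):
        return torch.exp(-torch.cdist(a, b) ** 2 / (2 * sigma ** 2))
    k_ss = gaussian_kernel(source_feat, source_feat).mean()
    k_tt = gaussian_kernel(target_feat, target_feat).mean()
    k_st = gaussian_kernel(source_feat, target_feat).mean()
    return k_ss + k_tt - 2 * k_st


# Usage sketch: labeled source-domain batch, unlabeled target-domain batch.
extractor = STNFeatureExtractor()
classifier = nn.Linear(64, 10)                    # e.g. 10 bearing health states
xs, ys = torch.randn(8, 1, 1024), torch.randint(0, 10, (8,))
xt = torch.randn(8, 1, 1024)
fs, ft = extractor(xs), extractor(xt)
loss = nn.functional.cross_entropy(classifier(fs), ys) + mmd_loss(fs, ft)
loss.backward()
```

Minimizing the classification loss on labeled source data jointly with the discrepancy term pushes the extractor toward features that are both discriminative and domain-invariant, which is the premise behind using the STN as the feature extractor in a transfer-learning setup.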