A novel method for noise reduction of blade tip timing signals based on sparse representation and dictionary learning
Published in: Journal of Low Frequency Noise, Vibration and Active Control, 2024-03, Vol. 43 (1), pp. 437-454
Main Authors: , , ,
Format: Article
Language: English
Summary: Blade Tip Timing (BTT) technology is a non-contact blade vibration measurement method valued for its efficiency and convenience. However, because the BTT signal is under-sampled, the method can only measure part of the vibration information. Various BTT spectral reconstruction algorithms have therefore been developed that exploit this under-sampling feature and the sparsity of the blade vibration spectrum. Because the blades operate in a harsh environment, the measured BTT signal typically contains high-level Gaussian noise, which significantly reduces the frequency-domain sparsity and can even alias the vibration modes, degrading the performance of sparse reconstruction algorithms. In this paper, a multi-sequence BTT signal noise reduction method based on sparse representation is proposed to suppress Gaussian noise. By appropriately modifying the constraint in the ℓ0 optimization problem, the method filters out the noise elements in the sparse vector of the BTT signal; denoising is then completed by the sparse inverse transformation. Furthermore, a global average-based K-singular value decomposition dictionary learning algorithm (GA-K-SVD) is proposed to generate an over-complete sparse dictionary adapted to the original signal, further improving the effectiveness of the noise reduction. Finally, simulations and experiments are carried out to verify the effectiveness of the proposed noise reduction method and the resulting performance improvement of the sparse reconstruction algorithm.
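The summary only sketches the approach, but the underlying idea, representing the noisy signal sparsely over an over-complete dictionary under an error constraint tied to the noise level and then reconstructing from the retained coefficients, can be illustrated in a few lines. The sketch below is a generic, assumed illustration (over-complete DCT dictionary plus error-constrained Orthogonal Matching Pursuit), not the authors' GA-K-SVD method; all function names, the dictionary choice, and the test parameters are hypothetical.

```python
import numpy as np

def overcomplete_dct_dictionary(n, n_atoms):
    """Over-complete DCT-style dictionary with unit-norm atoms (assumed choice)."""
    t = np.arange(n)
    k = np.arange(n_atoms)
    D = np.cos(np.pi * np.outer(t + 0.5, k) / n_atoms)      # shape (n, n_atoms)
    D[:, 1:] -= D[:, 1:].mean(axis=0, keepdims=True)         # keep column 0 as a DC atom
    D /= np.linalg.norm(D, axis=0, keepdims=True)
    return D

def omp_denoise(y, D, noise_std, max_atoms=None):
    """Error-constrained OMP: min ||x||_0  s.t.  ||y - D x||_2 <= eps,
    with eps tied to the assumed Gaussian noise level."""
    n, n_atoms = D.shape
    if max_atoms is None:
        max_atoms = n // 4
    eps = np.sqrt(n) * noise_std            # stop once the residual reaches the noise floor
    residual = y.astype(float).copy()
    support, coeffs = [], np.zeros(0)
    while np.linalg.norm(residual) > eps and len(support) < max_atoms:
        idx = int(np.argmax(np.abs(D.T @ residual)))   # atom most correlated with residual
        if idx in support:                             # no new atom helps; stop early
            break
        support.append(idx)
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)  # least squares on support
        residual = y - D[:, support] @ coeffs
    x = np.zeros(n_atoms)
    x[support] = coeffs
    return D @ x, x                          # denoised signal (sparse inverse transform), sparse code

# Usage on a toy noisy two-tone signal (all parameters are illustrative assumptions)
rng = np.random.default_rng(0)
n = 128
t = np.arange(n)
clean = np.cos(2 * np.pi * 5 * t / n) + 0.5 * np.cos(2 * np.pi * 17 * t / n)
noise_std = 0.3
noisy = clean + noise_std * rng.standard_normal(n)

D = overcomplete_dct_dictionary(n, 4 * n)
denoised, code = omp_denoise(noisy, D, noise_std)
print("RMSE before:", np.sqrt(np.mean((noisy - clean) ** 2)))
print("RMSE after :", np.sqrt(np.mean((denoised - clean) ** 2)))
```

In the paper's setting, the fixed DCT dictionary would presumably be replaced by one learned from the multi-sequence BTT data (the GA-K-SVD step), which is what makes the dictionary adaptable to the original signal.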
ISSN: 1461-3484, 2048-4046
DOI: 10.1177/14613484231200856