Stability analysis of the bias compensated LMS algorithm
Published in: Digital Signal Processing, 2024-04, Vol. 147, Article 104395
Main Authors: , ,
Format: Article
Language: English
Summary: The presence of noise is a pervasive issue that significantly impacts the performance of adaptive filtering algorithms. To address this challenge, the bias compensation technique has recently emerged, involving the incorporation of an additional term into the update equation. Despite the apparent simplicity of the bias-compensated least-mean-squares algorithm, conducting a comprehensive theoretical analysis of its performance is a complex task. In this paper, we undertake an exact expectation analysis to demonstrate the asymptotic unbiasedness of the algorithm, even when disregarding the commonly assumed theory of independence between adaptive coefficients and input data. Furthermore, to improve the understanding of the algorithm's stability, we employ a stochastic model that assumes independence between the radial and angular distributions of the input vector. The resulting model is intricate, necessitating heuristic approximations to derive practical insights. Notably, our analysis reveals that the upper bound on the step size of the bias-compensated least-mean-squares algorithm consistently remains smaller than that of the least-mean-squares algorithm. These findings gain robust support from extensive simulations.
ISSN: 1051-2004, 1095-4333
DOI: 10.1016/j.dsp.2024.104395
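The summary describes the bias-compensated update only in words. As a hedged illustration, not the paper's exact formulation, one common form of bias compensation for LMS with noisy input adds a term mu * sigma_eta2 * w to the standard update, which cancels the shrinkage bias caused by input-noise power. The NumPy sketch below uses an assumed system-identification setup (plant w_o, clean input x, noisy measured input u = x + eta; all names hypothetical) to contrast plain LMS with this bias-compensated variant:

```python
import numpy as np

rng = np.random.default_rng(0)

def run_filter(d, u, n_taps, mu, sigma_eta2):
    # Bias-compensated LMS (one common form, assumed here):
    #   w <- w + mu * e * u_k + mu * sigma_eta2 * w
    # The last term counteracts the bias that input noise induces in
    # plain LMS; setting sigma_eta2 = 0 recovers the standard LMS update.
    w = np.zeros(n_taps)
    for k in range(n_taps - 1, len(d)):
        u_k = u[k - n_taps + 1:k + 1][::-1]  # regressor, most recent sample first
        e = d[k] - w @ u_k                   # a priori error on noisy input
        w = w + mu * e * u_k + mu * sigma_eta2 * w
    return w

# Hypothetical setup: unknown FIR plant w_o driven by clean input x,
# but the adaptive filter only observes the noisy input u = x + eta.
w_o = np.array([1.0, -0.5, 0.25, 0.1])
N = 50000
x = rng.standard_normal(N)
sigma_eta2 = 0.25                                   # known input-noise variance
u = x + np.sqrt(sigma_eta2) * rng.standard_normal(N)
d = np.convolve(x, w_o)[:N] + 0.01 * rng.standard_normal(N)

mu = 0.002  # small step size; the paper shows the BC-LMS bound is tighter than for LMS
w_lms = run_filter(d, u, len(w_o), mu, 0.0)         # plain LMS: biased toward zero
w_bc = run_filter(d, u, len(w_o), mu, sigma_eta2)   # bias-compensated LMS

print(np.linalg.norm(w_lms - w_o))  # residual bias of plain LMS
print(np.linalg.norm(w_bc - w_o))   # should be closer to w_o
```

With unit-power input and sigma_eta2 = 0.25, plain LMS converges in expectation toward roughly (1 / (1 + sigma_eta2)) * w_o, so its estimate is visibly shrunken, while the compensated recursion has w_o as its fixed point in the mean. This is only a sketch under the stated assumptions; the paper's exact analysis does not rely on the usual independence assumptions used to justify such mean recursions.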