
On convergence and optimality of maximum-likelihood APA



Bibliographic Details
Published in: arXiv.org, 2023-10
Main Authors: Jalali, Shirin, Nuzman, Carl, Sun, Yue
Format: Article
Language: English
Description
Summary: The affine projection algorithm (APA) is a well-known algorithm in adaptive filtering applications such as audio echo cancellation. APA relies on three parameters: \(P\) (projection order), \(\mu\) (step size), and \(\delta\) (regularization parameter). It is known that running APA with a fixed set of parameters leads to a tradeoff between convergence speed and accuracy. Therefore, various methods for adaptively setting the parameters have been proposed in the literature. Inspired by maximum likelihood (ML) estimation, we derive a new ML-based approach for adaptively setting the parameters of APA, which we refer to as ML-APA. For memoryless Gaussian inputs, we fully characterize the expected misalignment error of ML-APA as a function of the iteration number and show that it converges to zero as \(O({1\over t})\). We further prove that the achieved error is asymptotically optimal. ML-APA updates its estimate once every \(P\) samples. We also propose incremental ML-APA (IML-APA), which updates the coefficients at every time step and outperforms ML-APA in our simulations. Our simulation results verify the analytical conclusions for memoryless inputs and show that the new algorithms also perform well for strongly correlated input signals.
ISSN: 2331-8422
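
For reference, the classic APA recursion that the abstract's parameters \(P\), \(\mu\), and \(\delta\) refer to is sketched below in Python/NumPy. This is only the standard baseline update, not the ML-based parameter adaptation (ML-APA/IML-APA) introduced in the paper, which is specified in the full text; the function name `apa_update`, the toy system-identification loop, and the numerical settings are illustrative assumptions.

```python
# Minimal sketch of the standard affine projection algorithm (APA) update,
# showing the roles of P (projection order), mu (step size) and delta
# (regularization). The paper's ML-based adaptation of these parameters is
# not reproduced here.
import numpy as np

def apa_update(w, X, d, mu=0.5, delta=1e-3):
    """One APA iteration.

    w     : (M,)   current filter coefficient estimate
    X     : (M, P) columns are the P most recent input regressors
    d     : (P,)   corresponding desired samples
    mu    : step size
    delta : regularization added to X^T X before solving
    """
    P = X.shape[1]
    e = d - X.T @ w  # a priori errors on the P constraints
    w_new = w + mu * X @ np.linalg.solve(X.T @ X + delta * np.eye(P), e)
    return w_new, e

# Toy usage: identify an unknown FIR system from noisy observations.
rng = np.random.default_rng(0)
M, P, n_samples = 8, 4, 2000
w_true = rng.standard_normal(M)
x = rng.standard_normal(n_samples)
w = np.zeros(M)
for t in range(M + P, n_samples):
    # Build the M x P regressor matrix from the most recent inputs.
    X = np.column_stack([x[t - p - M:t - p][::-1] for p in range(P)])
    d = X.T @ w_true + 0.01 * rng.standard_normal(P)
    w, _ = apa_update(w, X, d)
print("relative misalignment:", np.linalg.norm(w - w_true) / np.linalg.norm(w_true))
```

With a fixed \(\mu\) and \(\delta\) as in this sketch, larger step sizes speed up convergence but raise the steady-state error, which is the tradeoff the paper's adaptive ML-based parameter selection is designed to avoid.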