Nonlinear FIR adaptive filters with a gradient adaptive amplitude in the nonlinearity
Published in: IEEE Signal Processing Letters, 2002-08, Vol. 9 (8), pp. 253-255
Format: Article
Language: English
Summary: A nonlinear gradient descent (NGD) learning algorithm with an adaptive amplitude of the nonlinearity is derived for the class of nonlinear finite impulse response (FIR) adaptive filters (dynamical perceptron). This is based on the adaptive amplitude backpropagation (AABP) algorithm for large-scale neural networks. The amplitude of the nonlinear activation function is made gradient adaptive to give the adaptive amplitude nonlinear gradient descent (AANGD) algorithm, making the AANGD suitable for processing nonlinear and nonstationary input signals with a large dynamical range. Experimental results show the AANGD algorithm outperforming the standard NGD algorithm on both colored and nonlinear input with large dynamics. Despite its simplicity, the considered algorithm proves suitable for adaptive filtering of nonlinear and nonstationary signals.
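The summary describes an FIR filter followed by a nonlinearity whose amplitude is itself adapted by gradient descent alongside the filter weights. A minimal sketch of that idea, reconstructed from the abstract alone: the choice of tanh as the activation, the step sizes eta and rho, the filter order, and the test signal are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def aangd(x, d, order=4, eta=0.01, rho=0.01):
    """Adaptive-amplitude nonlinear gradient descent sketch.

    Output model: y(k) = lam * tanh(w . x_k), where both the FIR
    weights w and the amplitude lam are updated by gradient descent
    on the instantaneous squared error (step sizes are assumptions).
    Returns the filter output y and the error signal e.
    """
    w = np.zeros(order)      # FIR weight vector
    lam = 1.0                # adaptive amplitude of the nonlinearity
    y = np.zeros(len(x))
    e = np.zeros(len(x))
    for k in range(order, len(x)):
        xk = x[k - order:k][::-1]        # tap-input vector (most recent first)
        net = w @ xk                     # linear FIR output
        y[k] = lam * np.tanh(net)        # amplitude-scaled nonlinearity
        e[k] = d[k] - y[k]
        # gradient of 0.5*e^2 w.r.t. w and lam gives the two updates
        w = w + eta * e[k] * lam * (1.0 - np.tanh(net) ** 2) * xk
        lam = lam + rho * e[k] * np.tanh(net)
    return y, e

# Toy usage: identify a delayed, amplitude-2 tanh nonlinearity,
# so the filter must grow its amplitude beyond the initial lam = 1.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
d = np.zeros_like(x)
d[4:] = 2.0 * np.tanh(0.5 * x[3:-1])   # desired signal (assumed example)
y, e = aangd(x, d)
```

With these assumptions the squared error shrinks as both the weights and the amplitude adapt; the point of the sketch is only that the amplitude update lets the output range track a target whose dynamics exceed the fixed-amplitude nonlinearity.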
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2002.803001