A normalized gradient descent algorithm for nonlinear adaptive filters using a gradient adaptive step size

Mandic, D. P., Hanna, A. I. and Razaz, M. (2001) A normalized gradient descent algorithm for nonlinear adaptive filters using a gradient adaptive step size. IEEE Signal Processing Letters, 8 (11). pp. 295-297. ISSN 1070-9908


Abstract

A fully adaptive normalized nonlinear gradient descent (FANNGD) algorithm for online adaptation of nonlinear neural filters is proposed. An adaptive step size that minimizes the instantaneous output error of the filter is derived using a linearization performed by a Taylor series expansion of the output error. For rigor, the remainder of the truncated Taylor series expansion within the expression for the adaptive learning rate is made adaptive and is updated using gradient descent. The FANNGD algorithm is shown to converge faster than previously introduced algorithms of this kind.
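The scheme the abstract describes can be illustrated with a minimal sketch: a single-neuron nonlinear filter updated by normalized gradient descent, where the Taylor-remainder term `C` in the learning-rate denominator is itself adapted by gradient descent on the instantaneous squared error. The function name, parameter values, and the exact form of the `C` gradient below are illustrative assumptions based on the abstract, not the paper's published equations.

```python
import numpy as np

def logistic(v):
    return 1.0 / (1.0 + np.exp(-v))

def fanngd_sketch(x_seq, d_seq, n_taps=4, mu=0.3, rho=1e-3, c0=1.0):
    """Illustrative FANNGD-style adaptation (a sketch, not the paper's
    exact algorithm): nonlinear filter y(k) = logistic(x(k)^T w(k)) with
    normalized step size eta(k) = mu / (C(k) + phi'(k)^2 ||x(k)||^2),
    where the remainder term C(k) is adapted by gradient descent."""
    w = np.zeros(n_taps)
    C = c0
    prev = None  # quantities from step k-1 needed for the C gradient
    errors = []
    for k in range(len(d_seq)):
        x = np.asarray(x_seq[k])        # input vector, shape (n_taps,)
        net = x @ w
        y = logistic(net)
        e = d_seq[k] - y                # instantaneous output error
        phi_p = y * (1.0 - y)           # derivative of the logistic at net

        # Adapt C: chain rule through the previous weight update,
        # dE(k)/dC(k-1) = -e(k) phi'(k) x(k)^T dw(k)/dC(k-1).
        if prev is not None:
            e_p, phi_pp, x_p, denom_p = prev
            dw_dC = -mu * e_p * phi_pp * x_p / (denom_p ** 2)
            grad_C = -e * phi_p * (x @ dw_dC)
            C = max(C - rho * grad_C, 1e-6)  # keep the step size bounded

        denom = C + (phi_p ** 2) * (x @ x)
        eta = mu / denom                # normalized adaptive step size
        w = w + eta * e * phi_p * x     # gradient-descent weight update

        prev = (e, phi_p, x, denom)
        errors.append(e)
    return w, C, np.array(errors)
```

On a simple nonlinear system-identification task (data generated through the same logistic nonlinearity), the output error of this sketch decays over a few hundred samples while `C` stays positive, which is the qualitative behavior the abstract claims for the adaptive learning rate.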

Item Type: Article
Faculty \ School: Faculty of Science > School of Computing Sciences
Depositing User: Vishal Gautam
Date Deposited: 07 Mar 2011 14:17
Last Modified: 17 Mar 2020 17:56
URI: https://ueaeprints.uea.ac.uk/id/eprint/23881
DOI: 10.1109/97.969448