Towards an optimal (PRNN) based nonlinear predictor

Chambers, J. A. and Mandic, D. P. (1999) Towards an optimal (PRNN) based nonlinear predictor. IEEE Transactions on Neural Networks, 10 (6). pp. 1435-1442.

Full text not available from this repository.

Abstract

We present an approach for selecting the optimal parameters of the pipelined recurrent neural network (PRNN) in the paradigm of nonlinear and nonstationary signal prediction. We consider the role of nesting, which is inherent to the PRNN architecture. The number of nested modules needed for a given prediction task, and their contribution toward the final prediction gain, give a thorough insight into the way the PRNN performs and offer solutions for the optimization of its parameters. In particular, nesting allows the forgetting factor in the cost function of the PRNN to exceed unity, so that it becomes an emphasis factor. This compensates for the small contribution of the distant modules to the prediction process, which is due to nesting, and helps to circumvent the vanishing gradient problem experienced in RNNs used for prediction. The PRNN is shown to outperform the linear least mean square and recursive least squares predictors, as well as previously proposed PRNN schemes, at no additional computational cost.
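For illustration only, the sketch below shows the exponentially weighted cost over the nested PRNN modules that the abstract refers to, where a per-module weighting factor greater than unity acts as an emphasis factor rather than a forgetting factor. The function name, default value, and example numbers are hypothetical and not taken from the paper; this is a minimal sketch of the weighting idea, not the authors' implementation.

```python
import numpy as np

def prnn_module_cost(errors, weighting=1.2):
    """Weighted cost over the M nested PRNN modules.

    errors[i] is the instantaneous prediction error of module i+1,
    with module 1 closest to the overall output. The cost is
    sum_i weighting**(i-1) * errors[i]**2; with weighting > 1 the
    factor emphasises the distant modules, whose contribution is
    otherwise small because of nesting.
    """
    errors = np.asarray(errors, dtype=float)
    weights = weighting ** np.arange(len(errors))  # lambda^(i-1), i = 1..M
    return float(np.sum(weights * errors ** 2))

# Hypothetical example: five modules with identical errors.
e = [0.1] * 5
print(prnn_module_cost(e, weighting=0.9))  # conventional forgetting factor (< 1)
print(prnn_module_cost(e, weighting=1.2))  # emphasis factor (> 1)
```

With identical module errors, the second call weights the distant modules more heavily, which is the compensation effect described in the abstract.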

Item Type: Article
Faculty \ School: Faculty of Science > School of Computing Sciences
Depositing User: EPrints Services
Date Deposited: 01 Oct 2010 13:42
Last Modified: 24 Sep 2024 10:28
URI: https://ueaeprints.uea.ac.uk/id/eprint/3759
DOI: 10.1109/72.809088
