On the choice of parameters of the cost function in nested modular RNN's

Mandic, D. P. and Chambers, J. A. (1999) On the choice of parameters of the cost function in nested modular RNN's. IEEE Transactions on Neural Networks, 11 (2). pp. 315-322. ISSN 1045-9227

Full text not available from this repository.

Abstract

We address the choice of the coefficients in the cost function of a modular nested recurrent neural-network (RNN) architecture, known as the pipelined recurrent neural network (PRNN). Such a network can cope with the problem of vanishing gradient, experienced in prediction with RNNs. Constraints on the coefficients of the cost function, in the form of a vector norm, are considered. Unlike the previous cost function for the PRNN, which included a forgetting factor motivated by the recursive least squares (RLS) strategy, the proposed forms of cost function provide “forgetting” of the outputs of adjacent modules based upon the network architecture. Such an approach takes into account the number of modules in the PRNN, through the unit norm constraint on the coefficients of the cost function of the PRNN. This is shown to be particularly suitable since, due to the inherent nesting in the PRNN, every module gives its full contribution to the learning process, whereas the unit norm constrained cost function introduces a sense of forgetting into the memory management of the PRNN. The PRNN based upon a modified cost function outperforms existing PRNN schemes in the time series prediction simulations presented.
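The idea described in the abstract can be illustrated with a short sketch. The function below is a hypothetical illustration, not the authors' implementation: it weights the squared prediction errors of the M nested PRNN modules either by a plain RLS-style forgetting factor or by the same exponential weights constrained to unit (Euclidean) norm; the function name, the choice of the Euclidean norm, and the parameter values are assumptions made for illustration.

```python
import numpy as np

def prnn_cost(errors, forgetting=0.9, unit_norm=True):
    """Weighted cost over the M module errors of a PRNN (illustrative sketch).

    errors     : per-module instantaneous prediction errors e_1, ..., e_M
    forgetting : exponential forgetting factor lambda in (0, 1]
    unit_norm  : if True, rescale the weight vector to unit Euclidean norm,
                 mimicking the unit norm constraint discussed in the abstract
    """
    errors = np.asarray(errors, dtype=float)
    M = errors.size
    # exponentially decaying weights across the nested modules: lambda^(i-1)
    w = forgetting ** np.arange(M)
    if unit_norm:
        w = w / np.linalg.norm(w)  # assumed: Euclidean (2-norm) constraint
    # weighted sum of squared module errors
    return float(np.sum(w * errors ** 2))
```

For a modest number of modules the unnormalized weight vector has norm greater than one, so the unit-norm-constrained cost is a rescaled version of the RLS-style cost; the shape of the weighting across modules, and hence the relative contribution of each module, is what the constraint controls.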

Item Type: Article
Faculty \ School: Faculty of Science > School of Computing Sciences
Depositing User: Vishal Gautam
Date Deposited: 07 Mar 2011 14:22
Last Modified: 15 Dec 2022 01:58
URI: https://ueaeprints.uea.ac.uk/id/eprint/23716
DOI: 10.1109/72.839003
