Relating the Slope of the Activation Function and the Learning Rate Within a Recurrent Neural Network

Mandic, D. P. and Chambers, J. A. (1999) Relating the Slope of the Activation Function and the Learning Rate Within a Recurrent Neural Network. Neural Computation, 11 (5). pp. 1069-1077. ISSN 0899-7667

Full text not available from this repository.

Abstract

A relationship between the learning rate η in the learning algorithm and the slope β in the nonlinear activation function is provided for a class of recurrent neural networks (RNNs) trained by the real-time recurrent learning (RTRL) algorithm. It is shown that an arbitrary RNN can be obtained from a referent RNN by imposing deterministic rules on its weights and learning rate. Such relationships reduce the number of degrees of freedom in the nonlinear optimization task of finding the optimal RNN parameters.
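The equivalence described in the abstract can be checked numerically. The sketch below uses a minimal single-neuron recurrent network of my own construction (not necessarily the paper's exact architecture), trained by RTRL: a neuron with activation slope β, weights w, and learning rate η is compared against a referent neuron with slope 1, weights βw, and learning rate β²η, which is the deterministic rescaling rule known from the feedforward case. The two output trajectories coincide.

```python
# Hedged numerical sketch: a one-neuron RNN trained by real-time recurrent
# learning (RTRL), used to illustrate the slope/learning-rate equivalence.
# The network, inputs, and teaching signal are illustrative assumptions.
import numpy as np

def rtrl_run(beta, eta, w0, xs, ds):
    """Train y(k) = tanh(beta * (w_x*x(k) + w_y*y(k-1))) by RTRL;
    return the output trajectory."""
    w = np.array(w0, dtype=float)       # weights [w_x, w_y]
    y_prev = 0.0                        # fed-back output y(k-1)
    pi = np.zeros(2)                    # RTRL sensitivities dy/dw
    ys = []
    for x, d in zip(xs, ds):
        u = np.array([x, y_prev])       # neuron inputs
        v = w @ u                       # net activation
        y = np.tanh(beta * v)
        # RTRL sensitivity recursion: pi <- beta*(1 - y^2)*(u + w_y*pi)
        pi = beta * (1.0 - y**2) * (u + w[1] * pi)
        w = w + eta * (d - y) * pi      # gradient step on the error e = d - y
        y_prev = y
        ys.append(y)
    return np.array(ys)

rng = np.random.default_rng(0)
xs = rng.standard_normal(200)           # arbitrary input sequence
ds = np.sin(np.arange(200) / 10.0)      # arbitrary teaching signal
beta, eta, w0 = 2.0, 0.05, [0.3, -0.2]

# Original network (slope beta) vs. referent network (slope 1) with
# rescaled weights beta*w and rescaled learning rate beta**2 * eta.
y_orig = rtrl_run(beta, eta, w0, xs, ds)
y_ref = rtrl_run(1.0, beta**2 * eta, [beta * w0[0], beta * w0[1]], xs, ds)
print(np.max(np.abs(y_orig - y_ref)))   # agreement up to floating-point error
```

In this setup the referent network's sensitivities are exactly 1/β times the original network's, so the rescaled learning rate β²η reproduces the same weight trajectory up to the factor β, and hence the same outputs — this is the sense in which the rules remove a degree of freedom from the optimization.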

Item Type: Article
Faculty \ School: Faculty of Science > School of Computing Sciences
Related URLs:
Depositing User: Vishal Gautam
Date Deposited: 10 Mar 2011 09:36
Last Modified: 21 Apr 2020 21:22
URI: https://ueaeprints.uea.ac.uk/id/eprint/23718
DOI: 10.1162/089976699300016340
