Pearlmutter, Barak A. (1995) Gradient calculations for dynamic recurrent neural networks: A survey. IEEE Transactions on Neural Networks, 6 (5). pp. 1212-1228. ISSN 1045-9227
Abstract
We survey learning algorithms for recurrent neural networks with hidden units, and put the various techniques into a common framework. We discuss fixed point learning algorithms, namely recurrent back propagation and deterministic Boltzmann Machines, and non-fixed point algorithms, namely back propagation through time, Elman's history cutoff, and Jordan's output feedback architecture. Forward propagation, an on-line technique that uses adjoint equations, and variations thereof, are also discussed. In many cases, the unified presentation leads to generalizations of various sorts. We discuss advantages and disadvantages of temporally continuous neural networks in contrast to clocked ones, and continue with some "tricks of the trade" for training, using, and simulating continuous time and recurrent neural networks. We present some simulations, and at the end, address issues of computational complexity and learning speed.
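As a concrete illustration of one of the non-fixed point techniques the abstract names, below is a minimal sketch of back propagation through time for a clocked recurrent network. The network form (tanh units, a single weight matrix over state and input), the summed squared error, and all names are assumptions made for this example, not the paper's notation.

```python
import numpy as np

def bptt(W, xs, targets):
    """Gradient of E = 0.5 * sum_t ||y_t - target_t||^2 with respect to W,
    for the unrolled recurrence y_t = tanh(W @ [y_{t-1}; x_t])."""
    n = targets.shape[1]                 # number of state units
    T = len(xs)
    ys = [np.zeros(n)]                   # y_0: initial state
    for t in range(T):                   # forward pass: unroll in time
        ys.append(np.tanh(W @ np.concatenate([ys[-1], xs[t]])))

    dW = np.zeros_like(W)
    delta = np.zeros(n)                  # error propagated back from later times
    for t in reversed(range(T)):         # backward pass through time
        delta = delta + (ys[t + 1] - targets[t])   # inject dE/dy_t at time t
        da = delta * (1.0 - ys[t + 1] ** 2)        # back through the tanh
        dW += np.outer(da, np.concatenate([ys[t], xs[t]]))
        delta = W[:, :n].T @ da          # pass the gradient to y_{t-1}
    return dW
```

On a small random instance, the returned gradient can be checked against central finite differences, which is a standard sanity test for trajectory-learning code of this kind.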
Item Type: | Article
---|---
Keywords: | recurrent neural networks; backpropagation through time; real time recurrent learning; trajectory learning
Academic Unit: | Faculty of Science and Engineering > Computer Science; Faculty of Science and Engineering > Research Institutes > Hamilton Institute
Item ID: | 5490
Depositing User: | Barak Pearlmutter
Date Deposited: | 14 Oct 2014 14:40
Journal or Publication Title: | IEEE Transactions on Neural Networks
Publisher: | Institute of Electrical and Electronics Engineers (IEEE)
Refereed: | Yes
URI: |
Use Licence: | This item is available under a Creative Commons Attribution-NonCommercial-ShareAlike Licence (CC BY-NC-SA).