Convergence Analysis of Recurrent Neural Networks
Network Theory and Applications
by Zhang Yi

Hardcover Published: 30th November 2003
ISBN: 9781402076947
Number Of Pages: 233


Other Available Editions: Paperback, published 14th September 2013

Since Hopfield's outstanding and pioneering research work on recurrent neural networks (RNNs) in the early 1980s, neural networks have rekindled strong interest among scientists and researchers. Recent years have seen remarkable advances in research and development on RNNs, in theory as well as in applications, and the field is now maturing into a complete and independent subject. From theory to application, from software to hardware, new and exciting results are emerging day after day, reflecting the keen interest RNNs have instilled in everyone, from researchers to practitioners.

RNNs contain feedback connections among their neurons, which has led rather naturally to RNNs being regarded as dynamical systems. RNNs can be described by continuous-time differential systems, discrete-time systems, or functional differential systems, and more generally in terms of nonlinear systems. Thus, RNNs have at their disposal a huge set of mathematical tools from dynamical systems theory, which have turned out to be very useful in enabling a rigorous analysis of RNNs.
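The dynamical-systems view described above can be illustrated with a minimal sketch (not taken from the book): a continuous-time Hopfield-type network dx/dt = -x + W·tanh(x) + b, integrated with forward Euler. The weight matrix, input, initial state, and step size below are illustrative assumptions; with a symmetric W of spectral norm less than one, the trajectory converges to a unique equilibrium, which is the kind of behaviour the book's convergence analysis makes rigorous.

```python
import numpy as np

def simulate_hopfield(W, b, x0, dt=0.01, steps=5000):
    """Forward-Euler integration of the Hopfield-type ODE
    dx/dt = -x + W @ tanh(x) + b."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + W @ np.tanh(x) + b)
    return x

# Illustrative symmetric weight matrix with spectral norm < 1,
# a standard sufficient (small-gain) condition for a unique,
# globally exponentially stable equilibrium.
W = np.array([[0.0,  0.3, -0.2],
              [0.3,  0.0,  0.1],
              [-0.2, 0.1,  0.0]])
b = np.array([0.2, -0.1, 0.3])

x_final = simulate_hopfield(W, b, x0=[1.0, -0.5, 0.7])
# At an equilibrium the right-hand side of the ODE vanishes:
residual = np.linalg.norm(-x_final + W @ np.tanh(x_final) + b)
print(residual)  # essentially zero: the trajectory has settled
```

Under this small-gain assumption the final state does not depend on the initial state, since the equilibrium is unique. With larger symmetric weights a network of this type can instead be multistable, converging to different equilibria from different initial states; that is the subject of the multistability analyses listed in the contents below.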

List of Figures, p. ix
Preface, p. xi
Acknowledgments, p. xv
Introduction, p. 1
Introduction, p. 1
Recurrent Neural Networks, p. 1
Convergence of RNNs, p. 2
Outline of the Book, p. 4
Some Notations and Terminologies, p. 7
Energy Functions Method for Convergence Analysis, p. 8
Energy Functions, p. 8
Relationships Between Minimums of Energy Functions and Equilibria of Networks, p. 11
Conclusions, p. 14
Hopfield Recurrent Neural Networks, p. 15
Introduction, p. 15
Equilibria Analysis, p. 17
Complete Stability, p. 20
Global Convergence, p. 21
Global Asymptotic Stability, p. 22
Global Exponential Stability, p. 23
Local Exponential Convergence, p. 27
Example, p. 29
Discussions and Conclusions, p. 31
Cellular Neural Networks, p. 33
Introduction, p. 33
Properties of Output Function, p. 34
Standard CNNs, p. 37
The Standard CNN Model, p. 37
Equilibria Analysis, p. 38
Complete Stability of CNNs, p. 40
Global Exponential Stability and Convergence Rate Estimation, p. 45
CNNs with Constant Delays, p. 50
Model of CNNs with Constant Delays, p. 51
Conditions for GES, p. 52
CNNs with Infinite Delay, p. 54
Model of CNNs with Infinite Delay, p. 54
Preliminaries, p. 55
Relations Between State Stability and Output Stability, p. 56
Global Convergence Analysis, p. 59
Examples, p. 65
Conclusions, p. 66
Recurrent Neural Networks with Unsaturating Piecewise Linear Activation Functions, p. 69
Introduction, p. 69
Preliminaries, p. 71
Multistability Analysis, p. 74
Boundedness and Global Attractivity, p. 74
Complete Stability, p. 81
Simulation Examples, p. 84
Conclusions, p. 89
Lotka-Volterra Recurrent Neural Networks with Delays, p. 91
Introduction, p. 91
Multistability Analysis, p. 92
Preliminaries, p. 92
Boundedness and Global Attractivity, p. 94
Complete Convergence, p. 99
Simulations and Examples, p. 103
Monostability Analysis, p. 104
Preliminaries, p. 104
Positive Lower Boundedness, p. 106
Exponential Convergence, p. 110
Asymptotic Convergence, p. 112
Simulation Results, p. 115
Conclusions, p. 117
Delayed Recurrent Neural Networks with Global Lipschitz Activation Functions, p. 119
Introduction, p. 119
Global Lipschitz Activation Functions, p. 119
Functional Differential Equations, p. 121
Organisations, p. 121
RNNs with Constant Delays, p. 122
Preliminaries, p. 122
Convergence Rate Estimate, p. 124
Global Exponential Stability, p. 128
Discussions and Illustrative Examples, p. 129
Conclusions, p. 130
RNNs with Variable Delays, p. 131
Preliminaries, p. 131
Convergence Analysis, p. 132
Examples and Discussions, p. 138
Conclusions, p. 140
Special RNN Model with Delays, p. 140
Preliminaries, p. 140
Equilibrium Point and Convergence Analysis, p. 142
Examples and Discussions, p. 148
Conclusions, p. 150
Absolute Stability and Absolute Periodicity, p. 150
Preliminaries, p. 151
Main Results, p. 154
Simulation Study on Absolute Periodicity, p. 159
Conclusions, p. 162
Bidirectional Associative Memory RNNs with Delays, p. 163
Equilibrium Points and Stability Analysis, p. 163
Stable Periodic Trajectories, p. 168
Discussions and Examples, p. 169
Other Models of Continuous Time Recurrent Neural Networks, p. 171
RNNs with Variable Inputs, p. 171
Introduction, p. 171
Preliminaries, p. 173
Convergence Analysis, p. 174
Examples, p. 181
RNN Model for Extracting Eigenvectors, p. 183
Introduction, p. 183
Equilibrium Points Analysis, p. 184
Representation of Solutions, p. 185
Convergence Analysis, p. 187
Examples and Simulation Results, p. 190
Discrete Recurrent Neural Networks, p. 195
Introduction, p. 195
Discrete RNNs with Unsaturating Piecewise Linear Activation Functions, p. 196
Preliminaries, p. 197
Boundedness and Global Attractivity, p. 199
Complete Convergence, p. 201
Simulation Examples, p. 205
Discrete RNN Model for Winner Take All Problem, p. 208
Neural Network Model, p. 209
Dynamic Properties, p. 210
Network Response Time, p. 214
Simulations, p. 215
Table of Contents provided by Ingram. All Rights Reserved.

ISBN: 9781402076947
ISBN-10: 1402076940
Series: Network Theory and Applications
Audience: Professional
Format: Hardcover
Language: English
Number Of Pages: 233
Published: 30th November 2003
Publisher: Springer-Verlag New York Inc.
Country of Publication: US
Dimensions (cm): 23.4 x 15.5 x 1.91
Weight (kg): 0.55