Modeling Brain Function

The World of Attractor Neural Networks

Daniel J. Amit



This book explores the study of the principles and mechanisms underlying brain function, one of the most exciting and potentially rewarding areas of scientific research. It introduces and explains the techniques brought from physics to the study of neural networks and the insights they have stimulated. Substantial progress in understanding memory, the learning process, and self-organization, made by studying the properties of models of neural networks, has revealed important parallels between the properties of statistical, nonlinear cooperative systems in physics and those of neural networks. The author presents a coherent, clear, and nontechnical view of all the basic ideas and results; more technical aspects are confined to special sections and appendices in each chapter.
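The kind of model the book analyzes can be illustrated with a toy sketch: binary two-state neurons, couplings built by a Hebbian storage prescription (both topics appear in the table of contents below), and noiseless asynchronous dynamics that relax a corrupted cue into a stored memory, the "attractor". Everything here is a minimal illustration under standard Hopfield-model assumptions; none of the names or parameters come from the book itself.

```python
# Toy attractor neural network: N binary (+1/-1) neurons, Hebbian couplings,
# and noiseless asynchronous updates that pull a noisy probe into the memory.
import random

def hebb_couplings(patterns):
    """Hebbian storage prescription: J[i][j] = (1/N) * sum_mu xi_i * xi_j, J[i][i] = 0."""
    n = len(patterns[0])
    J = [[0.0] * n for _ in range(n)]
    for xi in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    J[i][j] += xi[i] * xi[j] / n
    return J

def recall(J, state, sweeps=10):
    """Asynchronous dynamics: align each neuron with the sign of its local field."""
    n = len(state)
    s = list(state)
    for _ in range(sweeps):
        for i in random.sample(range(n), n):  # update neurons in random order
            field = sum(J[i][j] * s[j] for j in range(n))
            s[i] = 1 if field >= 0 else -1
    return s

random.seed(0)
n = 60
pattern = [random.choice([-1, 1]) for _ in range(n)]
J = hebb_couplings([pattern])
# Corrupt roughly 15% of the bits, then let the dynamics retrieve the memory.
probe = [-x if random.random() < 0.15 else x for x in pattern]
retrieved = recall(J, probe)
overlap = sum(a * b for a, b in zip(retrieved, pattern)) / n
print(overlap)  # 1.0: perfect retrieval, since one pattern is far below capacity
```

With a single stored pattern the local field of every neuron points toward the memorized state, so one sweep already restores the pattern exactly; the book's interest begins where many patterns compete, noise is present, and retrieval becomes a statistical-mechanics question.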

"...of interest to those following the neural net field...takes off from discoveries that link areas of physics with the emerging neural network paradigm." Intelligence Monthly "...regard this book as an opening of a discussion--undoubtedly a very qualified one." Journal of Mathematical Psychology

Table of Contents

Preface p. xiii
Introduction p. 1
Philosophy and Methodology p. 1
Reduction to physics and physics modeling analogues p. 1
Methods for mind and matter p. 3
Some methodological questions p. 5
Neurophysiological Background p. 9
Building blocks for neural networks p. 9
Dynamics of neurons and synapses p. 12
More complicated building blocks p. 15
From biology to information processing p. 17
Modeling Simplified Neurophysiological Information p. 18
Neuron as perceptron and formal neuron p. 18
Digression on formal neurons and perceptrons p. 20
Beyond the basic perceptron p. 25
Building blocks for attractor neural networks (ANN) p. 27
The Network and the World p. 31
Neural states, network states and state space p. 31
Digression on the relation between measures p. 33
Representations on network states p. 35
Thinking about output mechanism p. 38
Spontaneous Computation vs. Cognitive Processing p. 44
Input systems, transducers, transformers p. 44
ANN's as computing elements -- a position p. 45
ANN's and computation of mental representations p. 48
Bibliography p. 53
The Basic Attractor Neural Network p. 58
Networks of Analog, Discrete, Noisy Neurons p. 58
Analog neurons, spike rates, two-state neural models p. 58
Binary representation of single neuron activity p. 63
Noisy dynamics of discrete two-state neurons p. 65
Dynamical Evolution of Network States p. 68
Network dynamics of discrete neurons p. 68
Synchronous dynamics p. 70
Asynchronous dynamics p. 72
Sample trajectories and lessons about dynamics p. 74
Types of trajectories and possible interpretation - a summary p. 79
On Attractors p. 81
The landscape metaphor p. 81
Perception, recognition and recall p. 84
Perception errors due to spurious states - possible role of noise p. 85
Psychiatric speculations and images p. 87
The role of noise and simulated annealing p. 89
Frustration and diversity of attractors p. 91
Bibliography p. 95
General Ideas Concerning Dynamics p. 97
The Stochastic Process, Ergodicity and Beyond p. 97
Stochastic equation and apparent ergodicity p. 97
Two ways of evading ergodicity p. 101
Cooperativity as an Emergent Property in Magnetic Analog p. 105
Ising model for a magnet - spin, field and interaction p. 105
Dynamics and equilibrium properties p. 108
Noiseless, short range ferromagnet p. 112
Fully connected Ising model: real non-ergodicity p. 119
From Dynamics to Landscapes - The Free Energy p. 125
Energy as Lyapunov function for noiseless dynamics p. 125
Parametrized attractor distributions with noise p. 126
Free-energy landscapes - a noisy Lyapunov function p. 127
Free-energy minima, non-ergodicity, order-parameters p. 129
Free-Energy of Fully Connected Ising Model p. 131
From minimization equation to the free-energy p. 131
The analytic way to the free-energy p. 133
Attractors at metastable states p. 140
Synaptic Symmetry and Landscapes p. 141
Noiseless asynchronous dynamics - energy p. 141
Detailed balance for noisy asynchronous dynamics p. 142
Noiseless synchronous dynamics - Lyapunov function p. 143
Detailed balance for noisy synchronous dynamics p. 145
Appendix: Technical Details for Stochastic Equations p. 146
The maximal eigen-value and the associated vector p. 146
Differential equation for mean magnetization p. 147
The minimization of the dynamical free-energy p. 150
Legendre transform for the free-energy p. 152
Bibliography p. 153
Symmetric Neural Networks at Low Memory Loading p. 155
Motivations and List of Results p. 155
Simplifying assumptions and specific questions p. 155
Specific answers for low loading of random memories p. 158
Properties of the noiseless network p. 162
Properties of the network in the presence of fast noise p. 166
Explicit Construction of Synaptic Efficacies p. 169
Choice of memorized patterns p. 169
Storage prescription - "Hebb's rule" p. 170
A decorrelating (but nonlocal) storage prescription p. 172
Stability Considerations at Low Storage p. 174
Signal to noise analysis - memories, spurious states p. 174
Basins of attraction and retrieval times p. 178
Neurophysiological interpretation p. 180
Mean Field Approach to Attractors p. 181
Self-consistency and equations for attractors p. 181
Self-averaging and the final equations p. 187
Free-energy, extrema, stability p. 189
Mean-field and free-energy - synchronous dynamics p. 191
Retrieval States, Spurious States - Noiseless p. 192
Perfect retrieval of memorized patterns p. 192
Noiseless, symmetric spurious memories p. 194
Non-symmetric spurious states p. 198
Are spurious states a free lunch? p. 199
Role of Noise at Low Loading p. 200
Ergodicity at high noise levels - asynchronous p. 200
Just below the critical noise level p. 201
Positive role of noise and retrieval with no fixed points p. 206
Appendix: Technical Details for Low Storage p. 208
Free-energy at finite p - asynchronous p. 208
Free-energy and solutions - synchronous dynamics p. 209
Bound on magnitude of overlaps p. 211
Asymmetric spurious solution p. 212
Bibliography p. 213
Storage and Retrieval of Temporal Sequences p. 215
Motivations: Introspective, Biological, Philosophical p. 215
The introspective motivation p. 215
The biological motivation p. 216
Philosophical motivations p. 218
Storing and Retrieving Temporal Sequences p. 221
Functional asymmetry p. 221
Early ideas for instant temporal sequences p. 221
Temporal Sequences by Delayed Synapses p. 226
A simple generalization and its motivation p. 226
Dynamics with fast and slow synapses p. 229
Simulation examples of sequence recall p. 231
Adiabatically varying energy landscapes p. 235
Bi-phasic oscillations and CPG's p. 238
Tentative Steps into Abstract Computation p. 239
The attempt to reintroduce structured operations p. 239
ANN counting chimes p. 241
Counting network - an exercise in connectionist programming p. 241
The network p. 243
Its dynamics p. 245
Simulations p. 248
Reflections on associated cognitive psychology p. 251
Sequences Without Synaptic Delays p. 253
Basic oscillator - origin of cognitive time scale p. 253
Behavior in the absence of noise p. 255
The role of noise p. 256
Synaptic structure and underlying dynamics p. 259
Network storing sequence with several patterns p. 262
Appendix: Elaborate Temporal Sequences p. 262
Temporal sequences by time averaged synaptic inputs p. 262
Temporal sequences without errors p. 266
Bibliography p. 267
Storage Capacity of ANN's p. 271
Motivation and general considerations p. 271
Different measures of storage capacity p. 271
Storage capacity of human brains p. 273
Intrinsic interest in high storage p. 275
List of results p. 275
Statistical Estimates of Storage p. 278
Statistical signal to noise analysis p. 278
Absolute informational bounds on storage capacity p. 283
Coupling (synaptic efficacies) for optimal storage p. 285
Theory Near Memory Saturation p. 289
Mean-field equations with replica symmetry p. 289
Retrieval in the absence of fast noise p. 294
Analysis of the T = 0 equations p. 299
Memory Saturation with Noise and Fields p. 304
A tour in the T-α phase diagram p. 304
Effect of external fields - thresholds and PSP's p. 308
Fields coupled to several patterns p. 311
Some technical details related to phase diagrams p. 312
Balance Sheet for Standard ANN p. 315
Limiting framework and analytic consequences p. 315
Finite-size effects and basins of attraction: simulations p. 318
Beyond the Memory Blackout Catastrophe p. 324
Bounded synapses and palimpsest memory p. 324
The 7 ± 2 rule and palimpsest memories p. 328
Appendix: Replica Symmetric Theory p. 330
The replica method p. 330
The free-energy and the mean-field equations p. 332
Marginal storage and palimpsests p. 339
Bibliography p. 342
Robustness - Getting Closer to Biology p. 345
Synaptic Noise and Synaptic Dilution p. 345
Two meanings of robustness p. 345
Noise in synaptic efficacies p. 347
Random symmetric dilution of synapses p. 352
Non-Linear Synapses and Limited Analog Depth p. 355
Place and role of non-linear synapses p. 355
Properties of networks with clipped synapses p. 357
Non-linear storage and the noisy equivalent p. 359
Clipping at low storage level p. 362
Random vs. Functional Synaptic Asymmetry p. 363
Random asymmetry and performance quality p. 363
Asymmetry, noise and spin-glass suppression p. 366
Neuronal specificity of synapses - Dale's law p. 368
Extreme asymmetric dilution p. 370
Functional asymmetry p. 375
Effective Cortical Cycle Times p. 375
Slow bursts and relative refractory period p. 375
Neuronal memory and expanded scenario p. 377
Simplified scenario for relative refractory period p. 378
Appendix: Technical Details p. 380
Digression - the mean-field equations p. 380
Dilution requirement p. 384
Bibliography p. 385
Memory Data Structures p. 387
Biological and Computational Motivation p. 387
Low mean activity level and background-foreground asymmetry p. 387
Hierarchies for biology and for computation p. 388
Local Treatment of Low Activity Patterns p. 389
Demise of naive standard model p. 389
Modified ANN and a plague of spurious states p. 391
Constrained dynamics - monitoring thresholds p. 396
Properties of the constrained biased network p. 398
Quantity of information in an ANN with low activity p. 403
More effective storage of low activity (sparse) patterns p. 405
Hierarchical Data Structures in a Single Network p. 409
Early proposals p. 409
Explicit construction of hierarchy in a single ANN p. 410
Properties of hierarchy in a single network p. 412
Prosopagnosia and learning class properties p. 412
Multi-ancestry with many generations p. 414
Hierarchies in Multi-ANN: Generalization First p. 418
Organization of the data and the networks p. 418
Hierarchical dynamics p. 420
Hierarchy for image vector quantization p. 422
Appendix: Technical Details for Biased Patterns p. 423
Noise estimates for biased patterns p. 423
Mean-field equations in noiseless biased network p. 424
Retrieval entropy in biased network p. 424
Mean-square noise in low activity network p. 425
Bibliography p. 426
Learning p. 428
The Context of Learning p. 428
General Comments and a limited scope p. 428
Modes, time scales and other constraints p. 430
The need for learning modes p. 432
Results for learning in learning modes p. 433
Learning in Modes p. 434
Perceptron learning p. 434
ANN learning by perceptron algorithm p. 438
Local learning of the Kohonen synaptic matrix p. 441
Natural Learning - Double Dynamics p. 443
General features p. 443
Learning in a network of physiological neurons p. 444
Learning to form associations p. 447
Memory generation and maintenance p. 450
Technical Details in Learning Models p. 455
Local Iterative Construction of Projector Matrix p. 455
The free energy and the correlation function p. 458
Bibliography p. 458
Hardware Implementations of Neural Networks p. 461
Situating Artificial Neural Networks p. 461
The role of hardware implementations p. 461
Motivations for different designs p. 462
The VLSI Neural Network p. 465
High density high speed integrated chip p. 465
Smaller, more flexible electronic ANN's p. 469
The Electro-Optical ANN p. 474
Shift Register (CCD) Implementation p. 477
Bibliography p. 479
Glossary p. 481
Index p. 487
Table of Contents provided by Syndetics. All Rights Reserved.

ISBN: 9780521421249
ISBN-10: 0521421241
Audience: Professional
Format: Paperback
Language: English
Number Of Pages: 524
Published: 21st December 1992
Publisher: Cambridge University Press
Country of Publication: GB
Dimensions (cm): 22.86 x 15.16 x 2.77
Weight (kg): 0.71