Neural Engineering

Computation, Representation, and Dynamics in Neurobiological Systems

Chris Eliasmith



Paperback: $105.75 (RRP $140.99, 25% off)
This title is not in stock at the Booktopia Warehouse and needs to be ordered from our supplier.

For years, researchers have used the theoretical tools of engineering to understand neural systems, but much of this work has been conducted in relative isolation. In "Neural Engineering," Chris Eliasmith and Charles Anderson provide a synthesis of the disparate approaches current in computational neuroscience, incorporating ideas from neural coding, neural computation, physiology, communications theory, control theory, dynamics, and probability theory. This synthesis, they argue, enables novel theoretical and practical insights into the functioning of neural systems. Such insights are pertinent to experimental and computational neuroscientists and to engineers, physicists, and computer scientists interested in how their quantitative tools relate to the brain. The authors present three principles of neural engineering based on the representation of signals by neural ensembles, transformations of these representations through neuronal coupling weights, and the integration of control theory and neural dynamics. Through detailed examples and in-depth discussion, they make the case that these guiding principles constitute a useful theory for generating large-scale models of neurobiological function. A software package written in MATLAB for use with their methodology, along with examples, course notes, exercises, documentation, and other material, is available on the Web.
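To give a concrete sense of the first principle (representation of signals by neural ensembles), here is a minimal, illustrative sketch in Python/NumPy rather than the authors' MATLAB package. It encodes a scalar in a population of rate-mode LIF neurons and solves a regularized least-squares problem for the linear decoders; the gains, biases, encoders, and regularization level are assumptions chosen for illustration, not values from the book.

    import numpy as np

    # Population of rate-mode LIF neurons representing a scalar x in [-1, 1].
    # Gains, biases, and encoders are drawn at random (illustrative choices).
    rng = np.random.default_rng(0)
    n_neurons = 50
    gains = rng.uniform(1.0, 5.0, n_neurons)
    biases = rng.uniform(-2.0, 2.0, n_neurons)
    encoders = rng.choice([-1.0, 1.0], n_neurons)  # preferred directions

    def lif_rates(x, tau_ref=0.002, tau_rc=0.02):
        """Steady-state LIF firing rates for each input in x (shape: [points])."""
        J = gains * np.outer(x, encoders) + biases  # soma current, [points, neurons]
        rates = np.zeros_like(J)
        active = J > 1.0                            # suprathreshold neurons only
        rates[active] = 1.0 / (tau_ref - tau_rc * np.log(1.0 - 1.0 / J[active]))
        return rates

    # Principle 1 (decoding): find linear decoders d minimizing ||A d - x||^2,
    # regularized as if each rate carried independent noise of s.d. sigma.
    x = np.linspace(-1.0, 1.0, 200)
    A = lif_rates(x)                                # activity matrix
    sigma = 0.1 * A.max()                           # assumed noise level
    G = A.T @ A + len(x) * sigma**2 * np.eye(n_neurons)
    d = np.linalg.solve(G, A.T @ x)

    x_hat = A @ d                                   # population estimate of x
    print("RMS decoding error:", np.sqrt(np.mean((x - x_hat) ** 2)))

The second principle follows the same recipe: solve for decoders of a function f(x) rather than x itself, and fold those decoders into the coupling weights between populations.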

"From principle component analysis to Kalman filters, information theory to attractor dynamics, this book is a brilliant introduction to the mathematical and engineering methods used to analyse neural function." - Leif Finkel, Professor, Neuroengineering Research Laboratories, University of Pennsylvania

Preface (p. xiii)
Using this book as a course text (p. xvii)
Acknowledgments (p. xix)
Of neurons and engineers (p. 1)
Explaining neural systems (p. 3)
Neural representation (p. 5)
The single neuron (p. 9)
Beyond the single neuron (p. 11)
Neural transformation (p. 13)
Three principles of neural engineering (p. 15)
Principle 1 (p. 16)
Principle 2 (p. 17)
Principle 3 (p. 18)
Addendum (p. 18)
Methodology (p. 19)
System description (p. 19)
Design specification (p. 21)
Implementation (p. 21)
Discussion (p. 22)
A possible theory of neurobiological systems (p. 23)

Representation

Representation in populations of neurons (p. 29)
Representing scalar magnitudes (p. 30)
Engineered representation (p. 30)
Biological representation (p. 33)
Noise and precision (p. 40)
Noisy neurons (p. 40)
Biological representation and noise (p. 42)
An example: Horizontal eye position (p. 44)
System description (p. 44)
Design specification (p. 46)
Implementation (p. 47)
Discussion (p. 48)
Representing vectors (p. 49)
An example: Arm movements (p. 52)
System description (p. 53)
Design specification (p. 54)
Implementation (p. 55)
Discussion (p. 55)
An example: Semicircular canals (p. 57)
System description (p. 57)
Implementation (p. 58)
Summary (p. 59)
Extending population representation (p. 61)
A representational hierarchy (p. 61)
Function representation (p. 63)
Function spaces and vector spaces (p. 69)
An example: Working memory (p. 72)
System description (p. 73)
Design specification (p. 74)
Implementation (p. 77)
Discussion (p. 78)
Summary (p. 79)
Temporal representation in spiking neurons (p. 81)
The leaky integrate-and-fire (LIF) neuron (p. 81)
Introduction (p. 81)
Characterizing the LIF neuron (p. 83)
Strengths and weaknesses of the LIF neuron model (p. 88)
Temporal codes in neurons (p. 89)
Decoding neural spikes (p. 92)
Introduction (p. 92)
Neuron pairs (p. 94)
Representing time-dependent signals with spikes (p. 96)
Discussion (p. 103)
Information transmission in LIF neurons (p. 105)
Finding optimal decoders in LIF neurons (p. 105)
Information transmission (p. 109)
Discussion (p. 114)
More complex single neuron models (p. 115)
Adapting LIF neuron (p. 116)
θ-neuron (p. 118)
Adapting, conductance-based neuron (p. 123)
Discussion (p. 126)
Summary (p. 127)
Population-temporal representation (p. 129)
Putting time and populations together again (p. 129)
Noise and precision: Dealing with distortions (p. 132)
An example: Eye position revisited (p. 136)
Implementation (p. 136)
Discussion (p. 137)
Summary (p. 139)

Transformation

Feed-forward transformations (p. 143)
Linear transformations of scalars (p. 143)
A communication channel (p. 143)
Adding two variables (p. 148)
Linear transformations of vectors (p. 151)
Nonlinear transformations (p. 153)
Multiplying two variables (p. 154)
Negative weights and neural inhibition (p. 160)
Analysis (p. 161)
Discussion (p. 166)
An example: The vestibular system (p. 168)
System description (p. 169)
Design specification (p. 174)
Implementation (p. 175)
Discussion (p. 180)
Summary (p. 182)
Analyzing representation and transformation (p. 185)
Basis vectors and basis functions (p. 185)
Decomposing Γ (p. 192)
Determining possible transformations (p. 196)
Linear tuning curves (p. 200)
Gaussian tuning curves (p. 204)
Quantifying representation (p. 206)
Representational capacity (p. 206)
Useful representation (p. 208)
The importance of diversity (p. 210)
Summary (p. 216)
Dynamic transformations (p. 219)
Control theory and neural models (p. 221)
Introduction to control theory (p. 221)
A control theoretic description of neural populations (p. 222)
Revisiting levels of analysis (p. 225)
Three principles of neural engineering quantified (p. 230)
An example: Controlling eye position (p. 232)
Implementation (p. 233)
Discussion (p. 240)
An example: Working memory (p. 244)
Introduction (p. 244)
Implementation (p. 244)
Dynamics of the vector representation (p. 244)
Simulation results (p. 245)
Discussion (p. 248)
Attractor networks (p. 250)
Introduction (p. 250)
Generalizing representation (p. 254)
Generalizing dynamics (p. 256)
Discussion (p. 258)
An example: Lamprey locomotion (p. 260)
Introduction (p. 260)
System description (p. 261)
Design specification (p. 264)
Implementation (p. 265)
Discussion (p. 271)
Summary (p. 273)
Statistical inference and learning (p. 275)
Statistical inference and neurobiological systems (p. 275)
An example: Interpreting ambiguous input (p. 281)
An example: Parameter estimation (p. 283)
An example: Kalman filtering (p. 287)
Two versions of the Kalman filter (p. 288)
Discussion (p. 291)
Learning (p. 293)
Learning a communication channel (p. 294)
Learning from learning (p. 298)
Summary (p. 300)
Chapter 2 derivations (p. 301)
Determining optimal decoding weights (p. 301)
Chapter 4 derivations (p. 303)
Opponency and linearity (p. 303)
Leaky integrate-and-fire model derivations (p. 303)
Optimal filter analysis with a sliding window (p. 305)
Information transmission of linear estimators for nonlinear systems (p. 309)
Chapter 5 derivations (p. 313)
Residual fluctuations due to spike trains (p. 313)
Chapter 6 derivations (p. 317)
Coincidence detection (p. 317)
Chapter 7 derivations (p. 319)
Practical considerations for finding linear decoders for x and f(x) (p. 319)
Finding the useful representational space (p. 323)
Chapter 8 derivations (p. 327)
Synaptic dynamics dominate neural dynamics (p. 327)
Derivations for the lamprey model (p. 327)
Determining muscle tension (p. 327)
Error (p. 329)
Oscillator dynamics (p. 331)
Coordinate changes with matrices (p. 332)
Projection matrices (p. 333)
References (p. 335)
Index (p. 351)
Table of Contents provided by Ingram. All Rights Reserved.

ISBN: 9780262550604
ISBN-10: 0262550601
Series: Computational Neuroscience Series
Audience: Professional
For Ages: 18+ years old
Format: Paperback
Language: English
Number Of Pages: 380
Published: 20th August 2004
Publisher: MIT Press Ltd
Country of Publication: US
Dimensions (cm): 22.9 x 17.8 x 1.6
Weight (kg): 0.55