
Sequential Monte Carlo Methods in Practice

Statistics for Engineering and Information Science

By: Arnaud Doucet (Editor), A. Smith (Foreword by), Nando de Freitas (Editor), Neil Gordon (Editor)

Hardcover Published: 21st June 2001
ISBN: 9780387951461
Number Of Pages: 582

Hardcover

RRP $616.99
$427.35 (31% off)
or 4 easy payments of $106.84
Ships in 10 to 15 business days

Other Available Editions

  • Paperback (published 1st December 2010): $227.94

Monte Carlo methods are revolutionizing the on-line analysis of data in fields as diverse as financial modeling, target tracking and computer vision. These methods, appearing under the names of bootstrap filters, condensation, optimal Monte Carlo filters, particle filters and survival of the fittest, have made it possible to solve numerically many complex, non-standard problems that were previously intractable. This book presents the first comprehensive treatment of these techniques, including convergence results and applications to tracking, guidance, automated target recognition, aircraft navigation, robot navigation, econometrics, financial modeling, neural networks, optimal control, optimal filtering, communications, reinforcement learning, signal enhancement, model averaging and selection, computer vision, semiconductor design, population biology, dynamic Bayesian networks, and time series analysis. It will be of great value to students, researchers and practitioners who have some basic knowledge of probability.

Arnaud Doucet received the Ph.D. degree from the University of Paris-XI Orsay in 1997. From 1998 to 2000, he conducted research at the Signal Processing Group of Cambridge University, UK. He is currently an assistant professor in the Department of Electrical Engineering at Melbourne University, Australia. His research interests include Bayesian statistics, dynamic models and Monte Carlo methods.

Nando de Freitas obtained a Ph.D. degree in information engineering from Cambridge University in 1999. He is presently a research associate with the artificial intelligence group of the University of California at Berkeley. His main research interests are in Bayesian statistics and the application of on-line and batch Monte Carlo methods to machine learning.

Neil Gordon obtained a Ph.D. in statistics from Imperial College, University of London in 1993. He is with the Pattern and Information Processing group at the Defence Evaluation and Research Agency in the United Kingdom. His research interests are in time series, statistical data analysis, and pattern recognition, with a particular emphasis on target tracking and missile guidance.
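To give a flavour of the algorithms the book treats, the following is a minimal, self-contained sketch of a bootstrap particle filter of the kind named in the description above. It is not taken from the book: the one-dimensional nonlinear state-space model, the parameter values, and all function names are illustrative assumptions.

```python
# Minimal sketch of a bootstrap particle filter (sequential importance sampling
# with resampling) on a hypothetical 1-D nonlinear Gaussian state-space model.
# The model and all names here are illustrative assumptions, not the book's code.
import numpy as np

rng = np.random.default_rng(0)


def simulate(T=50, sx=1.0, sy=1.0):
    """Generate hidden states and noisy observations from a toy nonlinear model."""
    x = np.zeros(T)
    x[0] = rng.normal()
    for t in range(1, T):
        x[t] = 0.5 * x[t - 1] + 25 * x[t - 1] / (1 + x[t - 1] ** 2) + rng.normal(0, sx)
    y = x ** 2 / 20 + rng.normal(0, sy, size=T)
    return x, y


def bootstrap_filter(y, n_particles=500, sx=1.0, sy=1.0):
    """Estimate the filtering means E[x_t | y_1:t] with the bootstrap filter."""
    T = len(y)
    particles = rng.normal(size=n_particles)  # initial particles drawn from the prior
    means = np.zeros(T)
    for t in range(T):
        if t > 0:
            # Propagate each particle through the state transition (the proposal).
            particles = (0.5 * particles
                         + 25 * particles / (1 + particles ** 2)
                         + rng.normal(0, sx, size=n_particles))
        # Weight particles by the observation likelihood p(y_t | x_t).
        log_w = -0.5 * ((y[t] - particles ** 2 / 20) / sy) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        means[t] = np.sum(w * particles)  # weighted filtering-mean estimate
        # Resample ("survival of the fittest") to combat weight degeneracy.
        particles = rng.choice(particles, size=n_particles, p=w)
    return means


if __name__ == "__main__":
    x, y = simulate()
    est = bootstrap_filter(y)
    print("RMSE of the filtering mean:", np.sqrt(np.mean((est - x) ** 2)))
```

The resampling step is what distinguishes the bootstrap filter from plain sequential importance sampling: it discards low-weight particles and duplicates high-weight ones to combat weight degeneracy, a problem that many of the chapters listed in the table of contents below set out to mitigate further.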

From the reviews:

JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION

"...a remarkable, successful effort at making these ideas available to statisticians. It gives an overview, presents available theory, gives a splendid development of various bells and whistles important in practical implementation, and finally gives a large number of detailed examples and case studies...The authors and editors have been careful to write in a unified, readable way...I find it remarkable that the editors and authors have combined to produce an accessible bible that will be studied and used for years to come."

"Usually, very few volumes edited from papers contributed by many different authors result in books which can serve as either good textbooks or as useful reference. However, in the case of this book, it is enough to read the foreword by Adrian Smith to realize that this particular volume is quite different. ... it is a good reference book for SMC." (Mohan Delampady, Sankhya: Indian Journal of Statistics, Vol. 64 (A), 2002)

"In this book the authors present sequential Monte Carlo (SMC) methods ... . Over the last few years several closely related algorithms have appeared under the names `boostrap filters', `particle filters', `Monte Carlo filters', and `survival of the fittest'. The book under review brings together many of these algorithms and presents theoretical developments ... . This book will be of great value to advanced students, researchers, and practitioners who want to learn about sequential Monte Carlo methods for the computational problems of Bayesian Statistics." (E. Novak, Metrika, May, 2003)

"This book provides a very good overview of the sequential Monte Carlo methods and contains many ideas on further research on methodologies and newer areas of application. ... It will be certainly a valuable reference book for students and researchers working in the area of on-line data analysis. ... the techniques discussed in this book are of great relevance to practitioners dealing with real time data." (Pradipta Sarkar, Technometrics, Vol. 45 (1), 2003)

Foreword p. v
Acknowledgments p. vii
Contributors p. xxi
Introduction p. 1
An Introduction to Sequential Monte Carlo Methods p. 3
Motivation p. 3
Problem statement p. 5
Monte Carlo methods p. 6
Perfect Monte Carlo sampling p. 7
Importance sampling p. 8
The Bootstrap filter p. 10
Discussion p. 13
Theoretical Issues p. 15
Particle Filters - A Theoretical Perspective p. 17
Introduction p. 17
Notation and terminology p. 17
Markov chains and transition kernels p. 18
The filtering problem p. 19
Convergence of measure-valued random variables p. 20
Convergence theorems p. 21
The fixed observation case p. 21
The random observation case p. 24
Examples of particle filters p. 25
Description of the particle filters p. 25
Branching mechanisms p. 28
Convergence of the algorithm p. 31
Discussion p. 33
Appendix p. 35
Conditional probabilities and conditional expectations p. 35
The recurrence formula for the conditional distribution of the signal p. 38
Interacting Particle Filtering With Discrete Observations p. 43
Introduction p. 43
Nonlinear filtering: general facts p. 46
An interacting particle system under Case A p. 48
Subcase A1 p. 48
Subcase A2 p. 55
An interacting particle system under Case B p. 60
Subcase B1 p. 60
Subcase B2 p. 67
Discretely observed stochastic differential equations p. 71
Case A p. 72
Case B p. 73
Strategies for Improving Sequential Monte Carlo Methods p. 77
Sequential Monte Carlo Methods for Optimal Filtering p. 79
Introduction p. 79
Bayesian filtering and sequential estimation p. 79
Dynamic modelling and Bayesian filtering p. 79
Alternative dynamic models p. 80
Sequential Monte Carlo Methods p. 82
Methodology p. 82
A generic algorithm p. 85
Convergence results p. 86
Application to digital communications p. 88
Model specification and estimation objectives p. 89
SMC applied to demodulation p. 91
Simulations p. 93
Deterministic and Stochastic Particle Filters in State-Space Models p. 97
Introduction p. 97
General issues p. 98
Model and exact filter p. 98
Particle filters p. 99
Gaussian quadrature p. 100
Quadrature filters p. 101
Numerical error p. 102
A small illustrative example p. 104
Case studies from ecology p. 104
Problem area and models p. 104
Quadrature filters in practice p. 107
Numerical experiments p. 110
Concluding remarks p. 112
Appendix: Derivation of numerical errors p. 114
RESAMPLE-MOVE Filtering with Cross-Model Jumps p. 117
Introduction p. 117
Problem statement p. 118
The RESAMPLE-MOVE algorithm p. 119
Comments p. 124
Central limit theorem p. 125
Dealing with model uncertainty p. 126
Illustrative application p. 129
Applying RESAMPLE-MOVE p. 131
Simulation experiment p. 134
Uncertainty about the type of target p. 135
Conclusions p. 138
Improvement Strategies for Monte Carlo Particle Filters p. 139
Introduction p. 139
General sequential importance sampling p. 140
Markov chain moves p. 143
The use of bridging densities with MCMC moves p. 144
Simulation example: TVAR model in noise p. 145
Particle filter algorithms for TVAR models p. 146
Bootstrap (SIR) filter p. 148
Auxiliary particle filter (APF) p. 149
MCMC resampling p. 150
Simulation results p. 152
Summary p. 157
Acknowledgements p. 158
Approximating and Maximising the Likelihood for a General State-Space Model p. 159
Introduction p. 159
Bayesian methods p. 159
Pointwise Monte Carlo approximation of the likelihood p. 161
Examples p. 161
Approximation of the likelihood function based on filter samples p. 164
Approximations based on smoother samples p. 166
Approximation of the likelihood function p. 167
Stochastic EM-algorithm p. 167
Comparison of the methods p. 168
AR(1) process p. 168
Nonlinear example, 3 parameters p. 171
Nonlinear model, 5 parameters p. 173
Discussion p. 173
Recursive estimation p. 173
Monte Carlo Smoothing and Self-Organising State-Space Model p. 177
Introduction p. 177
General state-space model and state estimation p. 178
The model and the state estimation problem p. 178
Non-Gaussian filter and smoother p. 179
Monte Carlo filter and smoother p. 180
Approximation of non-Gaussian distributions p. 180
Monte Carlo filtering p. 181
Derivation of the Monte Carlo filter p. 182
Monte Carlo smoothing p. 183
Non-Gaussian smoothing for the stochastic volatility model p. 186
Nonlinear Smoothing p. 188
Self-organising state-space models p. 189
Likelihood of the model and parameter estimation p. 189
Self-organising state-space model p. 191
Examples p. 192
Self-organising smoothing for the stochastic volatility model p. 192
Time series with trend and stochastic volatility p. 194
Conclusion p. 195
Combined Parameter and State Estimation in Simulation-Based Filtering p. 197
Introduction and historical perspective p. 197
General framework p. 199
Dynamic model and analysis perspective p. 199
Filtering for states p. 200
Filtering for states and parameters p. 202
The treatment of model parameters p. 202
Artificial evolution of parameters p. 202
Kernel smoothing of parameters p. 203
Reinterpreting artificial parameter evolutions p. 204
A general algorithm p. 206
Factor stochastic volatility modelling p. 208
Discussion and future directions p. 217
A Theoretical Framework for Sequential Importance Sampling with Resampling p. 225
Introduction p. 225
Sequential importance sampling principle p. 227
Properly weighted sample p. 227
Sequential build-up p. 228
Operations for enhancing SIS p. 229
Reweighting, resampling and reallocation p. 230
Rejection control and partial rejection control p. 231
Marginalisation p. 234
Monte Carlo filter for state-space models p. 234
The general state-space model p. 235
Conditional dynamic linear model and the mixture Kalman filter p. 236
Some examples p. 237
A simple illustration p. 237
Target tracking with MKF p. 239
Discussion p. 241
Acknowledgements p. 242
Improving Regularised Particle Filters p. 247
Introduction p. 247
Particle filters p. 249
The (classical) interacting particle filter (IPF) p. 250
Regularised particle filters (RPF) p. 251
Progressive correction p. 255
Focus on the correction step p. 256
Principle of progressive correction p. 257
Adaptive choice of the decomposition p. 258
The local rejection regularised particle filter (L2RPF) p. 260
Description of the filter p. 260
Computing the coefficient c_t^(i)(alpha_t) p. 263
Applications to tracking problems p. 264
Range and bearing p. 265
Bearings-only p. 266
Multiple model particle filter (MMPF) p. 269
Auxiliary Variable Based Particle Filters p. 273
Introduction p. 273
Particle filters p. 274
The definition of particle filters p. 274
Sampling the empirical prediction density p. 274
Weaknesses of particle filters p. 276
Auxiliary variable p. 277
The basics p. 277
A generic SIR based auxiliary proposal p. 278
Examples of adaption p. 283
Fixed-lag filtering p. 288
Reduced random sampling p. 289
Basic ideas p. 289
Simple outlier example p. 290
Conclusion p. 292
Acknowledgements p. 293
Improved Particle Filters and Smoothing p. 295
Introduction p. 295
The methods p. 296
The smooth bootstrap p. 296
Adaptive importance sampling p. 300
The kernel sampler of Hurzeler and Kunsch p. 302
Partially smooth bootstrap p. 303
Roughening and sample augmentation p. 305
Application of the methods in particle filtering and smoothing p. 306
Application of smooth bootstrap procedures to a simple control problem p. 308
Description of the problem p. 308
An approach to the continuous-time version of the problem p. 309
An adaptation of Titterington's method p. 310
Probabilistic criterion 1 p. 310
Probabilistic criterion 2: working directly with the cost p. 311
Unknown variances p. 311
Resampling implementation p. 312
Simulation results p. 314
Further work on this problem p. 317
Applications p. 319
Posterior Cramer-Rao Bounds for Sequential Estimation p. 321
Introduction p. 321
Review of the posterior Cramer-Rao bound p. 322
Bounds for sequential estimation p. 323
Estimation model p. 324
Posterior Cramer-Rao bound p. 325
Relative Monte Carlo evaluation p. 327
Example: terrain navigation p. 329
Conclusions p. 338
Statistical Models of Visual Shape and Motion p. 339
Introduction p. 339
Statistical modelling of shape p. 341
Statistical modelling of image observations p. 343
Sampling methods p. 345
Modelling dynamics p. 346
Learning dynamics p. 349
Particle filtering p. 352
Dynamics with discrete states p. 354
Conclusions p. 355
Sequential Monte Carlo Methods for Neural Networks p. 359
Introduction p. 359
Model specification p. 360
MLP models for regression and classification p. 360
Variable dimension RBF models p. 362
Estimation objectives p. 365
General SMC algorithm p. 366
Importance sampling step p. 367
Selection step p. 368
MCMC Step p. 369
Exact step p. 371
On-line classification p. 371
Simple classification example p. 372
An application to fault detection in marine diesel engines p. 373
An application to financial time series p. 375
Conclusions p. 379
Sequential Estimation of Signals under Model Uncertainty p. 381
Introduction p. 381
The problem of parameter estimation under uncertainty p. 383
Sequential updating of the solution p. 384
Sequential algorithm for computing the solution p. 389
A Sequential-Importance-Resampling scheme p. 390
Sequential sampling scheme based on mixtures p. 395
Example p. 397
Conclusions p. 400
Acknowledgment p. 400
Particle Filters for Mobile Robot Localization p. 401
Introduction p. 401
Monte Carlo localization p. 403
Bayes filtering p. 403
Models of robot motion and perception p. 404
Implementation as particle filters p. 405
Robot results p. 408
Comparison to grid-based localization p. 410
MCL with mixture proposal distributions p. 414
The need for better sampling p. 414
An alternative proposal distribution p. 416
The mixture proposal distribution p. 419
Robot results p. 420
Multi-robot MCL p. 423
Basic considerations p. 423
Robot results p. 425
Conclusion p. 426
Self-Organizing Time Series Model p. 429
Introduction p. 429
Generalised state-space model p. 429
Monte Carlo filter p. 430
Self-organizing time series model p. 432
Genetic algorithm filter p. 432
Self-organizing state-space model p. 434
Resampling scheme for filtering p. 435
Selection scheme p. 435
Comparison of performance: simulation study p. 436
Application p. 438
Time-varying frequency wave in small count data p. 438
Self-organizing state-space model for time-varying frequency wave p. 439
Results p. 440
Conclusions p. 444
Sampling in Factored Dynamic Systems p. 445
Introduction p. 445
Structured probabilistic models p. 448
Bayesian networks p. 448
Hybrid networks p. 449
Dynamic Bayesian networks p. 451
Particle filtering for DBNs p. 454
Experimental results p. 457
Conclusions p. 464
In-Situ Ellipsometry Solutions Using Sequential Monte Carlo p. 465
Introduction p. 465
Application background p. 465
State-space model p. 467
Ellipsometry measurement model p. 468
System evolution model p. 471
Particle filter p. 472
Results p. 474
Conclusion p. 475
Acknowledgments p. 477
Manoeuvring Target Tracking Using a Multiple-Model Bootstrap Filter p. 479
Introduction p. 479
Optimal multiple-model solution p. 481
The IMM algorithm p. 483
Multiple model bootstrap filter p. 484
Example p. 486
Target tracking examples p. 488
Target scenarios p. 488
Linear, Gaussian tests p. 488
Polar simulation results p. 492
Conclusions p. 495
Acknowledgments p. 496
Rao-Blackwellised Particle Filtering for Dynamic Bayesian Networks p. 499
Introduction p. 499
RBPF in general p. 500
How do we choose which nodes to sample? p. 503
The RBPF algorithm in detail p. 506
Application: concurrent localisation and map learning for a mobile robot p. 508
Results on a one-dimensional grid p. 511
Results on a two-dimensional grid p. 514
Conclusions and future work p. 515
Particles and Mixtures for Tracking and Guidance p. 517
Introduction p. 517
Guidance as a stochastic control problem p. 518
Information state p. 519
Dynamic programming and the dual effect p. 520
Separability and certainty equivalence p. 521
Sub-optimal strategies p. 522
Derivation of control laws from particles p. 523
Certainty equivalence control p. 523
A scheme based on open-loop feedback control p. 524
Guidance in the presence of intermittent spurious objects and clutter p. 525
Introduction p. 525
Problem formulation p. 525
Simulation example p. 526
Guidance results p. 528
Monte Carlo Techniques for Automated Target Recognition p. 533
Introduction p. 533
The Bayesian posterior p. 535
Inference engines p. 536
Jump-diffusion sampling p. 539
Diffusion Processes p. 540
Jump processes p. 541
Jump-diffusion algorithm p. 544
Sensor models p. 545
Experiments p. 547
Acknowledgments p. 552
Bibliography p. 553
Index p. 577
Table of Contents provided by Syndetics. All Rights Reserved.

ISBN: 9780387951461
ISBN-10: 0387951466
Series: Statistics for Engineering and Information Science
Audience: Professional
Format: Hardcover
Language: English
Number Of Pages: 582
Published: 21st June 2001
Publisher: Springer-Verlag New York Inc.
Country of Publication: US
Dimensions (cm): 24.13 x 16.51 x 3.18
Weight (kg): 0.98
