Hardcover
Published: 21st June 2001
ISBN: 9780387951461
Number Of Pages: 582
Monte Carlo methods are revolutionizing the on-line analysis of data in fields as diverse as financial modeling, target tracking and computer vision. These methods, appearing under the names of bootstrap filters, condensation, optimal Monte Carlo filters, particle filters and survival of the fittest, have made it possible to solve numerically many complex, non-standard problems that were previously intractable. This book presents the first comprehensive treatment of these techniques, including convergence results and applications to tracking, guidance, automated target recognition, aircraft navigation, robot navigation, econometrics, financial modeling, neural networks, optimal control, optimal filtering, communications, reinforcement learning, signal enhancement, model averaging and selection, computer vision, semiconductor design, population biology, dynamic Bayesian networks, and time series analysis. It will be of great value to students, researchers and practitioners who have some basic knowledge of probability.
Arnaud Doucet received the Ph.D. degree from the University of Paris-XI Orsay in 1997. From 1998 to 2000, he conducted research at the Signal Processing Group of Cambridge University, UK. He is currently an assistant professor at the Department of Electrical Engineering of Melbourne University, Australia. His research interests include Bayesian statistics, dynamic models and Monte Carlo methods.
Nando de Freitas obtained a Ph.D. degree in information engineering from Cambridge University in 1999. He is presently a research associate with the artificial intelligence group of the University of California at Berkeley. His main research interests are in Bayesian statistics and the application of on-line and batch Monte Carlo methods to machine learning.
Neil Gordon obtained a Ph.D. in Statistics from Imperial College, University of London in 1993. He is with the Pattern and Information Processing group at the Defence Evaluation and Research Agency in the United Kingdom. His research interests are in time series, statistical data analysis, and pattern recognition with a particular emphasis on target tracking and missile guidance.
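The bootstrap filter named in the description above admits a remarkably compact implementation. The sketch below is illustrative only and not an example from the book: it assumes a hypothetical one-dimensional linear-Gaussian state-space model so the predict/weight/resample cycle is easy to see.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-dimensional state-space model (hypothetical; not from the book):
#   x_t = 0.5 * x_{t-1} + process noise,   y_t = x_t + observation noise
T, N = 50, 1000              # time steps, number of particles
sigma_x, sigma_y = 1.0, 1.0  # noise standard deviations

# Simulate a state trajectory and noisy observations of it
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.5 * x_true[t - 1] + sigma_x * rng.normal()
y = x_true + sigma_y * rng.normal(size=T)

# Bootstrap filter: propagate particles through the prior dynamics,
# weight them by the observation likelihood, then resample.
particles = rng.normal(size=N)
estimates = np.zeros(T)
for t in range(T):
    particles = 0.5 * particles + sigma_x * rng.normal(size=N)  # predict
    logw = -0.5 * ((y[t] - particles) / sigma_y) ** 2           # likelihood weights
    w = np.exp(logw - logw.max())
    w /= w.sum()
    estimates[t] = np.sum(w * particles)                        # posterior mean
    particles = rng.choice(particles, size=N, p=w)              # multinomial resampling

mse = float(np.mean((estimates - x_true) ** 2))
print(mse)  # filtering error, typically well below the observation variance of 1.0
```

This plain multinomial resampling scheme is the original bootstrap filter; the variance-reduction strategies developed in the book's chapters (auxiliary variables, regularisation, MCMC moves, Rao-Blackwellisation) all improve on this basic cycle.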
From the reviews:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
"...a remarkable, successful effort at making these ideas available to statisticians. It gives an overview, presents available theory, gives a splendid development of various bells and whistles important in practical implementation, and finally gives a large number of detailed examples and case studies... The authors and editors have been careful to write in a unified, readable way... I find it remarkable that the editors and authors have combined to produce an accessible bible that will be studied and used for years to come."
"Usually, very few volumes edited from papers contributed by many different authors result in books which can serve as either good textbooks or as useful reference. However, in the case of this book, it is enough to read the foreword by Adrian Smith to realize that this particular volume is quite different. ... it is a good reference book for SMC." (Mohan Delampady, Sankhya: Indian Journal of Statistics, Vol. 64 (A), 2002)
"In this book the authors present sequential Monte Carlo (SMC) methods ... . Over the last few years several closely related algorithms have appeared under the names 'bootstrap filters', 'particle filters', 'Monte Carlo filters', and 'survival of the fittest'. The book under review brings together many of these algorithms and presents theoretical developments ... . This book will be of great value to advanced students, researchers, and practitioners who want to learn about sequential Monte Carlo methods for the computational problems of Bayesian Statistics." (E. Novak, Metrika, May, 2003)
"This book provides a very good overview of the sequential Monte Carlo methods and contains many ideas on further research on methodologies and newer areas of application. ... It will be certainly a valuable reference book for students and researchers working in the area of on-line data analysis. ... the techniques discussed in this book are of great relevance to practitioners dealing with real time data." (Pradipta Sarkar, Technometrics, Vol. 45 (1), 2003)
Foreword | p. v |
Acknowledgments | p. vii |
Contributors | p. xxi |
Introduction | p. 1 |
An Introduction to Sequential Monte Carlo Methods | p. 3 |
Motivation | p. 3 |
Problem statement | p. 5 |
Monte Carlo methods | p. 6 |
Perfect Monte Carlo sampling | p. 7 |
Importance sampling | p. 8 |
The Bootstrap filter | p. 10 |
Discussion | p. 13 |
Theoretical Issues | p. 15 |
Particle Filters - A Theoretical Perspective | p. 17 |
Introduction | p. 17 |
Notation and terminology | p. 17 |
Markov chains and transition kernels | p. 18 |
The filtering problem | p. 19 |
Convergence of measure-valued random variables | p. 20 |
Convergence theorems | p. 21 |
The fixed observation case | p. 21 |
The random observation case | p. 24 |
Examples of particle filters | p. 25 |
Description of the particle filters | p. 25 |
Branching mechanisms | p. 28 |
Convergence of the algorithm | p. 31 |
Discussion | p. 33 |
Appendix | p. 35 |
Conditional probabilities and conditional expectations | p. 35 |
The recurrence formula for the conditional distribution of the signal | p. 38 |
Interacting Particle Filtering With Discrete Observations | p. 43 |
Introduction | p. 43 |
Nonlinear filtering: general facts | p. 46 |
An interacting particle system under Case A | p. 48 |
Subcase A1 | p. 48 |
Subcase A2 | p. 55 |
An interacting particle system under Case B | p. 60 |
Subcase B1 | p. 60 |
Subcase B2 | p. 67 |
Discretely observed stochastic differential equations | p. 71 |
Case A | p. 72 |
Case B | p. 73 |
Strategies for Improving Sequential Monte Carlo Methods | p. 77 |
Sequential Monte Carlo Methods for Optimal Filtering | p. 79 |
Introduction | p. 79 |
Bayesian filtering and sequential estimation | p. 79 |
Dynamic modelling and Bayesian filtering | p. 79 |
Alternative dynamic models | p. 80 |
Sequential Monte Carlo Methods | p. 82 |
Methodology | p. 82 |
A generic algorithm | p. 85 |
Convergence results | p. 86 |
Application to digital communications | p. 88 |
Model specification and estimation objectives | p. 89 |
SMC applied to demodulation | p. 91 |
Simulations | p. 93 |
Deterministic and Stochastic Particle Filters in State-Space Models | p. 97 |
Introduction | p. 97 |
General issues | p. 98 |
Model and exact filter | p. 98 |
Particle filters | p. 99 |
Gaussian quadrature | p. 100 |
Quadrature filters | p. 101 |
Numerical error | p. 102 |
A small illustrative example | p. 104 |
Case studies from ecology | p. 104 |
Problem area and models | p. 104 |
Quadrature filters in practice | p. 107 |
Numerical experiments | p. 110 |
Concluding remarks | p. 112 |
Appendix: Derivation of numerical errors | p. 114 |
RESAMPLE-MOVE Filtering with Cross-Model Jumps | p. 117 |
Introduction | p. 117 |
Problem statement | p. 118 |
The RESAMPLE-MOVE algorithm | p. 119 |
Comments | p. 124 |
Central limit theorem | p. 125 |
Dealing with model uncertainty | p. 126 |
Illustrative application | p. 129 |
Applying RESAMPLE-MOVE | p. 131 |
Simulation experiment | p. 134 |
Uncertainty about the type of target | p. 135 |
Conclusions | p. 138 |
Improvement Strategies for Monte Carlo Particle Filters | p. 139 |
Introduction | p. 139 |
General sequential importance sampling | p. 140 |
Markov chain moves | p. 143 |
The use of bridging densities with MCMC moves | p. 144 |
Simulation example: TVAR model in noise | p. 145 |
Particle filter algorithms for TVAR models | p. 146 |
Bootstrap (SIR) filter | p. 148 |
Auxiliary particle filter (APF) | p. 149 |
MCMC resampling | p. 150 |
Simulation results | p. 152 |
Summary | p. 157 |
Acknowledgements | p. 158 |
Approximating and Maximising the Likelihood for a General State-Space Model | p. 159 |
Introduction | p. 159 |
Bayesian methods | p. 159 |
Pointwise Monte Carlo approximation of the likelihood | p. 161 |
Examples | p. 161 |
Approximation of the likelihood function based on filter samples | p. 164 |
Approximations based on smoother samples | p. 166 |
Approximation of the likelihood function | p. 167 |
Stochastic EM-algorithm | p. 167 |
Comparison of the methods | p. 168 |
AR(1) process | p. 168 |
Nonlinear example, 3 parameters | p. 171 |
Nonlinear model, 5 parameters | p. 173 |
Discussion | p. 173 |
Recursive estimation | p. 173 |
Monte Carlo Smoothing and Self-Organising State-Space Model | p. 177 |
Introduction | p. 177 |
General state-space model and state estimation | p. 178 |
The model and the state estimation problem | p. 178 |
Non-Gaussian filter and smoother | p. 179 |
Monte Carlo filter and smoother | p. 180 |
Approximation of non-Gaussian distributions | p. 180 |
Monte Carlo filtering | p. 181 |
Derivation of the Monte Carlo filter | p. 182 |
Monte Carlo smoothing | p. 183 |
Non-Gaussian smoothing for the stochastic volatility model | p. 186 |
Nonlinear Smoothing | p. 188 |
Self-organising state-space models | p. 189 |
Likelihood of the model and parameter estimation | p. 189 |
Self-organising state-space model | p. 191 |
Examples | p. 192 |
Self-organising smoothing for the stochastic volatility model | p. 192 |
Time series with trend and stochastic volatility | p. 194 |
Conclusion | p. 195 |
Combined Parameter and State Estimation in Simulation-Based Filtering | p. 197 |
Introduction and historical perspective | p. 197 |
General framework | p. 199 |
Dynamic model and analysis perspective | p. 199 |
Filtering for states | p. 200 |
Filtering for states and parameters | p. 202 |
The treatment of model parameters | p. 202 |
Artificial evolution of parameters | p. 202 |
Kernel smoothing of parameters | p. 203 |
Reinterpreting artificial parameter evolutions | p. 204 |
A general algorithm | p. 206 |
Factor stochastic volatility modelling | p. 208 |
Discussion and future directions | p. 217 |
A Theoretical Framework for Sequential Importance Sampling with Resampling | p. 225 |
Introduction | p. 225 |
Sequential importance sampling principle | p. 227 |
Properly weighted sample | p. 227 |
Sequential build-up | p. 228 |
Operations for enhancing SIS | p. 229 |
Reweighting, resampling and reallocation | p. 230 |
Rejection control and partial rejection control | p. 231 |
Marginalisation | p. 234 |
Monte Carlo filter for state-space models | p. 234 |
The general state-space model | p. 235 |
Conditional dynamic linear model and the mixture Kalman filter | p. 236 |
Some examples | p. 237 |
A simple illustration | p. 237 |
Target tracking with MKF | p. 239 |
Discussion | p. 241 |
Acknowledgements | p. 242 |
Improving Regularised Particle Filters | p. 247 |
Introduction | p. 247 |
Particle filters | p. 249 |
The (classical) interacting particle filter (IPF) | p. 250 |
Regularised particle filters (RPF) | p. 251 |
Progressive correction | p. 255 |
Focus on the correction step | p. 256 |
Principle of progressive correction | p. 257 |
Adaptive choice of the decomposition | p. 258 |
The local rejection regularised particle filter (L2RPF) | p. 260 |
Description of the filter | p. 260 |
Computing the coefficient c_t^{(i)}(α_t) | p. 263 |
Applications to tracking problems | p. 264 |
Range and bearing | p. 265 |
Bearings-only | p. 266 |
Multiple model particle filter (MMPF) | p. 269 |
Auxiliary Variable Based Particle Filters | p. 273 |
Introduction | p. 273 |
Particle filters | p. 274 |
The definition of particle filters | p. 274 |
Sampling the empirical prediction density | p. 274 |
Weaknesses of particle filters | p. 276 |
Auxiliary variable | p. 277 |
The basics | p. 277 |
A generic SIR based auxiliary proposal | p. 278 |
Examples of adaption | p. 283 |
Fixed-lag filtering | p. 288 |
Reduced random sampling | p. 289 |
Basic ideas | p. 289 |
Simple outlier example | p. 290 |
Conclusion | p. 292 |
Acknowledgements | p. 293 |
Improved Particle Filters and Smoothing | p. 295 |
Introduction | p. 295 |
The methods | p. 296 |
The smooth bootstrap | p. 296 |
Adaptive importance sampling | p. 300 |
The kernel sampler of Hürzeler and Künsch | p. 302 |
Partially smooth bootstrap | p. 303 |
Roughening and sample augmentation | p. 305 |
Application of the methods in particle filtering and smoothing | p. 306 |
Application of smooth bootstrap procedures to a simple control problem | p. 308 |
Description of the problem | p. 308 |
An approach to the continuous-time version of the problem | p. 309 |
An adaptation of Titterington's method | p. 310 |
Probabilistic criterion 1 | p. 310 |
Probabilistic criterion 2: working directly with the cost | p. 311 |
Unknown variances | p. 311 |
Resampling implementation | p. 312 |
Simulation results | p. 314 |
Further work on this problem | p. 317 |
Applications | p. 319 |
Posterior Cramér-Rao Bounds for Sequential Estimation | p. 321 |
Introduction | p. 321 |
Review of the posterior Cramér-Rao bound | p. 322 |
Bounds for sequential estimation | p. 323 |
Estimation model | p. 324 |
Posterior Cramér-Rao bound | p. 325 |
Relative Monte Carlo evaluation | p. 327 |
Example: terrain navigation | p. 329 |
Conclusions | p. 338 |
Statistical Models of Visual Shape and Motion | p. 339 |
Introduction | p. 339 |
Statistical modelling of shape | p. 341 |
Statistical modelling of image observations | p. 343 |
Sampling methods | p. 345 |
Modelling dynamics | p. 346 |
Learning dynamics | p. 349 |
Particle filtering | p. 352 |
Dynamics with discrete states | p. 354 |
Conclusions | p. 355 |
Sequential Monte Carlo Methods for Neural Networks | p. 359 |
Introduction | p. 359 |
Model specification | p. 360 |
MLP models for regression and classification | p. 360 |
Variable dimension RBF models | p. 362 |
Estimation objectives | p. 365 |
General SMC algorithm | p. 366 |
Importance sampling step | p. 367 |
Selection step | p. 368 |
MCMC step | p. 369 |
Exact step | p. 371 |
On-line classification | p. 371 |
Simple classification example | p. 372 |
An application to fault detection in marine diesel engines | p. 373 |
An application to financial time series | p. 375 |
Conclusions | p. 379 |
Sequential Estimation of Signals under Model Uncertainty | p. 381 |
Introduction | p. 381 |
The problem of parameter estimation under uncertainty | p. 383 |
Sequential updating of the solution | p. 384 |
Sequential algorithm for computing the solution | p. 389 |
A Sequential-Importance-Resampling scheme | p. 390 |
Sequential sampling scheme based on mixtures | p. 395 |
Example | p. 397 |
Conclusions | p. 400 |
Acknowledgment | p. 400 |
Particle Filters for Mobile Robot Localization | p. 401 |
Introduction | p. 401 |
Monte Carlo localization | p. 403 |
Bayes filtering | p. 403 |
Models of robot motion and perception | p. 404 |
Implementation as particle filters | p. 405 |
Robot results | p. 408 |
Comparison to grid-based localization | p. 410 |
MCL with mixture proposal distributions | p. 414 |
The need for better sampling | p. 414 |
An alternative proposal distribution | p. 416 |
The mixture proposal distribution | p. 419 |
Robot results | p. 420 |
Multi-robot MCL | p. 423 |
Basic considerations | p. 423 |
Robot results | p. 425 |
Conclusion | p. 426 |
Self-Organizing Time Series Model | p. 429 |
Introduction | p. 429 |
Generalised state-space model | p. 429 |
Monte Carlo filter | p. 430 |
Self-organizing time series model | p. 432 |
Genetic algorithm filter | p. 432 |
Self-organizing state-space model | p. 434 |
Resampling scheme for filtering | p. 435 |
Selection scheme | p. 435 |
Comparison of performance: simulation study | p. 436 |
Application | p. 438 |
Time-varying frequency wave in small count data | p. 438 |
Self-organizing state-space model for time-varying frequency wave | p. 439 |
Results | p. 440 |
Conclusions | p. 444 |
Sampling in Factored Dynamic Systems | p. 445 |
Introduction | p. 445 |
Structured probabilistic models | p. 448 |
Bayesian networks | p. 448 |
Hybrid networks | p. 449 |
Dynamic Bayesian networks | p. 451 |
Particle filtering for DBNs | p. 454 |
Experimental results | p. 457 |
Conclusions | p. 464 |
In-Situ Ellipsometry Solutions Using Sequential Monte Carlo | p. 465 |
Introduction | p. 465 |
Application background | p. 465 |
State-space model | p. 467 |
Ellipsometry measurement model | p. 468 |
System evolution model | p. 471 |
Particle filter | p. 472 |
Results | p. 474 |
Conclusion | p. 475 |
Acknowledgments | p. 477 |
Manoeuvring Target Tracking Using a Multiple-Model Bootstrap Filter | p. 479 |
Introduction | p. 479 |
Optimal multiple-model solution | p. 481 |
The IMM algorithm | p. 483 |
Multiple model bootstrap filter | p. 484 |
Example | p. 486 |
Target tracking examples | p. 488 |
Target scenarios | p. 488 |
Linear, Gaussian tests | p. 488 |
Polar simulation results | p. 492 |
Conclusions | p. 495 |
Acknowledgments | p. 496 |
Rao-Blackwellised Particle Filtering for Dynamic Bayesian Networks | p. 499 |
Introduction | p. 499 |
RBPF in general | p. 500 |
How do we choose which nodes to sample? | p. 503 |
The RBPF algorithm in detail | p. 506 |
Application: concurrent localisation and map learning for a mobile robot | p. 508 |
Results on a one-dimensional grid | p. 511 |
Results on a two-dimensional grid | p. 514 |
Conclusions and future work | p. 515 |
Particles and Mixtures for Tracking and Guidance | p. 517 |
Introduction | p. 517 |
Guidance as a stochastic control problem | p. 518 |
Information state | p. 519 |
Dynamic programming and the dual effect | p. 520 |
Separability and certainty equivalence | p. 521 |
Sub-optimal strategies | p. 522 |
Derivation of control laws from particles | p. 523 |
Certainty equivalence control | p. 523 |
A scheme based on open-loop feedback control | p. 524 |
Guidance in the presence of intermittent spurious objects and clutter | p. 525 |
Introduction | p. 525 |
Problem formulation | p. 525 |
Simulation example | p. 526 |
Guidance results | p. 528 |
Monte Carlo Techniques for Automated Target Recognition | p. 533 |
Introduction | p. 533 |
The Bayesian posterior | p. 535 |
Inference engines | p. 536 |
Jump-diffusion sampling | p. 539 |
Diffusion Processes | p. 540 |
Jump processes | p. 541 |
Jump-diffusion algorithm | p. 544 |
Sensor models | p. 545 |
Experiments | p. 547 |
Acknowledgments | p. 552 |
Bibliography | p. 553 |
Index | p. 577 |
Table of Contents provided by Syndetics. All Rights Reserved.
ISBN-10: 0387951466
Series: Statistics for Engineering and Information Science
Audience: Professional
Language: English
Publisher: Springer-Verlag New York Inc.
Country of Publication: US
Dimensions (cm): 24.13 x 16.51 x 3.18
Weight (kg): 0.98