
Probabilistic Networks and Expert Systems

Exact Computational Methods for Bayesian Networks


Published: 20th May 2003
Ships: 7 to 10 business days
RRP $471.99


Winner of the 2002 DeGroot Prize, awarded by the International Society for Bayesian Analysis to a book judged to represent an important, timely, thorough, and notably original contribution to the statistics literature.

Probabilistic expert systems are graphical networks that support the modelling of uncertainty and decisions in large complex domains, while retaining ease of calculation. Building on original research by the authors over a number of years, this book gives a thorough and rigorous mathematical treatment of the underlying ideas, structures, and algorithms, emphasizing those cases in which exact answers are obtainable. It covers both the updating of probabilistic uncertainty in the light of new evidence, and statistical inference about unknown probabilities or unknown model structure in the light of new data.

The book will be of interest to researchers and graduate students in artificial intelligence who desire an understanding of the mathematical and statistical basis of probabilistic expert systems, and to students and research workers in statistics wanting an introduction to this fascinating and rapidly developing field. The careful attention to detail will also make this work an important reference source for all those involved in the theory and applications of probabilistic expert systems.

Robert G. Cowell is a Lecturer in the Faculty of Actuarial Science and Statistics of the Sir John Cass Business School, City of London. He has been working in the field of probabilistic expert systems for over a decade, and has published a number of research and tutorial articles in the area.

A. Philip Dawid is Pearson Professor of Statistics at University College London. He has served as Editor of the Journal of the Royal Statistical Society (Series B) and of Biometrika, and as President of the International Society for Bayesian Analysis. He holds the Royal Statistical Society Guy Medal in Bronze and in Silver, and the G. W. Snedecor Award for the Best Publication in Biometry.

Steffen L. Lauritzen is Professor of Mathematics and Statistics at the University of Aalborg. He has served as Editor of the Scandinavian Journal of Statistics. He holds the Royal Statistical Society Guy Medal in Silver and is an Honorary Fellow of the same society. He has, jointly with David J. Spiegelhalter, received the American Statistical Association's award for an "Outstanding Statistical Application."

David J. Spiegelhalter is a Senior Scientist at the MRC Biostatistics Unit in the Cambridge University Institute of Public Health. He has published extensively on Bayesian methodology and applications, and holds the Royal Statistical Society Guy Medal in Bronze and in Silver.

From the reviews:


"This important book fills a void in the graphical Markov models literature. The authors have summarized their extensive and influential work in this area and provided a valuable resource both for educators and for practitioners."

Preface p. v
Introduction p. 1
What is this book about? p. 1
What is in this book? p. 2
What is not in this book? p. 3
How should this book be used? p. 4
Logic, Uncertainty, and Probability p. 5
What is an expert system? p. 5
Diagnostic decision trees p. 6
Production systems p. 7
Coping with uncertainty p. 8
The naive probabilistic approach p. 10
Interpretations of probability p. 11
Axioms p. 13
Bayes' theorem p. 14
Bayesian reasoning in expert systems p. 17
A broader context for probabilistic expert systems p. 21
Building and Using Probabilistic Networks p. 25
Graphical modelling of the domain p. 26
Qualitative modelling p. 27
Probabilistic modelling p. 28
Quantitative modelling p. 29
Further background to the elicitation process p. 29
From specification to inference engine p. 31
Moralization p. 31
From moral graph to junction tree p. 33
The inference process p. 34
The clique-marginal representation p. 36
Incorporation of evidence p. 36
Bayesian networks as expert systems p. 37
Background references and further reading p. 40
Structuring the graph p. 40
Specifying the probability distribution p. 40
Graph Theory p. 43
Basic concepts p. 43
Chordal and decomposable graphs p. 49
Junction trees p. 52
From chain graph to junction tree p. 55
Triangulation p. 57
Elimination tree p. 59
Background references and further reading p. 61
Markov Properties on Graphs p. 63
Conditional independence p. 63
Markov fields over undirected graphs p. 66
Markov properties on directed acyclic graphs p. 70
Markov properties on chain graphs p. 75
Current research directions p. 79
Markov equivalence p. 79
Other graphical representations p. 80
Background references and further reading p. 80
Discrete Networks p. 83
An illustration of local computation p. 84
Definitions p. 85
Basic operations p. 86
Local computation on the junction tree p. 87
Graphical specification p. 87
Numerical specification and initialization p. 87
Charges p. 88
Flow of information between adjacent cliques p. 88
Active flows p. 89
Reaching equilibrium p. 90
Scheduling of flows p. 92
Two-phase propagation p. 92
Entering and propagating evidence p. 93
A propagation example p. 95
Generalized marginalization operations p. 95
Maximization p. 97
Degeneracy of the most probable configuration p. 99
Simulation p. 99
Finding the M most probable configurations p. 101
Sampling without replacement p. 103
Fast retraction p. 104
Moments of functions p. 106
Example: Ch-Asia p. 109
Description p. 109
Graphical specification p. 109
Numerical specification p. 109
Initialization p. 112
Propagation without evidence p. 114
Propagation with evidence p. 114
Max-propagation p. 119
Dealing with large cliques p. 120
Truncating small numbers p. 121
Splitting cliques p. 122
Current research directions and further reading p. 123
Gaussian and Mixed Discrete-Gaussian Networks p. 125
CG distributions p. 126
Basic operations on CG potentials p. 127
Marked graphs and their junction trees p. 131
Decomposition of marked graphs p. 131
Junction trees with strong roots p. 133
Model specification p. 135
Operating in the junction tree p. 137
Initializing the junction tree p. 138
Charges p. 138
Entering evidence p. 139
Flow of information between adjacent cliques p. 139
Two-phase propagation p. 141
A simple Gaussian example p. 143
Example: Waste p. 144
Structural specification p. 145
Numerical specification p. 146
Strong triangulation p. 147
Forming the junction tree p. 148
Initializing the junction tree p. 148
Entering evidence p. 149
Complexity considerations p. 150
Numerical instability problems p. 151
Exact marginal densities p. 152
Current research directions p. 152
Background references and further reading p. 153
Discrete Multistage Decision Networks p. 155
The nature of multistage decision problems p. 156
Solving the decision problem p. 157
Decision potentials p. 159
Network specification and solution p. 163
Structural and numerical specification p. 163
Causal consistency lemma p. 165
Making the elimination tree p. 166
Initializing the elimination tree p. 167
Message passing in the elimination tree p. 168
Proof of elimination tree solution p. 169
Example: Oil Wildcatter p. 172
Specification p. 172
Making the elimination tree p. 175
Initializing the elimination tree p. 176
Collecting evidence p. 177
Example: Dec-Asia p. 177
Triangulation issues p. 183
Asymmetric problems p. 184
Background references and further reading p. 187
Learning About Probabilities p. 189
Statistical modelling and parameter learning p. 189
Parametrizing a directed Markov model p. 190
Maximum likelihood with complete data p. 192
Bayesian updating with complete data p. 193
Priors for DAG models p. 193
Specifying priors: An example p. 197
Updating priors with complete data: An example p. 199
Incomplete data p. 200
Sequential and batch methods p. 201
Maximum likelihood with incomplete data p. 202
The EM algorithm p. 202
Penalized EM algorithm p. 204
Bayesian updating with incomplete data p. 204
Exact theory p. 206
Retaining global independence p. 207
Retaining local independence p. 209
Reducing the mixtures p. 211
Simulation results: full mixture reduction p. 213
Simulation results: partial mixture reduction p. 214
Using Gibbs sampling for learning p. 216
Hyper Markov laws for undirected models p. 221
Current research directions and further reading p. 222
Checking Models Against Data p. 225
Scoring rules p. 226
Standardization p. 227
Parent-child monitors p. 229
Batch monitors p. 232
Missing data p. 233
Node monitors p. 234
Global monitors p. 235
Example: Child p. 236
Simulation experiments p. 238
Further reading p. 241
Structural Learning p. 243
Purposes of modelling p. 244
Inference about models p. 244
Criteria for comparing models p. 245
Maximized likelihood p. 246
Predictive assessment p. 247
Marginal likelihood p. 248
Model probabilities p. 249
Model selection and model averaging p. 250
Graphical models and conditional independence p. 251
Classes of models p. 253
Models containing only observed quantities p. 253
Models with latent or hidden variables p. 254
Missing data p. 255
Handling multiple models p. 256
Search strategies p. 256
Probability specification p. 258
Prior information on parameters p. 260
Variable precision p. 261
Epilogue p. 265
Conjugate Analysis for Discrete Data p. 267
Bernoulli process p. 267
Multinomial process p. 269
Gibbs Sampling p. 271
Gibbs sampling p. 271
Sampling from the moral graph p. 273
General probability densities p. 274
Further reading p. 275
Information and Software on the World Wide Web p. 277
Information about probabilistic networks p. 277
Software for probabilistic networks p. 279
Markov chain Monte Carlo methods p. 280
Bibliography p. 281
Author Index p. 307
Subject Index p. 313
Table of Contents provided by Syndetics. All Rights Reserved.

ISBN: 9780387987675
ISBN-10: 0387987673
Series: Statistics for Engineering and Information Science
Audience: Professional
Format: Hardcover
Language: English
Number Of Pages: 324
Published: 20th May 2003
Publisher: Springer-Verlag New York Inc.
Country of Publication: US
Dimensions (cm): 24.13 x 15.88 x 1.91
Weight (kg): 0.64