
Bayesian Logical Data Analysis for the Physical Sciences
A Comparative Approach with Mathematica Support
By: Phil Gregory
Hardcover | 27 April 2005
At a Glance
488 Pages
25.4 x 17.78 x 2.69 cm
New Edition
Hardcover
RRP $237.95
$207.99
13% off
Ships in 5 to 7 business days
| Preface | p. xiii |
| Software support | p. xv |
| Acknowledgements | p. xvii |
| Role of probability theory in science | p. 1 |
| Scientific inference | p. 1 |
| Inference requires a probability theory | p. 2 |
| The two rules for manipulating probabilities | p. 4 |
| Usual form of Bayes' theorem | p. 5 |
| Discrete hypothesis space | p. 5 |
| Continuous hypothesis space | p. 6 |
| Bayes' theorem - model of the learning process | p. 7 |
| Example of the use of Bayes' theorem | p. 8 |
| Probability and frequency | p. 10 |
| Example: incorporating frequency information | p. 11 |
| Marginalization | p. 12 |
| The two basic problems in statistical inference | p. 15 |
| Advantages of the Bayesian approach | p. 16 |
| Problems | p. 17 |
| Probability theory as extended logic | p. 21 |
| Overview | p. 21 |
| Fundamentals of logic | p. 21 |
| Logical propositions | p. 21 |
| Compound propositions | p. 22 |
| Truth tables and Boolean algebra | p. 22 |
| Deductive inference | p. 24 |
| Inductive or plausible inference | p. 25 |
| Brief history | p. 25 |
| An adequate set of operations | p. 26 |
| Examination of a logic function | p. 27 |
| Operations for plausible inference | p. 29 |
| The desiderata of Bayesian probability theory | p. 30 |
| Development of the product rule | p. 30 |
| Development of sum rule | p. 34 |
| Qualitative properties of product and sum rules | p. 36 |
| Uniqueness of the product and sum rules | p. 37 |
| Summary | p. 39 |
| Problems | p. 39 |
| The how-to of Bayesian inference | p. 41 |
| Overview | p. 41 |
| Basics | p. 41 |
| Parameter estimation | p. 43 |
| Nuisance parameters | p. 45 |
| Model comparison and Occam's razor | p. 45 |
| Sample spectral line problem | p. 50 |
| Background information | p. 50 |
| Odds ratio | p. 52 |
| Choice of prior p(T|M₁, I) | p. 53 |
| Calculation of p(D|M₁, T, I) | p. 55 |
| Calculation of p(D|M₂, I) | p. 58 |
| Odds, uniform prior | p. 58 |
| Odds, Jeffreys prior | p. 58 |
| Parameter estimation problem | p. 59 |
| Sensitivity of odds to T_max | p. 59 |
| Lessons | p. 61 |
| Ignorance priors | p. 63 |
| Systematic errors | p. 65 |
| Systematic error example | p. 66 |
| Problems | p. 69 |
| Assigning probabilities | p. 72 |
| Introduction | p. 72 |
| Binomial distribution | p. 72 |
| Bernoulli's law of large numbers | p. 75 |
| The gambler's coin problem | p. 75 |
| Bayesian analysis of an opinion poll | p. 77 |
| Multinomial distribution | p. 79 |
| Can you really answer that question? | p. 80 |
| Logical versus causal connections | p. 82 |
| Exchangeable distributions | p. 83 |
| Poisson distribution | p. 85 |
| Bayesian and frequentist comparison | p. 87 |
| Constructing likelihood functions | p. 89 |
| Deterministic model | p. 90 |
| Probabilistic model | p. 91 |
| Summary | p. 93 |
| Problems | p. 94 |
| Frequentist statistical inference | p. 96 |
| Overview | p. 96 |
| The concept of a random variable | p. 96 |
| Sampling theory | p. 97 |
| Probability distributions | p. 98 |
| Descriptive properties of distributions | p. 100 |
| Relative line shape measures for distributions | p. 101 |
| Standard random variable | p. 102 |
| Other measures of central tendency and dispersion | p. 103 |
| Median baseline subtraction | p. 104 |
| Moment generating functions | p. 105 |
| Some discrete probability distributions | p. 107 |
| Binomial distribution | p. 107 |
| The Poisson distribution | p. 109 |
| Negative binomial distribution | p. 112 |
| Continuous probability distributions | p. 113 |
| Normal distribution | p. 113 |
| Uniform distribution | p. 116 |
| Gamma distribution | p. 116 |
| Beta distribution | p. 117 |
| Negative exponential distribution | p. 118 |
| Central Limit Theorem | p. 119 |
| Bayesian demonstration of the Central Limit Theorem | p. 120 |
| Distribution of the sample mean | p. 124 |
| Signal averaging example | p. 125 |
| Transformation of a random variable | p. 125 |
| Random and pseudo-random numbers | p. 127 |
| Pseudo-random number generators | p. 131 |
| Tests for randomness | p. 132 |
| Summary | p. 136 |
| Problems | p. 137 |
| What is a statistic? | p. 139 |
| Introduction | p. 139 |
| The χ² distribution | p. 141 |
| Sample variance S² | p. 143 |
| The Student's t distribution | p. 147 |
| F distribution (F-test) | p. 150 |
| Confidence intervals | p. 152 |
| Variance σ² known | p. 152 |
| Confidence intervals for μ, unknown variance | p. 156 |
| Confidence intervals: difference of two means | p. 158 |
| Confidence intervals for σ² | p. 159 |
| Confidence intervals: ratio of two variances | p. 159 |
| Summary | p. 160 |
| Problems | p. 161 |
| Frequentist hypothesis testing | p. 162 |
| Overview | p. 162 |
| Basic idea | p. 162 |
| Hypothesis testing with the χ² statistic | p. 163 |
| Hypothesis test on the difference of two means | p. 167 |
| One-sided and two-sided hypothesis tests | p. 170 |
| Are two distributions the same? | p. 172 |
| Pearson χ² goodness-of-fit test | p. 173 |
| Comparison of two binned data sets | p. 177 |
| Problem with frequentist hypothesis testing | p. 177 |
| Bayesian resolution to optional stopping problem | p. 179 |
| Problems | p. 181 |
| Maximum entropy probabilities | p. 184 |
| Overview | p. 184 |
| The maximum entropy principle | p. 185 |
| Shannon's theorem | p. 186 |
| Alternative justification of MaxEnt | p. 187 |
| Generalizing MaxEnt | p. 190 |
| Incorporating a prior | p. 190 |
| Continuous probability distributions | p. 191 |
| How to apply the MaxEnt principle | p. 191 |
| Lagrange multipliers of variational calculus | p. 191 |
| MaxEnt distributions | p. 192 |
| General properties | p. 192 |
| Uniform distribution | p. 194 |
| Exponential distribution | p. 195 |
| Normal and truncated Gaussian distributions | p. 197 |
| Multivariate Gaussian distribution | p. 202 |
| MaxEnt image reconstruction | p. 203 |
| The kangaroo justification | p. 203 |
| MaxEnt for uncertain constraints | p. 206 |
| Pixon multiresolution image reconstruction | p. 208 |
| Problems | p. 211 |
| Bayesian inference with Gaussian errors | p. 212 |
| Overview | p. 212 |
| Bayesian estimate of a mean | p. 212 |
| Mean: known noise σ | p. 213 |
| Mean: known noise, unequal σ | p. 217 |
| Mean: unknown noise σ | p. 218 |
| Bayesian estimate of σ | p. 224 |
| Is the signal variable? | p. 227 |
| Comparison of two independent samples | p. 228 |
| Do the samples differ? | p. 230 |
| How do the samples differ? | p. 233 |
| Results | p. 233 |
| The difference in means | p. 236 |
| Ratio of the standard deviations | p. 237 |
| Effect of the prior ranges | p. 239 |
| Summary | p. 240 |
| Problems | p. 241 |
| Linear model fitting (Gaussian errors) | p. 243 |
| Overview | p. 243 |
| Parameter estimation | p. 244 |
| Most probable amplitudes | p. 249 |
| More powerful matrix formulation | p. 253 |
| Regression analysis | p. 256 |
| The posterior is a Gaussian | p. 257 |
| Joint credible regions | p. 260 |
| Model parameter errors | p. 264 |
| Marginalization and the covariance matrix | p. 264 |
| Correlation coefficient | p. 268 |
| More on model parameter errors | p. 272 |
| Correlated data errors | p. 273 |
| Model comparison with Gaussian posteriors | p. 275 |
| Frequentist testing and errors | p. 279 |
| Other model comparison methods | p. 281 |
| Summary | p. 283 |
| Problems | p. 284 |
| Nonlinear model fitting | p. 287 |
| Introduction | p. 287 |
| Asymptotic normal approximation | p. 288 |
| Laplacian approximations | p. 291 |
| Bayes factor | p. 291 |
| Marginal parameter posteriors | p. 293 |
| Finding the most probable parameters | p. 294 |
| Simulated annealing | p. 296 |
| Genetic algorithm | p. 297 |
| Iterative linearization | p. 298 |
| Levenberg-Marquardt method | p. 300 |
| Marquardt's recipe | p. 301 |
| Mathematica example | p. 302 |
| Model comparison | p. 304 |
| Marginal and projected distributions | p. 306 |
| Errors in both coordinates | p. 307 |
| Summary | p. 309 |
| Problems | p. 309 |
| Markov chain Monte Carlo | p. 312 |
| Overview | p. 312 |
| Metropolis-Hastings algorithm | p. 313 |
| Why does Metropolis-Hastings work? | p. 319 |
| Simulated tempering | p. 321 |
| Parallel tempering | p. 321 |
| Example | p. 322 |
| Model comparison | p. 326 |
| Towards an automated MCMC | p. 330 |
| Extrasolar planet example | p. 331 |
| Model probabilities | p. 335 |
| Results | p. 337 |
| MCMC robust summary statistic | p. 342 |
| Summary | p. 346 |
| Problems | p. 349 |
| Bayesian revolution in spectral analysis | p. 352 |
| Overview | p. 352 |
| New insights on the periodogram | p. 352 |
| How to compute p(f|D, I) | p. 356 |
| Strong prior signal model | p. 358 |
| No specific prior signal model | p. 360 |
| X-ray astronomy example | p. 362 |
| Radio astronomy example | p. 363 |
| Generalized Lomb-Scargle periodogram | p. 365 |
| Relationship to Lomb-Scargle periodogram | p. 367 |
| Example | p. 367 |
| Non-uniform sampling | p. 370 |
| Problems | p. 373 |
| Bayesian inference with Poisson sampling | p. 376 |
| Overview | p. 376 |
| Infer a Poisson rate | p. 377 |
| Summary of posterior | p. 378 |
| Signal + known background | p. 379 |
| Analysis of ON/OFF measurements | p. 380 |
| Estimating the source rate | p. 381 |
| Source detection question | p. 384 |
| Time-varying Poisson rate | p. 386 |
| Problems | p. 388 |
| Singular value decomposition | p. 389 |
| Discrete Fourier Transforms | p. 392 |
| Overview | p. 392 |
| Orthogonal and orthonormal functions | p. 392 |
| Fourier series and integral transform | p. 394 |
| Fourier series | p. 395 |
| Fourier transform | p. 396 |
| Convolution and correlation | p. 398 |
| Convolution theorem | p. 399 |
| Correlation theorem | p. 400 |
| Importance of convolution in science | p. 401 |
| Waveform sampling | p. 403 |
| Nyquist sampling theorem | p. 404 |
| Astronomy example | p. 406 |
| Discrete Fourier Transform | p. 407 |
| Graphical development | p. 407 |
| Mathematical development of the DFT | p. 409 |
| Inverse DFT | p. 410 |
| Applying the DFT | p. 411 |
| DFT as an approximate Fourier transform | p. 411 |
| Inverse discrete Fourier transform | p. 413 |
| The Fast Fourier Transform | p. 415 |
| Discrete convolution and correlation | p. 417 |
| Deconvolving a noisy signal | p. 418 |
| Deconvolution with an optimal Wiener filter | p. 420 |
| Treatment of end effects by zero padding | p. 421 |
| Accurate amplitudes by zero padding | p. 422 |
| Power-spectrum estimation | p. 424 |
| Parseval's theorem and power spectral density | p. 424 |
| Periodogram power-spectrum estimation | p. 425 |
| Correlation spectrum estimation | p. 426 |
| Discrete power spectral density estimation | p. 428 |
| Discrete form of Parseval's theorem | p. 428 |
| One-sided discrete power spectral density | p. 429 |
| Variance of periodogram estimate | p. 429 |
| Yule's stochastic spectrum estimation model | p. 431 |
| Reduction of periodogram variance | p. 431 |
| Problems | p. 432 |
| Difference in two samples | p. 434 |
| Outline | p. 434 |
| Probabilities of the four hypotheses | p. 434 |
| Evaluation of p(C, S|D₁, D₂, I) | p. 434 |
| Evaluation of p(C̄, S|D₁, D₂, I) | p. 436 |
| Evaluation of p(C, S̄|D₁, D₂, I) | p. 438 |
| Evaluation of p(C̄, S̄|D₁, D₂, I) | p. 439 |
| The difference in the means | p. 439 |
| The two-sample problem | p. 440 |
| The Behrens-Fisher problem | p. 441 |
| The ratio of the standard deviations | p. 442 |
| Estimating the ratio, given the means are the same | p. 442 |
| Estimating the ratio, given the means are different | p. 443 |
| Poisson ON/OFF details | p. 445 |
| Derivation of p(s|N_on, I) | p. 445 |
| Evaluation of Num | p. 446 |
| Evaluation of Den | p. 447 |
| Derivation of the Bayes factor B_{s+b,b} | p. 448 |
| Multivariate Gaussian from maximum entropy | p. 450 |
| References | p. 455 |
| Index | p. 461 |
| Table of Contents provided by Ingram. All Rights Reserved. |
ISBN: 9780521841504
ISBN-10: 052184150X
Published: 27th April 2005
Format: Hardcover
Language: English
Number of Pages: 488
Audience: College, Tertiary and University
Publisher: Cambridge University Press
Country of Publication: GB
Dimensions (cm): 25.4 x 17.78 x 2.69
Weight (kg): 1.14
Shipping
| | Standard Shipping | Express Shipping |
|---|---|---|
| Metro postcodes | $9.99 | $14.95 |
| Regional postcodes | $9.99 | $14.95 |
| Rural postcodes | $9.99 | $14.95 |
Orders over $79.00 qualify for free shipping.
