
A First Course in Bayesian Statistical Methods
By: Peter D. Hoff
Hardcover | 15 June 2009 | Edition Number 2009
At a Glance
284 Pages
23.9 x 16.5 x 2.2 (cm)
Hardcover
$129.00
Ships in 5 to 7 business days
This book provides a compact self-contained introduction to the theory and application of Bayesian statistical methods. The book is accessible to readers having a basic familiarity with probability, yet allows more advanced readers to quickly grasp the principles underlying Bayesian theory and methods. The examples and computer code allow the reader to understand and implement basic Bayesian data analyses using standard statistical models and to extend the standard models to specialized data analysis situations. The book begins with fundamental notions such as probability, exchangeability and Bayes' rule, and ends with modern topics such as variable selection in regression, generalized linear mixed effects models, and semiparametric copula estimation. Numerous examples from the social, biological and physical sciences show how to implement these methodologies in practice.
Monte Carlo summaries of posterior distributions play an important role in Bayesian data analysis. The open-source R statistical computing environment provides sufficient functionality to make Monte Carlo estimation very easy for a large number of statistical models, and example R code is provided throughout the text. Much of the example code can be run "as is" in R, and essentially all of it can be run after downloading the relevant datasets from the companion website for this book.
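To give a concrete flavour of the Monte Carlo summaries mentioned above, here is a minimal sketch in R. This is not code from the book; the prior, data, and number of draws are purely hypothetical, and the model (a conjugate beta-binomial) is chosen only because its posterior can be sampled directly with base R.

```r
# Minimal sketch: Monte Carlo approximation of posterior summaries
# in a conjugate beta-binomial model, using only base R.
# All numbers are hypothetical, chosen for illustration.
a <- 1; b <- 1                       # Beta(1, 1) prior on theta
y <- 12; n <- 20                     # data: 12 successes in 20 trials
S <- 10000                           # number of Monte Carlo draws

# The posterior is Beta(a + y, b + n - y); draw S samples from it.
theta <- rbeta(S, a + y, b + n - y)

mean(theta)                          # approximates the posterior mean
quantile(theta, c(0.025, 0.975))     # approximate 95% credible interval
mean(theta > 0.5)                    # approximates Pr(theta > 0.5 | data)
```

As the number of draws grows, these sample summaries converge to the corresponding posterior quantities; the same recipe extends to models where the posterior can only be sampled indirectly, for example via the Gibbs and Metropolis-Hastings algorithms covered later in the book.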
Industry Reviews
From the reviews:
"This is an excellent book for its intended audience: statisticians who wish to learn Bayesian methods. Although designed for a statistics audience, it would also be a good book for econometricians who have been trained in frequentist methods, but wish to learn Bayes. In relatively few pages, it takes the reader through a vast amount of material, beginning with deep issues in statistical methodology such as de Finetti's theorem, through the nitty-gritty of Bayesian computation, to sophisticated models such as generalized linear mixed effects models and copulas. And it does so in a simple manner, always drawing parallels and contrasts between Bayesian and frequentist methods, so as to allow the reader to see the similarities and differences with clarity." (Econometrics Journal)

"Generally, I think this is an excellent choice for a text for a one-semester Bayesian course. It provides a good overview of the basic tenets of Bayesian thinking for the common one- and two-parameter distributions and gives introductions to Bayesian regression, multivariate-response modeling, hierarchical modeling, and mixed effects models. The book includes an ample collection of exercises for all the chapters. A strength of the book is its good discussion of Gibbs sampling and Metropolis-Hastings algorithms. The author not only describes the MCMC algorithms but also provides insight into why they work. ... I believe this text would be an excellent choice for my Bayesian class since it seems to cover a good number of introductory topics and give the student a good introduction to the modern computational tools for Bayesian inference, with illustrations using R." (Journal of the American Statistical Association, June 2010, Vol. 105, No. 490)
"Statisticians and applied scientists. The book is accessible to readers having a basic familiarity with probability theory and grounding statistical methods. The author has succeeded in writing an acceptable introduction to the theory and application of Bayesian statistical methods which is modern and covers both the theory and practice. ... this book can be useful as a quick introduction to Bayesian methods for self study. In addition, I highly recommend this book as a text for a course for Bayesian statistics." (Lasse Koskinen, International Statistical Review, Vol. 78 (1), 2010)
"The book under review covers a balanced choice of topics ... presented with a focus on the interplay between Bayesian thinking and the underlying mathematical concepts. ... the book by Peter D. Hoff appears to be an excellent choice for a main reading in an introductory course. After studying this text the student can go in a direction of his liking at the graduate level." (Krzysztof ÅatuszyÅski, Mathematical Reviews, Issue 2011 m)
"The book is a good introductory treatment of methods of Bayes analysis. It should especially appeal to the reader who has had some statistical courses in estimation and modeling, and wants to understand the Bayesian interpretation of those methods. Also, readers who are primarily interested in modeling data and who are working in areas outside of statistics should find this to be a good reference book. ... should appeal to the reader who wants to keep with modern approaches to data analysis." (Richard P. Heydorn, Technometrics, Vol. 54 (1), February, 2012)
Table of Contents
| Chapter / Section | Page |
|---|---|
| Introduction and examples | p. 1 |
| Introduction | p. 1 |
| Why Bayes? | p. 2 |
| Estimating the probability of a rare event | p. 3 |
| Building a predictive model | p. 8 |
| Where we are going | p. 11 |
| Discussion and further references | p. 12 |
| Belief, probability and exchangeability | p. 13 |
| Belief functions and probabilities | p. 13 |
| Events, partitions and Bayes' rule | p. 14 |
| Independence | p. 17 |
| Random variables | p. 17 |
| Discrete random variables | p. 18 |
| Continuous random variables | p. 19 |
| Descriptions of distributions | p. 21 |
| Joint distributions | p. 23 |
| Independent random variables | p. 26 |
| Exchangeability | p. 27 |
| de Finetti's theorem | p. 29 |
| Discussion and further references | p. 30 |
| One-parameter models | p. 31 |
| The binomial model | p. 31 |
| Inference for exchangeable binary data | p. 35 |
| Confidence regions | p. 41 |
| The Poisson model | p. 43 |
| Posterior inference | p. 45 |
| Example: Birth rates | p. 48 |
| Exponential families and conjugate priors | p. 51 |
| Discussion and further references | p. 52 |
| Monte Carlo approximation | p. 53 |
| The Monte Carlo method | p. 53 |
| Posterior inference for arbitrary functions | p. 57 |
| Sampling from predictive distributions | p. 60 |
| Posterior predictive model checking | p. 62 |
| Discussion and further references | p. 65 |
| The normal model | p. 67 |
| The normal model | p. 67 |
| Inference for the mean, conditional on the variance | p. 69 |
| Joint inference for the mean and variance | p. 73 |
| Bias, variance and mean squared error | p. 79 |
| Prior specification based on expectations | p. 83 |
| The normal model for non-normal data | p. 84 |
| Discussion and further references | p. 86 |
| Posterior approximation with the Gibbs sampler | p. 89 |
| A semiconjugate prior distribution | p. 89 |
| Discrete approximations | p. 90 |
| Sampling from the conditional distributions | p. 92 |
| Gibbs sampling | p. 93 |
| General properties of the Gibbs sampler | p. 96 |
| Introduction to MCMC diagnostics | p. 98 |
| Discussion and further references | p. 104 |
| The multivariate normal model | p. 105 |
| The multivariate normal density | p. 105 |
| A semiconjugate prior distribution for the mean | p. 107 |
| The inverse-Wishart distribution | p. 109 |
| Gibbs sampling of the mean and covariance | p. 112 |
| Missing data and imputation | p. 115 |
| Discussion and further references | p. 123 |
| Group comparisons and hierarchical modeling | p. 125 |
| Comparing two groups | p. 125 |
| Comparing multiple groups | p. 130 |
| Exchangeability and hierarchical models | p. 131 |
| The hierarchical normal model | p. 132 |
| Posterior inference | p. 133 |
| Example: Math scores in U.S. public schools | p. 135 |
| Prior distributions and posterior approximation | p. 137 |
| Posterior summaries and shrinkage | p. 140 |
| Hierarchical modeling of means and variances | p. 143 |
| Analysis of math score data | p. 145 |
| Discussion and further references | p. 146 |
| Linear regression | p. 149 |
| The linear regression model | p. 149 |
| Least squares estimation for the oxygen uptake data | p. 153 |
| Bayesian estimation for a regression model | p. 154 |
| A semiconjugate prior distribution | p. 154 |
| Default and weakly informative prior distributions | p. 155 |
| Model selection | p. 160 |
| Bayesian model comparison | p. 163 |
| Gibbs sampling and model averaging | p. 167 |
| Discussion and further references | p. 170 |
| Nonconjugate priors and Metropolis-Hastings algorithms | p. 171 |
| Generalized linear models | p. 171 |
| The Metropolis algorithm | p. 173 |
| The Metropolis algorithm for Poisson regression | p. 179 |
| Metropolis, Metropolis-Hastings and Gibbs | p. 181 |
| The Metropolis-Hastings algorithm | p. 182 |
| Why does the Metropolis-Hastings algorithm work? | p. 184 |
| Combining the Metropolis and Gibbs algorithms | p. 187 |
| A regression model with correlated errors | p. 188 |
| Analysis of the ice core data | p. 191 |
| Discussion and further references | p. 192 |
| Linear and generalized linear mixed effects models | p. 195 |
| A hierarchical regression model | p. 195 |
| Full conditional distributions | p. 198 |
| Posterior analysis of the math score data | p. 200 |
| Generalized linear mixed effects models | p. 201 |
| A Metropolis-Gibbs algorithm for posterior approximation | p. 202 |
| Analysis of tumor location data | p. 203 |
| Discussion and further references | p. 207 |
| Latent variable methods for ordinal data | p. 209 |
| Ordered probit regression and the rank likelihood | p. 209 |
| Probit regression | p. 211 |
| Transformation models and the rank likelihood | p. 214 |
| The Gaussian copula model | p. 217 |
| Rank likelihood for copula estimation | p. 218 |
| Discussion and further references | p. 223 |
| Exercises | p. 225 |
| Common distributions | p. 253 |
| References | p. 259 |
| Index | p. 267 |
Table of Contents provided by Ingram. All Rights Reserved.
ISBN: 9780387922997
ISBN-10: 0387922997
Series: Springer Texts in Statistics
Published: 15th June 2009
Format: Hardcover
Number of Pages: 284
Audience: College, Tertiary and University
Publisher: Springer Nature B.V.
Country of Publication: GB
Edition Number: 2009
Dimensions (cm): 23.9 x 16.5 x 2.2
Weight (kg): 0.58
Shipping
| | Standard Shipping | Express Shipping |
|---|---|---|
| Metro postcodes | $9.99 | $14.95 |
| Regional postcodes | $9.99 | $14.95 |
| Rural postcodes | $9.99 | $14.95 |
Orders over $79.00 qualify for free shipping.
How to return your order
At Booktopia, we offer hassle-free returns in accordance with our returns policy. If you wish to return an item, please get in touch with Booktopia Customer Care.
Additional postage charges may be applicable.
Defective items
If there is a problem with any of the items received for your order then the Booktopia Customer Care team is ready to assist you.
For more info please visit our Help Centre.