
Statistical Methods in Experimental Physics (2nd Edition)
By: Frederick James
Hardcover | 4 December 2006 | Edition Number 2
At a Glance
364 Pages
Revised
22.86 x 15.24 x 2.06 cm
New Edition
Hardcover
RRP $131.99
$118.99
10% OFF
or 4 interest-free payments of $29.75
Ships in 15 to 25 business days
The first edition of this classic book has become the authoritative reference for physicists desiring to master the finer points of statistical data analysis. This second edition contains all the important material of the first, much of it unavailable from any other sources. In addition, many chapters have been updated with considerable new material, especially in areas concerning the theory and practice of confidence intervals, including the important Feldman-Cousins method. Both frequentist and Bayesian methodologies are presented, with a strong emphasis on techniques useful to physicists and other scientists in the interpretation of experimental data and comparison with scientific theories. This is a valuable textbook for advanced graduate students in the physical sciences as well as a reference for active researchers.
Table of Contents
| Section | Page |
|---|---|
| Preface to the Second Edition | p. v |
| Preface to the First Edition | p. vii |
| Introduction | p. 1 |
| Outline | p. 1 |
| Language | p. 2 |
| Two Philosophies | p. 3 |
| Notation | p. 4 |
| Basic Concepts in Probability | p. 9 |
| Definitions of Probability | p. 9 |
| Mathematical probability | p. 10 |
| Frequentist probability | p. 10 |
| Bayesian probability | p. 11 |
| Properties of Probability | p. 12 |
| Addition law for sets of elementary events | p. 12 |
| Conditional probability and independence | p. 13 |
| Example of the addition law: scanning efficiency | p. 14 |
| Bayes theorem for discrete events | p. 15 |
| Bayesian use of Bayes theorem | p. 16 |
| Random variable | p. 17 |
| Continuous Random Variables | p. 18 |
| Probability density function | p. 19 |
| Change of variable | p. 20 |
| Cumulative, marginal and conditional distributions | p. 21 |
| Bayes theorem for continuous variables | p. 22 |
| Bayesian use of Bayes theorem for continuous variables | p. 22 |
| Properties of Distributions | p. 24 |
| Expectation, mean and variance | p. 24 |
| Covariance and correlation | p. 26 |
| Linear functions of random variables | p. 28 |
| Ratio of random variables | p. 30 |
| Approximate variance formulae | p. 32 |
| Moments | p. 33 |
| Characteristic Function | p. 34 |
| Definition and properties | p. 34 |
| Cumulants | p. 37 |
| Probability generating function | p. 38 |
| Sums of a random number of random variables | p. 39 |
| Invariant measures | p. 41 |
| Convergence and the Law of Large Numbers | p. 43 |
| The Tchebycheff Theorem and Its Corollary | p. 43 |
| Tchebycheff theorem | p. 43 |
| Bienayme-Tchebycheff inequality | p. 44 |
| Convergence | p. 45 |
| Convergence in distribution | p. 45 |
| The Paul Levy theorem | p. 46 |
| Convergence in probability | p. 46 |
| Stronger types of convergence | p. 47 |
| The Law of Large Numbers | p. 47 |
| Monte Carlo integration | p. 48 |
| The Central Limit theorem | p. 49 |
| Example: Gaussian (Normal) random number generator | p. 51 |
| Probability Distributions | p. 53 |
| Discrete Distributions | p. 53 |
| Binomial distribution | p. 53 |
| Multinomial distribution | p. 56 |
| Poisson distribution | p. 57 |
| Compound Poisson distribution | p. 60 |
| Geometric distribution | p. 62 |
| Negative binomial distribution | p. 63 |
| Continuous Distributions | p. 64 |
| Normal one-dimensional (univariate Gaussian) | p. 64 |
| Normal many-dimensional (multivariate Gaussian) | p. 67 |
| Chi-square distribution | p. 70 |
| Student's t-distribution | p. 73 |
| Fisher-Snedecor F and Z distributions | p. 77 |
| Uniform distribution | p. 79 |
| Triangular distribution | p. 79 |
| Beta distribution | p. 80 |
| Exponential distribution | p. 82 |
| Gamma distribution | p. 83 |
| Cauchy, or Breit-Wigner, distribution | p. 84 |
| Log-Normal distribution | p. 85 |
| Extreme value distribution | p. 87 |
| Weibull distribution | p. 89 |
| Double exponential distribution | p. 89 |
| Asymptotic relationships between distributions | p. 90 |
| Handling of Real Life Distributions | p. 91 |
| General applicability of the Normal distribution | p. 91 |
| Johnson empirical distributions | p. 92 |
| Truncation | p. 93 |
| Experimental resolution | p. 94 |
| Examples of variable experimental resolution | p. 95 |
| Information | p. 99 |
| Basic Concepts | p. 100 |
| Likelihood function | p. 100 |
| Statistic | p. 100 |
| Information of R.A. Fisher | p. 101 |
| Definition of information | p. 101 |
| Properties of information | p. 101 |
| Sufficient Statistics | p. 103 |
| Sufficiency | p. 103 |
| Examples | p. 104 |
| Minimal sufficient statistics | p. 105 |
| Darmois theorem | p. 106 |
| Information and Sufficiency | p. 108 |
| Example of Experimental Design | p. 109 |
| Decision Theory | p. 111 |
| Basic Concepts in Decision Theory | p. 112 |
| Subjective probability, Bayesian approach | p. 112 |
| Definitions and terminology | p. 113 |
| Choice of Decision Rules | p. 114 |
| Classical choice: pre-ordering rules | p. 114 |
| Bayesian choice | p. 115 |
| Minimax decisions | p. 116 |
| Decision-theoretic Approach to Classical Problems | p. 117 |
| Point estimation | p. 117 |
| Interval estimation | p. 118 |
| Tests of hypotheses | p. 118 |
| Examples: Adjustment of an Apparatus | p. 121 |
| Adjustment given an estimate of the apparatus performance | p. 121 |
| Adjustment with estimation of the optimum adjustment | p. 123 |
| Conclusion: Indeterminacy in Classical and Bayesian Decisions | p. 124 |
| Theory of Estimators | p. 127 |
| Basic Concepts in Estimation | p. 127 |
| Consistency and convergence | p. 128 |
| Bias and consistency | p. 129 |
| Usual Methods of Constructing Consistent Estimators | p. 130 |
| The moments method | p. 131 |
| Implicitly defined estimators | p. 132 |
| The maximum likelihood method | p. 135 |
| Least squares methods | p. 137 |
| Asymptotic Distributions of Estimates | p. 139 |
| Asymptotic Normality | p. 139 |
| Asymptotic expansion of moments of estimates | p. 141 |
| Asymptotic bias and variance of the usual estimators | p. 144 |
| Information and the Precision of an Estimator | p. 146 |
| Lower bounds for the variance - Cramer-Rao inequality | p. 147 |
| Efficiency and minimum variance | p. 149 |
| Cramer-Rao inequality for several parameters | p. 151 |
| The Gauss-Markov theorem | p. 152 |
| Asymptotic efficiency | p. 153 |
| Bayesian Inference | p. 154 |
| Choice of prior density | p. 154 |
| Bayesian inference about the Poisson parameter | p. 156 |
| Priors closed under sampling | p. 157 |
| Bayesian inference about the mean, when the variance is known | p. 157 |
| Bayesian inference about the variance, when the mean is known | p. 159 |
| Bayesian inference about the mean and the variance | p. 161 |
| Summary of Bayesian inference for Normal parameters | p. 162 |
| Point Estimation in Practice | p. 163 |
| Choice of Estimator | p. 163 |
| Desirable properties of estimators | p. 164 |
| Compromise between statistical merits | p. 165 |
| Cures to obtain simplicity | p. 166 |
| Economic considerations | p. 168 |
| The Method of Moments | p. 170 |
| Orthogonal functions | p. 170 |
| Comparison of likelihood and moments methods | p. 172 |
| The Maximum Likelihood Method | p. 173 |
| Summary of properties of maximum likelihood | p. 173 |
| Example: determination of the lifetime of a particle in a restricted volume | p. 175 |
| Academic example of a poor maximum likelihood estimate | p. 177 |
| Constrained parameters in maximum likelihood | p. 179 |
| The Least Squares Method (Chi-Square) | p. 182 |
| The linear model | p. 183 |
| The polynomial model | p. 185 |
| Constrained parameters in the linear model | p. 186 |
| Normally distributed data in nonlinear models | p. 190 |
| Estimation from histograms; comparison of likelihood and least squares methods | p. 191 |
| Weights and Detection Efficiency | p. 193 |
| Ideal method: maximum likelihood | p. 194 |
| Approximate method for handling weights | p. 196 |
| Exclusion of events with large weight | p. 199 |
| Least squares method | p. 201 |
| Reduction of Bias | p. 204 |
| Exact distribution of the estimate known | p. 204 |
| Exact distribution of the estimate unknown | p. 206 |
| Robust (Distribution-free) Estimation | p. 207 |
| Robust estimation of the centre of a distribution | p. 208 |
| Trimming and Winsorization | p. 210 |
| Generalized pth-power norms | p. 211 |
| Estimates of location for asymmetric distributions | p. 213 |
| Interval Estimation | p. 215 |
| Normally distributed data | p. 216 |
| Confidence intervals for the mean | p. 216 |
| Confidence intervals for several parameters | p. 218 |
| Interpretation of the covariance matrix | p. 223 |
| The General Case in One Dimension | p. 225 |
| Confidence intervals and belts | p. 225 |
| Upper limits, lower limits and flip-flopping | p. 227 |
| Unphysical values and empty intervals | p. 229 |
| The unified approach | p. 229 |
| Confidence intervals for discrete data | p. 231 |
| Use of the Likelihood Function | p. 233 |
| Parabolic log-likelihood function | p. 233 |
| Non-parabolic log-likelihood functions | p. 234 |
| Profile likelihood regions in many parameters | p. 236 |
| Use of Asymptotic Approximations | p. 238 |
| Asymptotic Normality of the maximum likelihood estimate | p. 238 |
| Asymptotic Normality of ∂ ln L/∂θ | p. 238 |
| ∂ ln L/∂θ confidence regions in many parameters | p. 240 |
| Finite sample behaviour of three general methods of interval estimation | p. 240 |
| Summary: Confidence Intervals and the Ensemble | p. 246 |
| The Bayesian Approach | p. 248 |
| Confidence intervals and credible intervals | p. 249 |
| Summary: Bayesian or frequentist intervals? | p. 250 |
| Test of Hypotheses | p. 253 |
| Formulation of a Test | p. 254 |
| Basic concepts in testing | p. 254 |
| Example: Separation of two classes of events | p. 255 |
| Comparison of Tests | p. 257 |
| Power | p. 257 |
| Consistency | p. 259 |
| Bias | p. 260 |
| Choice of tests | p. 261 |
| Test of Simple Hypotheses | p. 263 |
| The Neyman-Pearson test | p. 263 |
| Example: Normal theory test versus sign test | p. 264 |
| Tests of Composite Hypotheses | p. 266 |
| Existence of a uniformly most powerful test for the exponential family | p. 267 |
| One- and two-sided tests | p. 268 |
| Maximizing local power | p. 269 |
| Likelihood Ratio Test | p. 270 |
| Test statistic | p. 270 |
| Asymptotic distribution for continuous families of hypotheses | p. 271 |
| Asymptotic power for continuous families of hypotheses | p. 273 |
| Examples | p. 274 |
| Small sample behaviour | p. 279 |
| Example of separate families of hypotheses | p. 282 |
| General methods for testing separate families | p. 285 |
| Tests and Decision Theory | p. 287 |
| Bayesian choice between families of distributions | p. 287 |
| Sequential tests for optimum number of observations | p. 292 |
| Sequential probability ratio test for a continuous family of hypotheses | p. 297 |
| Summary of Optimal Tests | p. 298 |
| Goodness-of-Fit Tests | p. 299 |
| GOF Testing: From the Test Statistic to the P-value | p. 299 |
| Pearson's Chi-square Test for Histograms | p. 301 |
| Moments of the Pearson statistic | p. 302 |
| Chi-square test with estimation of parameters | p. 303 |
| Choosing optimal bin size | p. 304 |
| Other Tests on Binned Data | p. 308 |
| Runs test | p. 308 |
| Empty cell test, order statistics | p. 309 |
| Neyman-Barton smooth test | p. 311 |
| Tests Free of Binning | p. 313 |
| Smirnov-Cramer-von Mises test | p. 314 |
| Kolmogorov test | p. 316 |
| More refined tests based on the EDF | p. 317 |
| Use of the likelihood function | p. 317 |
| Applications | p. 318 |
| Observation of a fine structure | p. 318 |
| Combining independent estimates | p. 323 |
| Comparing distributions | p. 327 |
| Combining Independent Tests | p. 330 |
| Independence of tests | p. 330 |
| Significance level of the combined test | p. 331 |
| References | p. 335 |
| Subject Index | p. 341 |
Table of Contents provided by Ingram. All Rights Reserved.
ISBN: 9789812567956
ISBN-10: 981256795X
Published: 4th December 2006
Format: Hardcover
Language: English
Number of Pages: 364
Audience: College, Tertiary and University
Publisher: World Scientific Publishing Co Pte Ltd
Country of Publication: SG
Edition Number: 2
Edition Type: Revised
Dimensions (cm): 22.86 x 15.24 x 2.06
Weight (kg): 0.67
Shipping
| | Standard Shipping | Express Shipping |
|---|---|---|
| Metro postcodes: | $9.99 | $14.95 |
| Regional postcodes: | $9.99 | $14.95 |
| Rural postcodes: | $9.99 | $14.95 |
Orders over $79.00 qualify for free shipping.
How to return your order
At Booktopia, we offer hassle-free returns in accordance with our returns policy. If you wish to return an item, please get in touch with Booktopia Customer Care.
Additional postage charges may be applicable.
Defective items
If there is a problem with any of the items received for your order, the Booktopia Customer Care team is ready to assist you.
For more info please visit our Help Centre.
This product is categorised by
- Non-Fiction > Engineering & Technology > Technology in General > Instruments & Instrumentation Engineering
- Non-Fiction > Mathematics > Optimisation
- Non-Fiction > Mathematics > Probability & Statistics
- Non-Fiction > Science > Science in General > Scientific Equipment
- Non-Fiction > Science > Physics
- Non-Fiction > Science > Science in General > Maths for Scientists
- Booktopia Publisher Services > World Scientific Publishing