| Preface to the second edition | p. xi |
| Preface to the first edition | p. xiii |
| Introduction | p. 1 |
| Chance and information | p. 1 |
| Mathematical models of chance phenomena | p. 2 |
| Mathematical structure and mathematical proof | p. 5 |
| Plan of this book | p. 7 |
| Combinatorics | p. 10 |
| Counting | p. 10 |
| Arrangements | p. 11 |
| Combinations | p. 13 |
| Multinomial coefficients | p. 16 |
| The gamma function | p. 18 |
| Exercises | p. 19 |
| Further reading | p. 21 |
| Sets and measures | p. 22 |
| The concept of a set | p. 22 |
| Set operations | p. 25 |
| Boolean algebras | p. 29 |
| Measures on Boolean algebras | p. 32 |
| Exercises | p. 37 |
| Further reading | p. 40 |
| Probability | p. 41 |
| The concept of probability | p. 41 |
| Probability in practice | p. 43 |
| Conditional probability | p. 48 |
| Independence | p. 55 |
| The interpretation of probability | p. 57 |
| The historical roots of probability | p. 62 |
| Exercises | p. 64 |
| Further reading | p. 68 |
| Discrete random variables | p. 70 |
| The concept of a random variable | p. 70 |
| Properties of random variables | p. 72 |
| Expectation and variance | p. 78 |
| Covariance and correlation | p. 83 |
| Independent random variables | p. 86 |
| I.I.D. random variables | p. 89 |
| Binomial and Poisson random variables | p. 91 |
| Geometric, negative binomial and hypergeometric random variables | p. 95 |
| Exercises | p. 99 |
| Further reading | p. 104 |
| Information and entropy | p. 105 |
| What is information? | p. 105 |
| Entropy | p. 108 |
| Joint and conditional entropies; mutual information | p. 111 |
| The maximum entropy principle | p. 115 |
| Entropy, physics and life | p. 117 |
| The uniqueness of entropy | p. 119 |
| Exercises | p. 123 |
| Further reading | p. 125 |
| Communication | p. 127 |
| Transmission of information | p. 127 |
| The channel capacity | p. 130 |
| Codes | p. 132 |
| Noiseless coding | p. 137 |
| Coding and transmission with noise - Shannon's theorem | p. 143 |
| Brief remarks about the history of information theory | p. 150 |
| Exercises | p. 151 |
| Further reading | p. 153 |
| Random variables with probability density functions | p. 155 |
| Random variables with continuous ranges | p. 155 |
| Probability density functions | p. 157 |
| Discretisation and integration | p. 161 |
| Laws of large numbers | p. 164 |
| Normal random variables | p. 167 |
| The central limit theorem | p. 172 |
| Entropy in the continuous case | p. 179 |
| Exercises | p. 182 |
| Further reading | p. 186 |
| Random vectors | p. 188 |
| Cartesian products | p. 188 |
| Boolean algebras and measures on products | p. 191 |
| Distributions of random vectors | p. 193 |
| Marginal distributions | p. 199 |
| Independence revisited | p. 201 |
| Conditional densities and conditional entropy | p. 204 |
| Mutual information and channel capacity | p. 208 |
| Exercises | p. 212 |
| Further reading | p. 216 |
| Markov chains and their entropy | p. 217 |
| Stochastic processes | p. 217 |
| Markov chains | p. 219 |
| The Chapman-Kolmogorov equations | p. 224 |
| Stationary processes | p. 227 |
| Invariant distributions and stationary Markov chains | p. 229 |
| Entropy rates for Markov chains | p. 235 |
| Exercises | p. 240 |
| Further reading | p. 243 |
| Exploring further | p. 245 |
| Proof by mathematical induction | p. 247 |
| Lagrange multipliers | p. 249 |
| Integration of exp(-x²/2) | p. 252 |
| Table of probabilities associated with the standard normal distribution | p. 254 |
| A rapid review of matrix algebra | p. 256 |
| Selected solutions | p. 260 |
| Index | p. 268 |