| Preface to the Second Edition | p. xiii |
| Preface to the First Edition | p. xvii |
| Introduction | p. 1 |
| Preliminaries | p. 13 |
| Pseudometric Spaces, Packing and Covering Numbers | p. 13 |
| Pseudometric Spaces | p. 13 |
| Packing and Covering Numbers | p. 14 |
| Compact and Totally Bounded Sets | p. 16 |
| Probability Measures | p. 17 |
| Definition of a Probability Space | p. 17 |
| A Pseudometric Induced by a Probability Measure | p. 18 |
| A Metric on the Set of Probability Measures | p. 19 |
| Random Variables | p. 21 |
| Conditional Expectations | p. 23 |
| Large Deviation Type Inequalities | p. 24 |
| Chernoff Bounds | p. 24 |
| Chernoff-Okamoto Bound | p. 26 |
| Hoeffding's Inequality | p. 26 |
| Stochastic Processes, Almost Sure Convergence | p. 29 |
| Probability Measures on Infinite Cartesian Products | p. 29 |
| Stochastic Processes | p. 29 |
| The Borel-Cantelli Lemma and Almost Sure Convergence | p. 30 |
| Mixing Properties of Stochastic Processes | p. 33 |
| Definitions of Various Kinds of Mixing Coefficients | p. 34 |
| Inequalities for Mixing Processes | p. 36 |
| Problem Formulations | p. 43 |
| Uniform Convergence of Empirical Means | p. 43 |
| The UCEM Property | p. 43 |
| The UCEMUP Property | p. 52 |
| Extension to Dependent Input Sequences | p. 54 |
| Learning Concepts and Functions | p. 55 |
| Concept Learning | p. 55 |
| Function Learning | p. 64 |
| Extension to Dependent Input Sequences | p. 65 |
| Assumptions Underlying the Model of Learning | p. 66 |
| Alternate Notions of Learnability | p. 70 |
| Model-Free Learning | p. 76 |
| Problem Formulation | p. 76 |
| Relationship to the Uniform Convergence of Empirical Means | p. 81 |
| Preservation of UCEMUP and PAC Properties | p. 83 |
| Preservation of UCEMUP Property with Beta-Mixing Inputs | p. 84 |
| Law of Large Numbers Under Alpha-Mixing Inputs | p. 89 |
| Preservation of PAC Learning Property with Beta-Mixing Inputs | p. 94 |
| Preservation of PAC Learning Property with Beta-Mixing Inputs: Continued | p. 95 |
| Replacing $\mathcal{P}$ by its Closure | p. 97 |
| Markov Chains and Beta-Mixing | p. 100 |
| Geometric Ergodicity and Beta-Mixing | p. 100 |
| Beta-Mixing Properties of Markov Sequences | p. 105 |
| Mixing Properties of Hidden Markov Models | p. 110 |
| Vapnik-Chervonenkis, Pseudo- and Fat-Shattering Dimensions | p. 115 |
| Definitions | p. 115 |
| The Vapnik-Chervonenkis Dimension | p. 115 |
| The Pseudo-Dimension | p. 120 |
| The Fat-Shattering Dimension | p. 122 |
| Bounds on Growth Functions | p. 123 |
| Growth Functions of Collections of Sets | p. 123 |
| Bounds on Covering Numbers Based on the Pseudo-Dimension | p. 128 |
| Metric Entropy Bounds for Families of Functions | p. 132 |
| Bounds on Covering Numbers Based on the Fat-Shattering Dimension | p. 139 |
| Growth Functions of Iterated Families | p. 141 |
| Uniform Convergence of Empirical Means | p. 149 |
| Restatement of the Problems Under Study | p. 149 |
| Equivalence of the UCEM and ASCEM Properties | p. 153 |
| Main Theorems | p. 155 |
| Preliminary Lemmas | p. 161 |
| Theorem 5.1: Proof of Necessity | p. 173 |
| Theorem 5.1: Proof of Sufficiency | p. 178 |
| Proofs of the Remaining Theorems | p. 190 |
| Uniform Convergence Properties of Iterated Families | p. 194 |
| Boolean Operations on Collections of Sets | p. 195 |
| Uniformly Continuous Mappings on Families of Functions | p. 196 |
| Families of Loss Functions | p. 200 |
| Learning Under a Fixed Probability Measure | p. 207 |
| Introduction | p. 207 |
| UCEM Property Implies ASEC Learnability | p. 209 |
| Finite Metric Entropy Implies Learnability | p. 216 |
| Consistent Learnability | p. 224 |
| Consistent PAC Learnability | p. 224 |
| Consistent PUAC Learnability | p. 226 |
| Examples | p. 230 |
| Learnable Concept Classes Have Finite Metric Entropy | p. 236 |
| Model-Free Learning | p. 242 |
| A Sufficient Condition for Learnability | p. 244 |
| A Necessary Condition | p. 248 |
| Dependent Inputs | p. 250 |
| Finite Metric Entropy and Alpha-Mixing Input Sequences | p. 250 |
| Consistent Learnability and Beta-Mixing Input Sequences | p. 251 |
| Distribution-Free Learning | p. 255 |
| Uniform Convergence of Empirical Means | p. 255 |
| Function Classes | p. 256 |
| Concept Classes | p. 258 |
| Loss Functions | p. 261 |
| Function Learning | p. 263 |
| Finite P-Dimension Implies PAC and PUAC Learnability | p. 264 |
| Finite P-Dimension is not Necessary for PAC Learnability | p. 267 |
| Concept Learning | p. 269 |
| Improved Upper Bound for the Sample Complexity | p. 269 |
| A Universal Lower Bound for the Sample Complexity | p. 273 |
| Learnability Implies Finite VC-Dimension | p. 278 |
| Learnability of Functions with a Finite Range | p. 280 |
| Learning Under an Intermediate Family of Probabilities | p. 285 |
| General Families of Probabilities | p. 287 |
| Uniform Convergence of Empirical Means | p. 287 |
| Function Learning | p. 288 |
| Concept Learning | p. 292 |
| Totally Bounded Families of Probabilities | p. 297 |
| Families of Probabilities with a Nonempty Interior | p. 308 |
| Alternate Models of Learning | p. 311 |
| Efficient Learning | p. 312 |
| Definition of Efficient Learnability | p. 313 |
| The Complexity of Finding a Consistent Hypothesis | p. 317 |
| Active Learning | p. 326 |
| Fixed-Distribution Learning | p. 329 |
| Distribution-Free Learning | p. 332 |
| Learning with Prior Information: Necessary and Sufficient Conditions | p. 335 |
| Definition of Learnability with Prior Information | p. 335 |
| Some Simple Sufficient Conditions | p. 337 |
| Dispersability of Function Classes | p. 341 |
| Connections Between Dispersability and Learnability WPI | p. 344 |
| Distribution-Free Learning with Prior Information | p. 348 |
| Learning with Prior Information: Bounds on Learning Rates | p. 352 |
| Applications to Neural Networks | p. 365 |
| What is a Neural Network? | p. 366 |
| Learning in Neural Networks | p. 369 |
| Problem Formulation | p. 369 |
| Reprise of Sample Complexity Estimates | p. 372 |
| Complexity-Theoretic Limits to Learnability | p. 377 |
| Estimates of VC-Dimensions of Families of Networks | p. 381 |
| Multi-Layer Perceptron Networks | p. 382 |
| A Network with Infinite VC-Dimension | p. 388 |
| Neural Networks as Verifiers of Formulas | p. 390 |
| Neural Networks with Piecewise-Polynomial Activation Functions | p. 396 |
| A General Approach | p. 402 |
| An Improved Bound | p. 406 |
| Networks with Pfaffian Activation Functions | p. 410 |
| Results Based on Order-Minimality | p. 413 |
| Structural Risk Minimization | p. 415 |
| Applications to Control Systems | p. 421 |
| Randomized Algorithms for Robustness Analysis | p. 421 |
| Introduction to Robust Control | p. 421 |
| Some NP-Hard Problems in Robust Control | p. 424 |
| Randomized Algorithms for Robustness Analysis | p. 426 |
| Randomized Algorithms for Robust Controller Synthesis: General Approach | p. 429 |
| Paradigm of Robust Controller Synthesis Problem | p. 429 |
| Various Types of "Near" Minima | p. 432 |
| A General Approach to Randomized Algorithms | p. 435 |
| Two Algorithms for Finding Probably Approximate Near Minima | p. 436 |
| VC-Dimension Estimates for Problems in Robust Controller Synthesis | p. 438 |
| A General Result | p. 438 |
| Robust Stabilization | p. 438 |
| Weighted H∞-Norm Minimization | p. 441 |
| Weighted H2-Norm Minimization | p. 444 |
| Sample Complexity Considerations | p. 445 |
| Robust Controller Design Using Randomized Algorithms: An Example | p. 449 |
| A Learning Theory Approach to System Identification | p. 453 |
| Problem Formulation | p. 453 |
| A General Result | p. 455 |
| Sufficient Conditions for the UCEM Property | p. 458 |
| Bounds on the P-Dimension | p. 461 |
| Some Open Problems | p. 465 |