Neural Network Learning

Theoretical Foundations

Martin Anthony and Peter L. Bartlett

Hardcover

Published: 13th November 1999
Ships: 7 to 10 business days
$253.50
or 4 easy payments of $63.38


This important work describes recent theoretical advances in the study of artificial neural networks. It explores probabilistic models of supervised learning problems and addresses the key statistical and computational questions. Chapters survey research on pattern classification with binary-output networks, including a discussion of the relevance of the Vapnik-Chervonenkis dimension and of estimates of that dimension for several neural network models. In addition, Anthony and Bartlett develop a model of classification by real-output networks and demonstrate the usefulness of classification with a "large margin." The authors explain the role of scale-sensitive versions of the Vapnik-Chervonenkis dimension in large-margin classification and in real prediction. Key chapters also discuss the computational complexity of neural network learning, describing a variety of hardness results and outlining two efficient, constructive learning algorithms. The book is self-contained and accessible to researchers and graduate students in computer science, engineering, and mathematics.
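The Vapnik-Chervonenkis dimension central to the blurb can be illustrated on a toy function class. The sketch below is not from the book (the function names are mine); it brute-forces whether one-dimensional threshold classifiers x >= t can realise every labelling of a point set, showing they shatter any single point but no pair of points, i.e. the class has VC-dimension 1:

```python
from itertools import product

def thresholds_realizable(points, labels):
    """Can some threshold t label each x as 1 iff x >= t?"""
    pts = sorted(points)
    # Only thresholds below all points, between consecutive points,
    # or above all points can produce distinct labellings.
    cands = [pts[0] - 1] + [(a + b) / 2 for a, b in zip(pts, pts[1:])] + [pts[-1] + 1]
    return any(
        all((x >= t) == bool(y) for x, y in zip(points, labels))
        for t in cands
    )

def shattered(points):
    """A set is shattered if every 0/1 labelling is realizable."""
    return all(thresholds_realizable(points, labs)
               for labs in product([0, 1], repeat=len(points)))

print(shattered([0.0]))       # True: any one point is shattered
print(shattered([0.0, 1.0]))  # False: the labelling (1, 0) is unrealizable
```

The unrealizable labelling would require a threshold that is both at most 0 and greater than 1; the same exhaustive style of argument, scaled up, is what the book's growth-function and VC-dimension chapters make precise.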

'The book is a useful and readable monograph. For beginners it is a nice introduction to the subject, for experts a valuable reference.' Zentralblatt MATH

Introduction
Pattern Recognition with Binary-output Neural Networks
  The pattern recognition problem
  The growth function and VC-dimension
  General upper bounds on sample complexity
  General lower bounds
  The VC-dimension of linear threshold networks
  Bounding the VC-dimension using geometric techniques
  VC-dimension bounds for neural networks
Pattern Recognition with Real-output Neural Networks
  Classification with real values
  Covering numbers and uniform convergence
  The pseudo-dimension and fat-shattering dimension
  Bounding covering numbers with dimensions
  The sample complexity of classification learning
  The dimensions of neural networks
  Model selection
Learning Real-Valued Functions
  Learning classes of real functions
  Uniform convergence results for real function classes
  Bounding covering numbers
  The sample complexity of learning function classes
  Convex classes
  Other learning problems
Algorithmics
  Efficient learning
  Learning as optimisation
  The Boolean perceptron
  Hardness results for feed-forward networks
  Constructive learning algorithms for two-layered networks
Table of Contents provided by Publisher. All Rights Reserved.

ISBN: 9780521573535
ISBN-10: 052157353X
Audience: Professional
Format: Hardcover
Language: English
Number Of Pages: 404
Published: 13th November 1999
Publisher: Cambridge University Press
Country of Publication: GB
Dimensions (cm): 23.44 x 15.82 x 2.67
Weight (kg): 0.64