An up-to-date, self-contained introduction to a state-of-the-art machine learning approach, Ensemble Methods: Foundations and Algorithms shows how these accurate methods are used in real-world tasks. It gives you the necessary groundwork to carry out further research in this evolving field.
After presenting background and terminology, the book covers the main algorithms and theories, including Boosting, Bagging, Random Forest, averaging and voting schemes, the Stacking method, mixture of experts, and diversity measures. It also discusses multiclass extensions, noise tolerance, error-ambiguity and bias-variance decompositions, and recent progress in information-theoretic diversity.
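To give a flavor of the core idea behind methods such as Bagging, here is a minimal, hand-rolled sketch of bootstrap aggregating with decision stumps. The toy dataset, threshold rule, and function names are illustrative assumptions for this sketch only; they are not taken from the book.

```python
import random

# Toy 1-D dataset: label is roughly 1 when x > 5, with one noisy point (4.5 -> 0).
X = [1, 2, 3, 4, 6, 7, 8, 9, 5.5, 4.5]
y = [0, 0, 0, 0, 1, 1, 1, 1, 1, 0]

def fit_stump(xs, ys):
    """Pick the threshold t that minimizes training error for the rule x > t."""
    best_t, best_err = None, float("inf")
    for t in xs:
        err = sum((x > t) != bool(label) for x, label in zip(xs, ys))
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def bagging_fit(X, y, n_estimators=11, seed=0):
    """Train one stump per bootstrap sample (sampling with replacement)."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_estimators):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        stumps.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return stumps

def bagging_predict(stumps, x):
    """Each stump votes; the ensemble returns the majority class."""
    votes = sum(x > t for t in stumps)
    return int(votes > len(stumps) / 2)

stumps = bagging_fit(X, y)
print(bagging_predict(stumps, 2.0), bagging_predict(stumps, 8.0))
```

Each individual stump is a weak, unstable learner, but averaging many of them over bootstrap resamples reduces variance; the book develops why and when this works, and extends the same combination ideas to Boosting, Random Forest, and Stacking.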
Moving on to more advanced topics, the author explains how to achieve better performance through ensemble pruning and how to generate better clustering results by combining multiple clusterings. In addition, he describes developments of ensemble methods in semi-supervised learning, active learning, cost-sensitive learning, class-imbalance learning, and comprehensibility enhancement.
"Professor Zhou's book is a comprehensive introduction to ensemble methods in machine learning. It reviews the latest research in this exciting area. I learned a lot reading it!" -Thomas G. Dietterich, Professor and Director of Intelligent Systems Research, Oregon State University, Corvallis, USA; ACM Fellow; and Founding President of the International Machine Learning Society

"This is a timely book. Right time and right book ... with an authoritative but inclusive style that will allow many readers to gain knowledge on the topic." -Fabio Roli, University of Cagliari, Italy
Series: Chapman & Hall/CRC Machine Learning & Pattern Recognition
Audience: Tertiary; University or College
Number Of Pages: 236
Published: 5th August 2012
Dimensions (cm): 23.5 x 15.6
Weight (kg): 0.48