Take your machine learning models to the next level by learning how to leverage hyperparameter tuning, allowing you to control the model's finest details.
Key Features
- Gain a deep understanding of how hyperparameter tuning works
- Explore exhaustive search, heuristic search, and Bayesian and multi-fidelity optimization methods
- Learn which tuning method to use for a given situation or problem
Book Description
Hyperparameters are an important element in building useful machine learning models. This book curates numerous hyperparameter tuning methods for Python, one of the most popular coding languages for machine learning. Alongside in-depth explanations of how each method works, you will use a decision map that can help you identify the best tuning method for your requirements.
We will start the book with an introduction to hyperparameter tuning and explain why it's important. You'll learn the best methods for hyperparameter tuning for a variety of use cases and specific algorithm types. The book will not only cover the usual grid or random search but also other powerful underdog methods. Dedicated chapters give full attention to the four main groups of hyperparameter tuning methods: exhaustive search, heuristic search, Bayesian Optimization, and Multi-Fidelity Optimization.
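To make the distinction concrete, here is a minimal sketch of the two most familiar exhaustive-search strategies, grid search and random search, using a toy quadratic in place of a real train-and-validate cycle (the objective, parameter names, and candidate ranges are illustrative assumptions, not examples from the book):

```python
import itertools
import random

# Toy stand-in for a real validation loop: assume we are tuning two
# hyperparameters of some model and want to minimize validation loss.
def validation_loss(learning_rate, regularization):
    return (learning_rate - 0.1) ** 2 + (regularization - 1.0) ** 2

# Grid search: exhaustively evaluate every combination of candidate values.
lr_grid = [0.001, 0.01, 0.1, 1.0]
reg_grid = [0.1, 1.0, 10.0]
grid_best = min(
    itertools.product(lr_grid, reg_grid),
    key=lambda params: validation_loss(*params),
)

# Random search: sample the space a fixed number of times instead.
random.seed(0)
random_trials = [
    (random.uniform(0.001, 1.0), random.uniform(0.1, 10.0)) for _ in range(6)
]
random_best = min(random_trials, key=lambda params: validation_loss(*params))

print("grid search best:", grid_best)
print("random search best:", random_best)
```

Grid search finds the optimum here because it happens to lie on the grid; random search trades that guarantee for a fixed, controllable evaluation budget. The book's later chapters cover methods that spend such a budget far more intelligently.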
Later in the book, you will learn about top frameworks like Scikit, Hyperopt, Optuna, NNI, and DEAP to implement hyperparameter tuning. Finally, we will cover hyperparameters of popular algorithms and best practices that will help you efficiently tune your hyperparameters.
By the end of the book, you will have the skills you need to take full control of your machine learning models and get the best possible results.
What you will learn
- Discover hyperparameter space and types of hyperparameter distributions
- Explore manual, grid, and random search, and the pros and cons of each
- Learn powerful underdog methods along with best practices
- Explore the hyperparameters of popular algorithms
- See how to tune hyperparameters in different frameworks and libraries
- Deep dive into top frameworks Scikit, Hyperopt, Optuna, NNI, and DEAP
- Apply best practices to your machine learning models right away
Who This Book Is For
The book is intended for data scientists and ML engineers who are working with Python and want to further boost their ML model's performance by utilizing the appropriate hyperparameter tuning method.
The reader will need to have a basic understanding of ML and how to code in Python but will require no prior knowledge of hyperparameter tuning in Python.
Table of Contents
- Evaluating Machine Learning Models
- Introduction to Hyperparameter Tuning
- Exhaustive Search
- Bayesian Optimization
- Heuristic Search
- Exploring Multi-Fidelity Optimization
- Hyperparameter Tuning via Scikit
- Hyperparameter Tuning via Hyperopt
- Hyperparameter Tuning via Optuna
- Advanced Hyperparameter Tuning with DEAP and Microsoft NNI
- Understanding Hyperparameters of Popular Algorithms
- The Hyperparameter Space x Training Time (HST) Decision Map
- Tracking Hyperparameter Tuning Experiments
- Conclusions and Next Steps