Hyperparameter Optimization for Machine Learning Course
Learn grid and random search, Bayesian optimization, multi-fidelity models, Optuna, Hyperopt, Scikit-Optimize & more.
Welcome to Hyperparameter Optimization for Machine Learning. In this course, you will learn multiple techniques to select the best hyperparameters and improve the performance of your machine learning models.
If you regularly train machine learning models, as a hobby or for your organization, and want to improve their performance; if you are keen to climb the leaderboard of a data science competition; or if you simply want to learn more about tuning the hyperparameters of machine learning models, this course will show you how.
We’ll take you step by step through engaging video tutorials and teach you everything you need to know about hyperparameter tuning. Throughout this comprehensive course, we cover almost every available approach to hyperparameter optimization, discussing the rationale behind each technique, its advantages and shortcomings, the considerations when applying it, and its implementation in Python.
Best Seller Course: Math 0-1: Calculus for Data Science & Machine Learning
What you’ll learn
- Hyperparameter tuning and why it matters
- Cross-validation and nested cross-validation
- Hyperparameter tuning with grid and random search
- Bayesian Optimization
- Tree-structured Parzen estimators, population-based training and SMAC
- Hyperparameter tuning tools, e.g., Hyperopt, Optuna, Scikit-Optimize, Keras Tuner and others
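To give a flavor of the first techniques on the list, here is a minimal sketch of grid search versus random search over a toy objective, using only the Python standard library. The `score` function and its hyperparameters (`lr`, `depth`) are hypothetical stand-ins for a model's cross-validation score; in practice you would use a library such as scikit-learn, Hyperopt or Optuna.

```python
import itertools
import random

# Toy stand-in for a model's validation score given a learning
# rate and a tree depth (hypothetical objective for illustration).
def score(lr, depth):
    return -(lr - 0.1) ** 2 - (depth - 4) ** 2 / 100

# Grid search: evaluate every combination of candidate values.
lrs = [0.01, 0.1, 1.0]
depths = [2, 4, 8]
grid_best = max(itertools.product(lrs, depths), key=lambda p: score(*p))

# Random search: sample hyperparameters from ranges instead of a fixed grid,
# which often explores the space more efficiently for the same budget.
rng = random.Random(0)
samples = [(rng.uniform(0.01, 1.0), rng.randint(2, 8)) for _ in range(20)]
rand_best = max(samples, key=lambda p: score(*p))

print("grid best:", grid_best)
print("random best:", rand_best)
```

The course covers when each strategy pays off; notably, random search tends to beat grid search when only a few hyperparameters actually matter.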
Recommended Course: The Complete Visual Guide to Machine Learning & Data Science