Hyperparameter tuning is crucial for getting the most out of your machine learning models, but it can be tedious and time-consuming if done manually. In this code-driven video, learn how to effectively and efficiently tune hyperparameters like learning rate, number of trees, and tree depth using the tidymodels packages tune and finetune in R.
I'll demonstrate how to set up a grid search to methodically explore the hyperparameter space and zero in on the best model configurations. Using the penguins dataset, we'll build gradient boosted tree models with XGBoost and fine-tune hyperparameters like learning rate and number of leaves to improve model performance.
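As a taste of what's covered, here is a minimal sketch of a tidymodels grid-search setup on the penguins data. It assumes the tidymodels and palmerpenguins packages are installed; the formula, resampling scheme, and grid size are illustrative placeholders, not the exact code from the video.

```r
# Minimal grid-search sketch (illustrative; not the video's exact code)
library(tidymodels)
library(palmerpenguins)

penguins_df <- drop_na(penguins)
folds <- vfold_cv(penguins_df, v = 5, strata = species)

# XGBoost spec with three hyperparameters marked for tuning
xgb_spec <- boost_tree(
  trees = tune(),
  tree_depth = tune(),
  learn_rate = tune()
) %>%
  set_engine("xgboost") %>%
  set_mode("classification")

xgb_wf <- workflow() %>%
  add_formula(species ~ bill_length_mm + bill_depth_mm + flipper_length_mm) %>%
  add_model(xgb_spec)

# Fit the grid of candidate models across the resamples
xgb_res <- tune_grid(xgb_wf, resamples = folds, grid = 10)
show_best(xgb_res, metric = "roc_auc")
```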
I also share tips on leveraging Latin hypercube designs and parallel processing to make hyperparameter tuning faster and more effective. If you want to squeeze more performance out of your machine learning models in R, boosting their predictive power through systematic hyperparameter tuning, this video is for you.
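For the speed-ups mentioned above, a rough sketch of combining a space-filling grid, racing via finetune, and a parallel backend might look like the following. It assumes a tuned workflow (`xgb_wf`) and resamples (`folds`) already exist, as in a standard tidymodels grid-search setup; the parameter ranges and grid size are placeholders.

```r
# Sketch: Latin hypercube grid + ANOVA racing, run in parallel
# (assumes xgb_wf and folds from an earlier tuning setup)
library(tidymodels)
library(finetune)
library(doParallel)

cl <- makePSOCKcluster(parallel::detectCores() - 1)
registerDoParallel(cl)

# Space-filling design over the tunable parameters
grid <- grid_latin_hypercube(
  trees(range = c(100L, 1000L)),
  tree_depth(),
  learn_rate(),
  size = 20
)

# Racing drops clearly losing candidates early instead of
# evaluating every candidate on every resample
race_res <- tune_race_anova(
  xgb_wf,
  resamples = folds,
  grid = grid
)

stopCluster(cl)
show_best(race_res, metric = "roc_auc")
```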
Slides: https://jameshwade.quarto.pub/hyperparameters-tuning-with-tidymodels/
Code: https://github.com/jameshwade/r-mlops
Simon Couch's tutorial on speeding up hyperparameter tuning: https://www.simonpcouch.com/blog/parallel-racing/
00:00 - Intro
00:42 - A Warning & Context
03:00 - Specifying a Model
04:20 - Building a Tuning Grid
05:30 - Fit Models & Tune Hyperparameters
06:45 - Tuning Results
08:05 - Racing Hyperparameters
10:18 - Going Faster in Parallel
13:29 - Summary and Next Steps
#r #mlops #machinelearning #tidymodels #tidyverse #model #rstats #datascience #ai #dataanalytics #hyperparameter #tuning #modelperformance #data #ml #tune #finetune