In this video we will cover 3 different methods for hyperparameter tuning in XGBoost, with a short code sketch of each after the list below. These include:
1. Grid Search
2. Randomized Search
3. Bayesian Optimization
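As a quick preview, here is a minimal sketch of what each method can look like in code, using XGBoost's scikit-learn wrapper together with scikit-learn's GridSearchCV/RandomizedSearchCV and scikit-optimize's BayesSearchCV. The linked notebook may use different libraries, grids, and settings; the search spaces below are illustrative only:

```python
# Minimal sketch of the 3 tuning methods (illustrative search spaces only;
# the video's notebook may use different libraries and parameter ranges)
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from skopt import BayesSearchCV  # scikit-optimize: one possible Bayesian tuner
from xgboost import XGBClassifier

X, y = make_classification(n_samples=500, random_state=0)
model = XGBClassifier(eval_metric="logloss", random_state=0)

# 1. Grid Search: exhaustively evaluates every combination in the grid
grid = GridSearchCV(
    model,
    param_grid={"max_depth": [3, 5, 7], "learning_rate": [0.01, 0.1, 0.3]},
    cv=3,
).fit(X, y)

# 2. Randomized Search: samples a fixed number of random combinations
rand = RandomizedSearchCV(
    model,
    param_distributions={
        "max_depth": [3, 5, 7],
        "learning_rate": np.linspace(0.01, 0.3, 30),
    },
    n_iter=10,
    cv=3,
    random_state=0,
).fit(X, y)

# 3. Bayesian Optimization: uses results of past trials to pick the next ones
bayes = BayesSearchCV(
    model,
    search_spaces={"max_depth": (3, 7), "learning_rate": (0.01, 0.3, "log-uniform")},
    n_iter=10,
    cv=3,
    random_state=0,
).fit(X, y)

print(grid.best_params_, rand.best_params_, bayes.best_params_)
```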
The breakdown of the video is as follows:
00:00 Video Introduction
00:38 What are Hyperparameters?
02:25 Number of Weak Learners
03:40 Learning Rate
04:34 Maximum Depth
05:31 L1 Regularization
06:43 L2 Regularization
07:52 Methods for Hyperparameter Tuning
11:25 Start Jupyter Notebook
14:08 Grid Search
17:07 Randomized Search
20:14 Bayesian Optimization
22:29 Concluding Remarks
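For reference, the hyperparameters discussed in the video (number of weak learners, learning rate, maximum depth, L1 and L2 regularization) correspond to the following arguments in XGBoost's scikit-learn API. The values shown are placeholders, not recommendations:

```python
# Illustrative mapping of the video's hyperparameters to XGBoost arguments
from xgboost import XGBClassifier

model = XGBClassifier(
    n_estimators=100,   # number of weak learners (boosted trees)
    learning_rate=0.1,  # shrinkage applied to each tree's contribution
    max_depth=6,        # maximum depth of each tree
    reg_alpha=0.0,      # L1 regularization on leaf weights
    reg_lambda=1.0,     # L2 regularization on leaf weights
)
```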
The best way to keep up to date with my video/blog content is to sign up for my monthly newsletter! Please visit https://insidelearningmachines.com/newsletter/ to register.
The notebook presented here can be found at: https://github.com/insidelearningmachines/Blog/blob/main/3%20Methods%20for%20Hyperparameter%20Tuning%20with%20XGBoost.ipynb
The homepage of my blog is: https://insidelearningmachines.com
The home page of XGBoost is: https://xgboost.ai
Other social media includes:
Twitter: https://twitter.com/inside_machines
Facebook: https://www.facebook.com/Inside-Learning-Machines-112215488183517
#machinelearning #datascience #boosting #xgboost #insidelearningmachines