3 Methods for Hyperparameter Tuning with XGBoost

In this video we will cover 3 different methods for hyperparameter tuning in XGBoost. These include:

1. Grid Search
2. Randomized Search
3. Bayesian Optimization
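As a minimal sketch (not the notebook from the video), the difference between the first two methods can be shown in plain Python: grid search enumerates every combination of a parameter grid, while randomized search draws a fixed number of random combinations. The parameter names below follow XGBoost's scikit-learn wrapper; the candidate values are illustrative assumptions.

```python
import itertools
import random

# Candidate values for a few common XGBoost hyperparameters
# (names follow XGBoost's scikit-learn wrapper; values are illustrative).
param_grid = {
    "n_estimators": [100, 200, 500],
    "learning_rate": [0.01, 0.1, 0.3],
    "max_depth": [3, 5, 7],
}

def grid_search(grid):
    """Yield every combination of the grid (exhaustive search)."""
    keys = list(grid)
    for values in itertools.product(*(grid[k] for k in keys)):
        yield dict(zip(keys, values))

def randomized_search(grid, n_iter, seed=0):
    """Yield n_iter random combinations (one value drawn per parameter)."""
    rng = random.Random(seed)
    for _ in range(n_iter):
        yield {k: rng.choice(v) for k, v in grid.items()}

all_combos = list(grid_search(param_grid))            # 3 * 3 * 3 = 27 candidates
sampled = list(randomized_search(param_grid, n_iter=5))  # only 5 candidates

print(len(all_combos))  # 27
print(len(sampled))     # 5
```

In practice each candidate dictionary would be passed to a model and scored by cross-validation; randomized search trades exhaustiveness for a fixed, predictable budget, and Bayesian optimization goes one step further by using past scores to choose the next candidate.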

The breakdown of the video is as follows:

00:00 Video Introduction
00:38 What are Hyperparameters?
02:25 Number of Weak Learners
03:40 Learning Rate
04:34 Maximum Depth
05:31 L1 Regularization
06:43 L2 Regularization
07:52 Methods for Hyperparameter Tuning
11:25 Start Jupyter Notebook
14:08 Grid Search
17:07 Randomized Search
20:14 Bayesian Optimization
22:29 Concluding Remarks
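The five hyperparameters covered in the chapters above map onto XGBoost's scikit-learn API as sketched below. The values are typical starting points, not the exact settings used in the video.

```python
# Typical starting points for the hyperparameters discussed above; names are
# from XGBoost's scikit-learn wrapper (XGBClassifier / XGBRegressor).
params = {
    "n_estimators": 100,   # number of weak learners (boosted trees)
    "learning_rate": 0.1,  # shrinkage applied to each tree's contribution
    "max_depth": 6,        # maximum depth of each individual tree
    "reg_alpha": 0.0,      # L1 regularization on leaf weights
    "reg_lambda": 1.0,     # L2 regularization on leaf weights
}
print(sorted(params))
```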

The best way to keep up-to-date with my video/blog content is to sign up for my monthly Newsletter! Please visit: https://insidelearningmachines.com/ne... to register.

The notebook presented here can be found at: https://github.com/insidelearningmach...

The homepage of my blog is: https://insidelearningmachines.com

The home page of XGBoost is: https://xgboost.ai

Other social media includes:
Twitter: / inside_machines
Facebook: / inside-learning-machines-112215488183517

#machinelearning #datascience #boosting #xgboost #insidelearningmachines
