7.5 Gradient Boosting (L07: Ensemble Methods)


Sebastian's books: https://sebastianraschka.com/books/

In this video, we take the concept of boosting a step further and talk about gradient boosting. Whereas AdaBoost re-weights the training examples to boost the tree in the next round, gradient boosting uses the gradients of the loss to compute residuals, and the next tree in the sequence is fit on those residuals.
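To make the residual-fitting idea concrete, here is a minimal sketch (not the code from the linked repository) of gradient boosting for regression, assuming a squared-error loss so that the negative gradient is simply the residual y - F(x). The number of rounds, learning rate, and tree depth are illustrative choices, not values from the video.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)

n_rounds = 50          # number of boosting rounds (assumed value)
learning_rate = 0.1    # shrinkage applied to each tree's contribution

# Initialize the ensemble prediction with the mean target value
F = np.full_like(y, y.mean(), dtype=float)
trees = []

for _ in range(n_rounds):
    # Negative gradient of the squared-error loss = plain residuals
    residuals = y - F
    # Fit a shallow regression tree to the residuals
    tree = DecisionTreeRegressor(max_depth=2, random_state=0)
    tree.fit(X, residuals)
    # Add the (shrunken) tree prediction to the ensemble
    F += learning_rate * tree.predict(X)
    trees.append(tree)

print("Training MSE:", np.mean((y - F) ** 2))

With a different loss function, only the residual computation changes: each round fits the next tree to the negative gradient of that loss evaluated at the current predictions.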

XGBoost paper mentioned in the video: https://dl.acm.org/doi/pdf/10.1145/29...

Link to the code: https://github.com/rasbt/stat451-mach...

-------

This video is part of my Introduction to Machine Learning course.

Next video:    • 7.6 Random Forests (L07: Ensemble Met...  

The complete playlist:    • Intro to Machine Learning and Statist...  

A handy overview page with links to the materials: https://sebastianraschka.com/blog/202...

-------

If you want to be notified about future videos, please consider subscribing to my channel:    / sebastianraschka  
