XGBoost Made Easy | Extreme Gradient Boosting | AWS SageMaker

XGBoost has recently become the go-to algorithm for many developers, and it has won several Kaggle competitions.

Because it is an ensemble technique, XGBoost is very robust and works well with many data types and complex distributions.

XGBoost has many tunable hyperparameters that can improve model fit; a few of the common ones are sketched below.
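
For a rough sense of what these knobs look like, here is a minimal sketch using the xgboost Python package's scikit-learn wrapper. The parameter values are illustrative starting points, not recommendations from the video.

```python
# A minimal sketch of commonly tuned XGBoost hyperparameters,
# assuming the xgboost Python package (scikit-learn wrapper).
# All values below are illustrative, not tuned.
from xgboost import XGBClassifier

model = XGBClassifier(
    n_estimators=300,      # number of boosting rounds (trees)
    max_depth=4,           # depth of each tree; deeper trees fit more complex patterns
    learning_rate=0.1,     # shrinkage applied to each tree's contribution
    subsample=0.8,         # fraction of rows sampled per tree
    colsample_bytree=0.8,  # fraction of features sampled per tree
    reg_lambda=1.0,        # L2 regularization on leaf weights
    gamma=0.0,             # minimum loss reduction required to split a node
)
```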

XGBoost is an example of ensemble learning and works for both regression and classification tasks.
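
As a minimal sketch of both task types (assuming the xgboost package and synthetic scikit-learn data, which are not from the video):

```python
# XGBoost on a classification task and a regression task,
# using synthetic scikit-learn datasets for illustration.
from sklearn.datasets import make_classification, make_regression
from xgboost import XGBClassifier, XGBRegressor

# Classification: predict a binary label.
X_cls, y_cls = make_classification(n_samples=1000, n_features=20, random_state=0)
clf = XGBClassifier(n_estimators=100).fit(X_cls, y_cls)
print("classification accuracy:", clf.score(X_cls, y_cls))

# Regression: predict a continuous target.
X_reg, y_reg = make_regression(n_samples=1000, n_features=20, random_state=0)
reg = XGBRegressor(n_estimators=100).fit(X_reg, y_reg)
print("regression R^2:", reg.score(X_reg, y_reg))
```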

Ensemble techniques such as bagging and boosting can produce an extremely powerful model by combining a group of relatively weak or average learners.

For example, you can combine many decision trees to create a powerful random forest.
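
Here is a small sketch of that idea using scikit-learn (the dataset and settings are illustrative): a single decision tree versus a random forest built from many of them.

```python
# A bagging sketch: many decision trees combined into a random forest,
# compared against one tree. scikit-learn assumed; data is synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("single tree accuracy:", tree.score(X_te, y_te))
print("random forest accuracy:", forest.score(X_te, y_te))
```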

By combining votes from a pool of experts, each bringing its own experience and background to the problem, you get a better outcome.

Boosting mainly reduces bias, and with shrinkage, regularization, and early stopping it also keeps overfitting in check, making the model more robust.
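
As a sketch of one such safeguard, early stopping on a validation set might look like this with XGBoost's native API (the dataset and parameter values are illustrative):

```python
# Controlling overfitting with early stopping, using XGBoost's
# native API. All settings below are illustrative.
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

dtrain = xgb.DMatrix(X_tr, label=y_tr)
dval = xgb.DMatrix(X_val, label=y_val)

booster = xgb.train(
    {"objective": "binary:logistic", "eta": 0.1, "max_depth": 4},
    dtrain,
    num_boost_round=500,
    evals=[(dval, "validation")],
    early_stopping_rounds=20,  # stop when validation loss stops improving
    verbose_eval=False,
)
print("best iteration:", booster.best_iteration)
```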

I hope you will enjoy this video and find it useful and informative!

Thanks.


#xgboost #aws #sagemaker
