ENERGY Prediction: 36 Scikit Learn & XGBoost ML Models + XAI with SHAP // Hands-on Python Tutorial

#scikitlearn #xgboost #datascienceprojects
This is the 2nd video in a series on end-to-end Data Science projects with Machine Learning and Deep Learning. Sharpen your data science skills by building 36 regression machine learning models with the scikit-learn and XGBoost Python libraries in this end-to-end ML project on a Kaggle energy efficiency dataset of houses (heating load and cooling load), including AI explainability with the SHAP Python library. A hands-on tutorial and end-to-end data science project for learning AI algorithms such as Random Forest Regression and building your data science portfolio, with the full Python code included in a Google Colab notebook.
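
For readers who want a quick preview before opening the notebook, here is a minimal sketch of the data-loading step. The file name and the Heating_Load / Cooling_Load column names are assumptions for illustration only; the actual code is in the Colab notebook linked below.

import pandas as pd
from sklearn.model_selection import train_test_split

# Assumed file name for the Kaggle energy efficiency data (8 building
# features plus two targets: heating load and cooling load).
df = pd.read_csv("energy_efficiency.csv")

# Column names are placeholders; adjust them to the file you download.
X = df.drop(columns=["Heating_Load", "Cooling_Load"])
y = df["Heating_Load"]  # start with heating load as the target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
print(X_train.shape, X_test.shape)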

===========================
Get Access to My 20+ Years Experience in AI:
===========================
⚡️Free guide: https://www.maryammiradi.com/free-guide
⚡️AI Training: https://www.maryammiradi.com/training

===========================
Connect with Me!
===========================
LinkedIn ➡︎   / maryammiradi  

===========================
Mentioned in this Video
===========================
📚 Link to Python Code ➡︎
https://colab.research.google.com/dri...
📺 Data Science Projects ➡︎    • Data Science Projects End to End  

⏰ Timecodes ⏰

0:00 Introduction
0:10 Loading Data & Libraries in Google Colab for 8 Groups of scikit-learn Models (Generalized Linear Models, Support Vector Machines, Nearest Neighbors, Decision Trees and Ensemble Methods, Gaussian Processes, Naive Bayes, Discriminant Analysis and Neural Networks)
3:00 Explanation of Energy Efficiency
4:12 Data Exploration of Input Features and Target Features using pandas
5:19 Model Building and Comparison with 36 scikit-learn models: Multivariable Linear Regression, Ridge, Lasso, ElasticNet, LARS, LassoLARS, OMP, Bayesian Ridge, ARD Regression, SGD, Passive Aggressive Regressor, Huber, Quantile Regressor, Support Vector Regression, NuSVR, LinearSVR, K-Nearest Neighbors, Radius Neighbors, Decision Tree, Random Forest, Extra Trees Regressor, Gradient Boosting, AdaBoost, Bagging Regressor, Voting and Stacking Regressors, XGBoost Regressor, Gaussian Process Regressor, Naive Bayes (Gaussian, Multinomial, Bernoulli, Complement, Categorical), Discriminant Analysis (Linear, Quadratic) and MLP Regressor
15:16 Model Comparison using k-fold cross-validation for model selection (see the first code sketch after the timecodes)
20:48 Build the Final Model with the XGBoost Python Library
22:18 Test Results of the Final Model Trained with the XGBoost Algorithm
22:39 AI Explainability with SHAP Values (Global Explainability) (see the second code sketch after the timecodes)
30:16 Outro
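
A minimal sketch of the model-comparison step with k-fold cross-validation, shown for a small subset of the 36 regressors covered in the video. The model selection and the 5-fold setup are illustrative assumptions, and it continues from the X_train / y_train split in the sketch near the top of this description.

from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import KFold, cross_val_score
from xgboost import XGBRegressor

# A handful of the regressors compared in the video, for illustration only.
models = {
    "Linear": LinearRegression(),
    "Ridge": Ridge(),
    "Lasso": Lasso(),
    "Random Forest": RandomForestRegressor(random_state=42),
    "Gradient Boosting": GradientBoostingRegressor(random_state=42),
    "XGBoost": XGBRegressor(random_state=42),
}

# 5-fold cross-validation on the training split; higher mean R2 is better.
cv = KFold(n_splits=5, shuffle=True, random_state=42)
for name, model in models.items():
    scores = cross_val_score(model, X_train, y_train, cv=cv, scoring="r2")
    print(f"{name:>18}: R2 = {scores.mean():.3f} +/- {scores.std():.3f}")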
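
And a second sketch for the final XGBoost model plus SHAP global explainability, again continuing from the same train/test split (the n_estimators value is an assumption, not the tuned setting from the video).

from sklearn.metrics import mean_squared_error, r2_score
from xgboost import XGBRegressor
import shap

# Fit the final model on the training data.
final_model = XGBRegressor(n_estimators=300, random_state=42)
final_model.fit(X_train, y_train)

# Evaluate on the held-out test set.
y_pred = final_model.predict(X_test)
print("Test R2 :", r2_score(y_test, y_pred))
print("Test MSE:", mean_squared_error(y_test, y_pred))

# Global explainability: SHAP values show how each building feature pushes
# the predicted heating load up or down across the test set.
explainer = shap.TreeExplainer(final_model)
shap_values = explainer.shap_values(X_test)
shap.summary_plot(shap_values, X_test)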

__________

✍️ Leave any questions you have about AI & Data Science in the comments!

#ai #python #regression #machinelearning
