300 - Picking the best model and corresponding hyperparameters using Gridsearch

Code generated in the video can be downloaded from here:
https://github.com/bnsreenu/python_fo...

Picking the best model and corresponding hyperparameters
using cross validation inside a Gridsearch

The grid search provided by GridSearchCV exhaustively generates candidates
from a grid of parameter values specified with the param_grid parameter.
Example:
from sklearn.ensemble import RandomForestClassifier

param1 = {}
param1['classifier__n_estimators'] = [10, 50, 100, 250]
param1['classifier__max_depth'] = [5, 10, 20]
param1['classifier__class_weight'] = [None, {0:1,1:5}, {0:1,1:10}, {0:1,1:25}]
param1['classifier'] = [RandomForestClassifier(random_state=42)]
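A rough sketch of how this grid pairs with a model (not from the video): the 'classifier__' prefix assumes the grid is used with a scikit-learn Pipeline whose final step is named 'classifier'; the scaler step below is an assumption added for illustration.

from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# The final step is named 'classifier', so keys such as 'classifier__n_estimators'
# in param1 address the RandomForestClassifier inside the pipeline.
pipe = Pipeline([
    ('scaler', StandardScaler()),   # assumed preprocessing step
    ('classifier', RandomForestClassifier(random_state=42)),
])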

When the GridSearchCV instance is "fitted" on a dataset, all possible
combinations of parameter values are evaluated and the best combination is retained.

The cv parameter defines the cross-validation splitting strategy.
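A minimal sketch of the fit, assuming the pipe and param1 objects defined above; cv=5, scoring='accuracy' and n_jobs=-1 are assumed settings, and scikit-learn's built-in copy of the Wisconsin data is used here so the snippet runs on its own.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)   # built-in copy of the Wisconsin breast cancer data

grid = GridSearchCV(pipe, param_grid=param1, cv=5, scoring='accuracy', n_jobs=-1)
grid.fit(X, y)   # evaluates every combination in param1 with 5-fold cross validation

print(grid.best_params_)   # best parameter combination found
print(grid.best_score_)    # its mean cross-validated score
best_model = grid.best_estimator_   # pipeline refit on all data with the best parameters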

Wisconsin breast cancer example
Dataset link: https://www.kaggle.com/datasets/uciml...
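If the CSV is downloaded from Kaggle instead, loading might look like the sketch below; the file name 'data.csv' and the 'id'/'diagnosis' column names are assumptions about the downloaded file.

import pandas as pd

df = pd.read_csv('data.csv')                 # assumed file name of the Kaggle download
df = df.dropna(axis=1, how='all')            # drop any fully empty trailing column
y = (df['diagnosis'] == 'M').astype(int)     # malignant = 1, benign = 0
X = df.drop(columns=['id', 'diagnosis'])     # remaining columns are the features

These X and y can then be passed to grid.fit(X, y) exactly as above.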
