Pass a Scoring Function from sklearn.metrics to GridSearchCV
GridSearchCV's documentation states that I can pass a scoring function:

scoring : string, callable or None, default=None

I would like to use the native accuracy_score as that scoring function.
Solution 1:
It will work if you change scoring=accuracy_score to scoring='accuracy' (see the documentation for the full list of scorers you can use by name in this way).
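For example, a minimal sketch of the string-name form (the estimator, parameter grid, and generated data here are illustrative placeholders, not from the question):

from sklearn.model_selection import GridSearchCV
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import make_classification

X, y = make_classification(random_state=0)
# 'accuracy' names sklearn's built-in accuracy scorer
grid = GridSearchCV(LogisticRegression(max_iter=1000),
                    param_grid={'C': [0.1, 1, 10]},
                    scoring='accuracy', cv=5)
grid.fit(X, y)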
In theory, you should be able to pass custom scoring functions like you're trying, but my guess is that you're right and accuracy_score doesn't have the right API.
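If you do want to pass the metric function itself rather than its string name, wrapping it in make_scorer gives it the scorer interface GridSearchCV expects; in this case the result is equivalent to scoring='accuracy' (a sketch, not the questioner's code):

from sklearn.metrics import accuracy_score, make_scorer

# make_scorer turns a metric(y_true, y_pred) into a scorer(estimator, X, y)
accuracy_scorer = make_scorer(accuracy_score)
# then: GridSearchCV(estimator, param_grid, scoring=accuracy_scorer)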
Solution 2:
Here is an example of using weighted kappa as the scoring metric for GridSearchCV with a simple random forest model. The key lesson for me was to pass the metric's own parameters (here, weights="quadratic") through the make_scorer function.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.metrics import cohen_kappa_score, make_scorer

# Quadratically weighted Cohen's kappa as the scorer
kappa_scorer = make_scorer(cohen_kappa_score, weights="quadratic")

# Create the parameter grid based on the results of random search
param_grid = {
    'bootstrap': [True],
    'max_features': range(2, 10),  # try 2 to 9 features
    'min_samples_leaf': [3, 4, 5],
    'n_estimators': [100, 300, 500],
    'max_depth': [5]
}

# Create a base model
random_forest = RandomForestClassifier(class_weight="balanced_subsample", random_state=1)

# Instantiate the grid search model, scoring with weighted kappa
grid_search = GridSearchCV(estimator=random_forest, param_grid=param_grid,
                           cv=5, n_jobs=-1, verbose=2, scoring=kappa_scorer)

# Fit the grid search to the data
grid_search.fit(final_tr, yTrain)
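Once the fit finishes, the winning configuration and its cross-validated kappa can be read back from the standard GridSearchCV attributes (a short usage sketch):

print(grid_search.best_params_)   # best hyperparameter combination found
print(grid_search.best_score_)    # its mean cross-validated quadratic kappa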