Hyperparameter Tuning with Python by Louis Owen

Author: Louis Owen
Language: English
Format: epub
Publisher: Packt Publishing Pvt Ltd
Published: 2022-07-28


Implementing Coarse-to-Fine Search

Coarse-to-Fine Search (CFS) is part of the Multi-Fidelity Optimization group of methods and utilizes Grid Search and/or Random Search during the hyperparameter tuning process (see Chapter 6, Exploring Multi-Fidelity Optimization). Although CFS is not implemented directly in the sklearn package, you can find a custom implementation, the CoarseToFineSearchCV class, in the repo mentioned in the Technical Requirements section.

Let’s use the same example and hyperparameter space as in the Implementing Random Search section, to see how CoarseToFineSearchCV works in practice. Note that this implementation of CFS only utilizes Random Search and uses the top N percentiles scheme to define the promising subspace in each iteration, similar to the example shown in Chapter 6. However, you can edit the code based on your own preference since CFS is a very simple method with customizable modules.
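
Conceptually, each CFS iteration runs a Random Search trial on the current space, keeps the configurations whose scores fall in the top N percentiles, and shrinks every continuous hyperparameter's range to the interval spanned by those surviving configurations. The following is a minimal, hypothetical sketch of that narrowing step; the function name and signature are illustrative, not the actual API of the class in the repo:

# Hypothetical sketch of the CFS subspace-narrowing step: keep the
# top-N-percentile trials and shrink each continuous hyperparameter's
# range to the interval those survivors span.
import numpy as np
from scipy.stats import uniform

def narrow_space(results, space, continuous_hyperparams, top_n_percentile=50):
    # results: list of (params_dict, score) pairs from one Random Search run
    scores = np.array([score for _, score in results])
    threshold = np.percentile(scores, 100 - top_n_percentile)
    survivors = [params for params, score in results if score >= threshold]
    new_space = dict(space)
    for name in continuous_hyperparams:
        values = [params[name] for params in survivors]
        # The next Random Search iteration samples uniformly from the
        # promising interval found so far
        new_space[name] = uniform(loc=min(values), scale=max(values) - min(values))
    return new_space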

The following code shows you how to perform CFS with the CoarseToFineSearchCV class. It is worth noting that this class has very similar parameters to the RandomizedSearchCV class, with several additional parameters. The random_iters parameter controls the number of iterations for each random search trial, top_n_percentile controls the N value within the top N percentiles promising subspace definition (see Chapter 6), n_iter defines the number of CFS iterations to be performed, and continuous_hyperparams stores the list of continuous hyperparameters in the predefined space.
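
The walkthrough below assumes the pipeline, hyperparameter space, and data splits defined in the Implementing Random Search section are already in scope, and that CoarseToFineSearchCV has been imported from the repo. As a hypothetical reminder of what those objects look like (the exact preprocessing steps and distributions are defined earlier in the chapter and may differ), a representative setup might be:

# Hypothetical setup mirroring the Random Search example; the exact
# pipeline steps and distributions come from the earlier section.
from scipy.stats import randint, uniform
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.metrics import f1_score
from sklearn.pipeline import Pipeline

pipe = Pipeline([
    ('imputer', SimpleImputer(strategy='median')),
    ('model', RandomForestClassifier(random_state=0)),
])

hyperparameter_space = {
    'model__n_estimators': randint(5, 200),
    'model__criterion': ['gini', 'entropy'],
    'model__class_weight': ['balanced', 'balanced_subsample', None],
    'model__min_samples_split': uniform(0, 0.1),
}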

Instantiate the CoarseToFineSearchCV class:

clf = CoarseToFineSearchCV(pipe, hyperparameter_space,
                           random_iters=25, top_n_percentile=50,
                           n_iter=10,
                           continuous_hyperparams=['model__min_samples_split'],
                           random_state=0, scoring='f1', cv=5,
                           n_jobs=-1, refit=True)

Run the CoarseToFineSearchCV class:

clf.fit(X_train_full, y_train)

Print the best set of hyperparameters:

print(clf.best_params_, clf.best_score_)

Evaluate the final trained model on the test data:

y_pred = clf.predict(X_test_full)

print(f1_score(y_test, y_pred))

Based on the preceding code, we get an F1-score of around 0.561 when testing our final trained RF model with the best set of hyperparameters on the test set. The best set of hyperparameters is {'model__class_weight': 'balanced_subsample', 'model__criterion': 'entropy', 'model__min_samples_split': 0.005867409821769845, 'model__n_estimators': 106}, with a cross-validated objective function score of around 0.560.

In this section, we have learned how to implement CFS using a custom class built on top of sklearn, the CoarseToFineSearchCV class. In the next section, we will learn how to perform Successive Halving (SH) with sklearn.


