5.1. Exercises


Automated Hyperparameter Optimization in Action!

Instructions:
Running a coding exercise for the first time can take a little while as everything loads. Be patient; it may take a few minutes.

When you see ____ in a coding exercise, replace it with what you think is the correct code. Run it to see if you obtain the desired output, then submit your code to check whether you were correct.

Make sure you remove the hash (#) symbols in the coding portions of this question. We have commented out these lines so they won't execute, which lets you test your code after each step.

Now that we have built a pipeline in the previous interactive exercises, let's pair it with grid search to optimize our hyperparameters.
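For reference, here is a minimal sketch of the kind of pipeline assumed in this exercise. The step name "knn" matches the knn__ prefixes used in the hints below; your actual preprocessing steps, and whether the estimator is a classifier or a regressor, may differ from this sketch.

```python
# Assumed setup (a sketch only): a pipeline like the one built previously,
# with a scaling step and a KNN step named "knn". The train/test splits
# (X_train, y_train, X_test, y_test) are assumed to already exist.
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier  # or KNeighborsRegressor

bb_pipe = Pipeline(
    steps=[
        ("scaler", StandardScaler()),
        ("knn", KNeighborsClassifier()),
    ]
)
```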

Tasks:

  • Using the pipeline provided, create a grid of parameters to search over named param_grid. Search over the values 1, 5, 10, 20, 30, 40, and 50 for the hyperparameter n_neighbors and 'uniform' and 'distance' for the hyperparameter weights (make sure to name them appropriately for the pipeline).
  • Use GridSearchCV to tune the hyperparameters with 10-fold cross-validation. Make sure to specify the arguments verbose=1 and n_jobs=-1. Name the object grid_search.
  • Find the best hyperparameter values and save them in an object named best_hyperparams. Make sure to print these results.
  • Lastly, score your model on the test set and save your results in an object named bb_test_score.
Hints:
  • Are you specifying knn__n_neighbors and knn__weights in param_grid and specifying the hyperparameter values in a list?
  • Are you using GridSearchCV(bb_pipe, param_grid, cv=10, verbose=1, n_jobs=-1) and remembering to fit it?
  • Are you using grid_search.best_params_ to find the optimal hyperparameter values?
  • Are you using grid_search.score(X_test, y_test) to calculate your test score?
Fully worked solution:
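Below is a sketch of one possible solution, assuming the pipeline is named bb_pipe with a KNN step named "knn", and that X_train, y_train, X_test, and y_test are already defined from the previous exercises.

```python
# One possible solution (a sketch, assuming bb_pipe and the train/test
# splits already exist from the previous exercise).
from sklearn.model_selection import GridSearchCV

# Grid of hyperparameter values to search over.
# The "knn__" prefix targets the pipeline step named "knn".
param_grid = {
    "knn__n_neighbors": [1, 5, 10, 20, 30, 40, 50],
    "knn__weights": ["uniform", "distance"],
}

# Exhaustive grid search with 10-fold cross-validation.
grid_search = GridSearchCV(bb_pipe, param_grid, cv=10, verbose=1, n_jobs=-1)
grid_search.fit(X_train, y_train)

# Best hyperparameter values found during the search.
best_hyperparams = grid_search.best_params_
print(best_hyperparams)

# Score the refit best model on the test set.
bb_test_score = grid_search.score(X_test, y_test)
print(bb_test_score)
```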