"""1Title: Tailor the search space2Authors: Luca Invernizzi, James Long, Francois Chollet, Tom O'Malley, Haifeng Jin3Date created: 2019/05/314Last modified: 2021/10/275Description: Tune a subset of the hyperparameters without changing the hypermodel.6Accelerator: None7"""89"""shell10pip install keras-tuner -q11"""1213"""14In this guide, we will show how to tailor the search space without changing the15`HyperModel` code directly. For example, you can only tune some of the16hyperparameters and keep the rest fixed, or you can override the compile17arguments, like `optimizer`, `loss`, and `metrics`.1819## The default value of a hyperparameter2021Before we tailor the search space, it is important to know that every22hyperparameter has a default value. This default value is used as the23hyperparameter value when not tuning it during our tailoring the search space.2425Whenever you register a hyperparameter, you can use the `default` argument to26specify a default value:2728```python29hp.Int("units", min_value=32, max_value=128, step=32, default=64)30```3132If you don't, hyperparameters always have a default default (for `Int`, it is33equal to `min_value`).3435In the following model-building function, we specified the default value for36the `units` hyperparameter as 64.37"""3839import keras40from keras import layers41import keras_tuner42import numpy as np434445def build_model(hp):46model = keras.Sequential()47model.add(layers.Flatten())48model.add(49layers.Dense(50units=hp.Int("units", min_value=32, max_value=128, step=32, default=64)51)52)53if hp.Boolean("dropout"):54model.add(layers.Dropout(rate=0.25))55model.add(layers.Dense(units=10, activation="softmax"))56model.compile(57optimizer=keras.optimizers.Adam(58learning_rate=hp.Choice("learning_rate", values=[1e-2, 1e-3, 1e-4])59),60loss="sparse_categorical_crossentropy",61metrics=["accuracy"],62)63return model646566"""67We will reuse this search space in the rest of the tutorial by overriding the68hyperparameters without defining a new search space.6970## Search a few and fix the rest7172If you have an existing hypermodel, and you want to search over only a few73hyperparameters, and keep the rest fixed, you don't have to change the code in74the model-building function or the `HyperModel`. You can pass a75`HyperParameters` to the `hyperparameters` argument to the tuner constructor76with all the hyperparameters you want to tune. 
"""
We will reuse this search space in the rest of the tutorial by overriding the
hyperparameters without defining a new search space.

## Search a few and fix the rest

If you have an existing hypermodel, and you want to search over only a few
hyperparameters and keep the rest fixed, you don't have to change the code in
the model-building function or the `HyperModel`. You can pass a
`HyperParameters` object to the `hyperparameters` argument of the tuner
constructor with all the hyperparameters you want to tune. Specify
`tune_new_entries=False` to prevent it from tuning other hyperparameters, whose
default values will be used.

In the following example, we only tune the `learning_rate` hyperparameter, and
change its type and value range.
"""

hp = keras_tuner.HyperParameters()

# This will override the `learning_rate` parameter with your
# own range of values
hp.Float("learning_rate", min_value=1e-4, max_value=1e-2, sampling="log")

tuner = keras_tuner.RandomSearch(
    hypermodel=build_model,
    hyperparameters=hp,
    # Prevents unlisted parameters from being tuned
    tune_new_entries=False,
    objective="val_accuracy",
    max_trials=3,
    overwrite=True,
    directory="my_dir",
    project_name="search_a_few",
)

# Generate random data
x_train = np.random.rand(100, 28, 28, 1)
y_train = np.random.randint(0, 10, (100, 1))
x_val = np.random.rand(20, 28, 28, 1)
y_val = np.random.randint(0, 10, (20, 1))

# Run the search
tuner.search(x_train, y_train, epochs=1, validation_data=(x_val, y_val))

"""
If you summarize the search space, you will see only one hyperparameter.
"""

tuner.search_space_summary()

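
"""
After the search finishes, you can read back the best value found for the one
hyperparameter we tuned. A minimal sketch using `get_best_hyperparameters()`;
`units` and `dropout` were simply held at their defaults during this search.
"""

# Retrieve the hyperparameters of the best trial.
best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]
print(best_hp.get("learning_rate"))
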
"""
## Fix a few and tune the rest

In the example above, we showed how to tune only a few hyperparameters and keep
the rest fixed. You can also do the reverse: only fix a few hyperparameters
and tune all the rest.

In the following example, we fix the value of the `learning_rate`
hyperparameter. Pass a `hyperparameters` argument with a `Fixed` entry (or any
number of `Fixed` entries). Also remember to specify `tune_new_entries=True`,
which allows us to tune the rest of the hyperparameters.
"""

hp = keras_tuner.HyperParameters()
hp.Fixed("learning_rate", value=1e-4)

tuner = keras_tuner.RandomSearch(
    build_model,
    hyperparameters=hp,
    tune_new_entries=True,
    objective="val_accuracy",
    max_trials=3,
    overwrite=True,
    directory="my_dir",
    project_name="fix_a_few",
)

tuner.search(x_train, y_train, epochs=1, validation_data=(x_val, y_val))

"""
If you summarize the search space, you will see `learning_rate` is marked as
fixed, and the rest of the hyperparameters are being tuned.
"""

tuner.search_space_summary()

"""
## Overriding compilation arguments

If you have a hypermodel for which you want to change the existing optimizer,
loss, or metrics, you can do so by passing these arguments to the tuner
constructor:
"""

tuner = keras_tuner.RandomSearch(
    build_model,
    optimizer=keras.optimizers.Adam(1e-3),
    loss="mse",
    metrics=[
        "sparse_categorical_crossentropy",
    ],
    objective="val_loss",
    max_trials=3,
    overwrite=True,
    directory="my_dir",
    project_name="override_compile",
)

tuner.search(x_train, y_train, epochs=1, validation_data=(x_val, y_val))

"""
If you get the best model, you can see the loss function has changed to MSE.
"""

tuner.get_best_models()[0].loss

"""
## Tailor the search space of pre-built HyperModels

You can also use these techniques with the pre-built models in KerasTuner, like
`HyperResNet` or `HyperXception`. However, to see what hyperparameters are in
these pre-built `HyperModel`s, you will have to read the source code.

In the following example, we only tune the `learning_rate` of `HyperXception`
and fix all the rest of the hyperparameters. The default loss of
`HyperXception` is `categorical_crossentropy`, which expects one-hot encoded
labels and therefore doesn't match our raw integer labels. We need to change it
by overriding `loss` in the compile arguments with
`sparse_categorical_crossentropy`.
"""

hypermodel = keras_tuner.applications.HyperXception(input_shape=(28, 28, 1), classes=10)

hp = keras_tuner.HyperParameters()

# This will override the `learning_rate` parameter with your
# own selection of choices
hp.Choice("learning_rate", values=[1e-2, 1e-3, 1e-4])

tuner = keras_tuner.RandomSearch(
    hypermodel,
    hyperparameters=hp,
    # Prevents unlisted parameters from being tuned
    tune_new_entries=False,
    # Override the loss.
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
    objective="val_accuracy",
    max_trials=3,
    overwrite=True,
    directory="my_dir",
    project_name="helloworld",
)

# Run the search
tuner.search(x_train, y_train, epochs=1, validation_data=(x_val, y_val))
tuner.search_space_summary()
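
"""
The "fix a few and tune the rest" pattern also applies to the pre-built
hypermodels. A minimal sketch, pinning only `learning_rate` (the one
`HyperXception` hyperparameter we have already used) and letting the tuner
discover and tune everything else; the `fix_a_few_xception` project name is
just for illustration.
"""

hp = keras_tuner.HyperParameters()
hp.Fixed("learning_rate", value=1e-3)

tuner = keras_tuner.RandomSearch(
    hypermodel,
    hyperparameters=hp,
    # New entries found while building the model are added to the
    # search space and tuned.
    tune_new_entries=True,
    # Keep the loss override from above to match the integer labels.
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
    objective="val_accuracy",
    max_trials=1,
    overwrite=True,
    directory="my_dir",
    project_name="fix_a_few_xception",
)

tuner.search(x_train, y_train, epochs=1, validation_data=(x_val, y_val))

# After the first trial has built a model, the summary lists all of
# `HyperXception`'s hyperparameters, with `learning_rate` marked as fixed.
tuner.search_space_summary()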