"""
Title: Tailor the search space
Authors: Luca Invernizzi, James Long, Francois Chollet, Tom O'Malley, Haifeng Jin
Date created: 2019/05/31
Last modified: 2021/10/27
Description: Tune a subset of the hyperparameters without changing the hypermodel.
Accelerator: None
"""
9
10
"""shell
11
pip install keras-tuner -q
12
"""
13
14
"""
In this guide, we will show how to tailor the search space without changing the
`HyperModel` code directly. For example, you can only tune some of the
hyperparameters and keep the rest fixed, or you can override the compile
arguments, like `optimizer`, `loss`, and `metrics`.

## The default value of a hyperparameter

Before we tailor the search space, it is important to know that every
hyperparameter has a default value. This default value is used as the
hyperparameter's value whenever that hyperparameter is not being tuned, which
is what happens when we tailor the search space below.

Whenever you register a hyperparameter, you can use the `default` argument to
specify a default value:

```python
hp.Int("units", min_value=32, max_value=128, step=32, default=64)
```

If you don't, hyperparameters always have a default default (for `Int`, it is
equal to `min_value`).
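
You can see this for yourself: registering a hyperparameter on a plain
`HyperParameters` object returns its default value (a quick check we added
here, not part of the original guide):

```python
hp = keras_tuner.HyperParameters()
# Returns the default since nothing is being tuned: 64.
print(hp.Int("units", min_value=32, max_value=128, step=32, default=64))
# Without `default`, the "default default" for an `Int` is `min_value`: 32.
print(hp.Int("other_units", min_value=32, max_value=128, step=32))
```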

In the following model-building function, we specified the default value for
the `units` hyperparameter as 64.
"""

import keras
from keras import layers
import keras_tuner
import numpy as np


def build_model(hp):
    model = keras.Sequential()
    model.add(layers.Flatten())
    model.add(
        layers.Dense(
            units=hp.Int("units", min_value=32, max_value=128, step=32, default=64)
        )
    )
    if hp.Boolean("dropout"):
        model.add(layers.Dropout(rate=0.25))
    model.add(layers.Dense(units=10, activation="softmax"))
    model.compile(
        optimizer=keras.optimizers.Adam(
            learning_rate=hp.Choice("learning_rate", values=[1e-2, 1e-3, 1e-4])
        ),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model
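

"""
As a quick sanity check (our addition, not part of the original guide), calling
the function with an empty `HyperParameters` object builds the model with every
hyperparameter at its default value: `units=64`, no dropout (the `Boolean`
default is `False`), and a learning rate of `1e-2` (the first `Choice` value).
"""

build_model(keras_tuner.HyperParameters())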

"""
We will reuse this search space in the rest of the tutorial by overriding the
hyperparameters without defining a new search space.

## Search a few and fix the rest

If you have an existing hypermodel, and you want to search over only a few
hyperparameters while keeping the rest fixed, you don't have to change the code
in the model-building function or the `HyperModel`. You can pass a
`HyperParameters` object to the `hyperparameters` argument of the tuner
constructor with all the hyperparameters you want to tune. Specify
`tune_new_entries=False` to prevent it from tuning other hyperparameters; their
default values will be used instead.

In the following example, we only tune the `learning_rate` hyperparameter, and
change its type and value range.
"""

hp = keras_tuner.HyperParameters()

# This will override the `learning_rate` parameter with your
# own range of values
hp.Float("learning_rate", min_value=1e-4, max_value=1e-2, sampling="log")

tuner = keras_tuner.RandomSearch(
    hypermodel=build_model,
    hyperparameters=hp,
    # Prevents unlisted parameters from being tuned
    tune_new_entries=False,
    objective="val_accuracy",
    max_trials=3,
    overwrite=True,
    directory="my_dir",
    project_name="search_a_few",
)

# Generate random data
x_train = np.random.rand(100, 28, 28, 1)
y_train = np.random.randint(0, 10, (100, 1))
x_val = np.random.rand(20, 28, 28, 1)
y_val = np.random.randint(0, 10, (20, 1))

# Run the search
tuner.search(x_train, y_train, epochs=1, validation_data=(x_val, y_val))

"""
If you summarize the search space, you will see only one hyperparameter.
"""

tuner.search_space_summary()
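
"""
After the search, you can also retrieve the best value found for the single
tuned hyperparameter (a short addition of ours, not in the original guide):
"""

best_hp = tuner.get_best_hyperparameters()[0]
print(best_hp.get("learning_rate"))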

"""
## Fix a few and tune the rest

In the example above we showed how to tune only a few hyperparameters and keep
the rest fixed. You can also do the reverse: only fix a few hyperparameters
and tune all the rest.

In the following example, we fix the value of the `learning_rate`
hyperparameter. Pass a `hyperparameters` argument with a `Fixed` entry (or any
number of `Fixed` entries). Also remember to specify `tune_new_entries=True`,
which allows us to tune the rest of the hyperparameters.
"""

hp = keras_tuner.HyperParameters()
hp.Fixed("learning_rate", value=1e-4)

tuner = keras_tuner.RandomSearch(
    build_model,
    hyperparameters=hp,
    tune_new_entries=True,
    objective="val_accuracy",
    max_trials=3,
    overwrite=True,
    directory="my_dir",
    project_name="fix_a_few",
)

tuner.search(x_train, y_train, epochs=1, validation_data=(x_val, y_val))

"""
If you summarize the search space, you will see the `learning_rate` is marked
as fixed, and the rest of the hyperparameters are being tuned.
"""

tuner.search_space_summary()
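
"""
Because `learning_rate` was registered as `Fixed`, every trial uses the same
value; you can verify this on the best hyperparameters (our addition, not part
of the original guide):
"""

best_hp = tuner.get_best_hyperparameters()[0]
assert best_hp.get("learning_rate") == 1e-4  # the Fixed value is always used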

"""
## Overriding compilation arguments

If you have a hypermodel for which you want to change the existing optimizer,
loss, or metrics, you can do so by passing these arguments to the tuner
constructor:
"""

tuner = keras_tuner.RandomSearch(
    build_model,
    optimizer=keras.optimizers.Adam(1e-3),
    loss="mse",
    metrics=[
        "sparse_categorical_crossentropy",
    ],
    objective="val_loss",
    max_trials=3,
    overwrite=True,
    directory="my_dir",
    project_name="override_compile",
)

tuner.search(x_train, y_train, epochs=1, validation_data=(x_val, y_val))

"""
If you get the best model, you can see the loss function has changed to MSE.
"""

tuner.get_best_models()[0].loss
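
"""
The overridden optimizer is applied in the same way; for example, you can
inspect the learning rate on the retrieved model (a quick check we added, not
part of the original guide):
"""

best_model = tuner.get_best_models()[0]
print(best_model.optimizer.learning_rate)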

"""
## Tailor the search space of pre-built HyperModels

You can also use these techniques with the pre-built models in KerasTuner, like
`HyperResNet` or `HyperXception`. However, to see what hyperparameters are in
these pre-built `HyperModel`s, you will have to read the source code.

In the following example, we only tune the `learning_rate` of `HyperXception`
and fix all the rest of the hyperparameters. The default loss of
`HyperXception` is `categorical_crossentropy`, which expects the labels to be
one-hot encoded; that doesn't match our raw integer label data, so we change
the loss by overriding it in the compile arguments with
`sparse_categorical_crossentropy`.
"""

hypermodel = keras_tuner.applications.HyperXception(input_shape=(28, 28, 1), classes=10)
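
"""
If you would rather not read the source, one way to list the available
hyperparameters (a sketch we added, not part of the original guide) is to build
the hypermodel once with a throwaway `HyperParameters` object and inspect what
was registered:
"""

hp_probe = keras_tuner.HyperParameters()
hypermodel.build(hp_probe)  # registers every hyperparameter at its default
print(hp_probe.values)  # dict of hyperparameter names to default values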

hp = keras_tuner.HyperParameters()

# This will override the `learning_rate` parameter with your
# own selection of choices
hp.Choice("learning_rate", values=[1e-2, 1e-3, 1e-4])

tuner = keras_tuner.RandomSearch(
    hypermodel,
    hyperparameters=hp,
    # Prevents unlisted parameters from being tuned
    tune_new_entries=False,
    # Override the loss.
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
    objective="val_accuracy",
    max_trials=3,
    overwrite=True,
    directory="my_dir",
    project_name="helloworld",
)

# Run the search
tuner.search(x_train, y_train, epochs=1, validation_data=(x_val, y_val))
tuner.search_space_summary()