
# Optimizers

## Available optimizers

{{toc}}


## Usage with `compile()` & `fit()`

An optimizer is one of the two arguments required for compiling a Keras model:

```python
import keras
from keras import layers

model = keras.Sequential()
model.add(layers.Dense(64, kernel_initializer='uniform', input_shape=(10,)))
model.add(layers.Activation('softmax'))

opt = keras.optimizers.Adam(learning_rate=0.01)
model.compile(loss='categorical_crossentropy', optimizer=opt)
```

You can either instantiate an optimizer before passing it to `model.compile()`, as in the above example, or you can pass it by its string identifier. In the latter case, the default parameters for the optimizer will be used.

```python
# Pass optimizer by name: default parameters will be used
model.compile(loss='categorical_crossentropy', optimizer='adam')
```
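
If you want to check which defaults you got, the instance created from the string identifier is available on the compiled model. A minimal sketch, assuming Adam's usual default learning rate:

```python
# The string identifier above resolves to an Adam instance with default
# hyperparameters; it can be inspected after compiling.
print(type(model.optimizer))                  # e.g. keras.optimizers.Adam
print(float(model.optimizer.learning_rate))   # typically 0.001 by default
```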

## Learning rate decay / scheduling

You can use a learning rate schedule to modulate how the learning rate of your optimizer changes over time:

```python
lr_schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-2,
    decay_steps=10000,
    decay_rate=0.9)
optimizer = keras.optimizers.SGD(learning_rate=lr_schedule)
```

Check out the learning rate schedule API documentation for a list of available schedules.
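
For reference, the `ExponentialDecay` schedule above decays the rate roughly as `initial_learning_rate * decay_rate ** (step / decay_steps)`. Other built-in schedules are used the same way; as a minimal sketch with illustrative boundaries and values, here is `PiecewiseConstantDecay`:

```python
import keras

# Illustrative values: 1e-2 for the first 1000 steps, 1e-3 until step 10000,
# then 1e-4 for the remainder of training.
lr_schedule = keras.optimizers.schedules.PiecewiseConstantDecay(
    boundaries=[1000, 10000],
    values=[1e-2, 1e-3, 1e-4])
optimizer = keras.optimizers.Adam(learning_rate=lr_schedule)
```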


## Base Optimizer API

These methods and attributes are common to all Keras optimizers.
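
As a brief illustration of that shared surface (a sketch assuming the `get_config()` method and the `learning_rate` attribute exposed by the built-in optimizers):

```python
import keras

opt = keras.optimizers.Adam(learning_rate=0.01)

# get_config() returns a serializable dict of the optimizer's hyperparameters.
config = opt.get_config()

# The learning rate can be read or reassigned directly, e.g. for manual
# adjustment in a custom training loop.
opt.learning_rate = 0.005
print(float(opt.learning_rate))
```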

{{autogenerated}}