Copyright 2020 The TensorFlow Authors.
TensorFlow Addons Optimizers: LazyAdam
Overview
This notebook demonstrates how to use the LazyAdam optimizer from the Addons package.
LazyAdam
LazyAdam is a variant of the Adam optimizer that handles sparse updates more efficiently. The original Adam algorithm maintains two moving-average accumulators for each trainable variable; the accumulators are updated at every step. This class provides lazier handling of gradient updates for sparse variables. It only updates moving-average accumulators for sparse variable indices that appear in the current batch, rather than updating the accumulators for all indices. Compared with the original Adam optimizer, it can provide large improvements in model training throughput for some applications. However, it provides slightly different semantics than the original Adam algorithm, and may lead to different empirical results.
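To make the sparse-update behavior concrete, the sketch below pairs LazyAdam with an embedding layer, whose gradients arrive as tf.IndexedSlices covering only the rows looked up in the current batch. The vocabulary size, layer shapes, and learning rate here are illustrative assumptions, not values from this tutorial.

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Embedding lookups produce sparse gradients (tf.IndexedSlices): only the
# rows used in a batch receive gradients. LazyAdam updates its moving-average
# accumulators only for those rows, whereas standard Adam touches every row.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=10000, output_dim=16),  # hypothetical vocab/dim
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])

model.compile(optimizer=tfa.optimizers.LazyAdam(learning_rate=1e-3),
              loss='binary_crossentropy')
```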
Setup
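A minimal setup sketch: install TensorFlow Addons and import the packages used in the rest of the notebook (the pip invocation assumes a notebook environment).

```python
!pip install -U tensorflow-addons

import tensorflow as tf
import tensorflow_addons as tfa
```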
Build the Model
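One possible model for this example: a small feed-forward classifier over flattened 28x28 images. The layer sizes and names are illustrative choices.

```python
# A simple MLP: two hidden layers followed by a 10-way softmax.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, input_shape=(784,), activation='relu', name='dense_1'),
    tf.keras.layers.Dense(64, activation='relu', name='dense_2'),
    tf.keras.layers.Dense(10, activation='softmax', name='predictions'),
])
```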
Prepare the Data
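A sketch of one way to prepare the data, assuming MNIST loaded through tf.keras.datasets: flatten each image to a 784-dimensional vector and scale pixel values to [0, 1].

```python
# Load MNIST as NumPy arrays.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()

# Flatten 28x28 images and scale pixel values to [0, 1].
x_train = x_train.reshape(-1, 784).astype('float32') / 255.0
x_test = x_test.reshape(-1, 784).astype('float32') / 255.0
```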
Train and Evaluate
Simply replace the typical Keras optimizer with the TFA optimizer, as in the sketch below.
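A compile/fit/evaluate sketch assuming the model and data defined above; tfa.optimizers.LazyAdam is passed where tf.keras.optimizers.Adam would normally go. The batch size, epoch count, and learning rate are illustrative.

```python
model.compile(
    optimizer=tfa.optimizers.LazyAdam(learning_rate=0.001),  # drop-in replacement for Adam
    loss=tf.keras.losses.SparseCategoricalCrossentropy(),
    metrics=['accuracy'])

history = model.fit(x_train, y_train, batch_size=64, epochs=10)

# Evaluate on held-out test data.
results = model.evaluate(x_test, y_test, batch_size=128, verbose=2)
print('Test loss = {}, test accuracy = {}'.format(results[0], results[1]))
```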