Path: blob/master/second_edition/chapter07_working-with-keras.ipynb
This is a companion notebook for the book Deep Learning with Python, Second Edition. For readability, it only contains runnable code blocks and section titles, and omits everything else in the book: text paragraphs, figures, and pseudocode.
If you want to be able to follow what's going on, I recommend reading the notebook side by side with your copy of the book.
This notebook was generated for TensorFlow 2.6.
Working with Keras: A deep dive
A spectrum of workflows
Different ways to build Keras models
The Sequential model
The Sequential class
Incrementally building a Sequential model
Calling a model for the first time to build it
The summary method
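The code cells for these titles aren't preserved here. A minimal sketch of what they might look like, assuming the tf.keras API (layer sizes are illustrative):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Build a Sequential model incrementally by adding layers one at a time.
model = keras.Sequential()
model.add(layers.Dense(64, activation="relu"))
model.add(layers.Dense(10, activation="softmax"))

# The model has no weights yet: they are only created once the input
# shape is known. Calling build() (or calling the model on a batch of
# data) triggers weight creation.
model.build(input_shape=(None, 3))

# Once built, summary() can print the architecture and parameter counts.
model.summary()
```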
Naming models and layers with the name argument
Specifying the input shape of your model in advance
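A sketch of the two ideas above, assuming the tf.keras API: passing a `name` argument, and declaring the input shape up front with `keras.Input` so the model is built immediately (the model name is illustrative):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Declaring the input shape via keras.Input lets the model create its
# weights right away, so summary() works after every add() call.
model = keras.Sequential(name="my_example_model")
model.add(keras.Input(shape=(3,)))
model.add(layers.Dense(64, activation="relu", name="first_layer"))
model.summary()  # already usable: the model is built
```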
The Functional API
A simple example
A simple Functional model with two Dense layers
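A minimal Functional-API sketch, assuming tf.keras (sizes are illustrative): start from a symbolic `Input`, chain layer calls, then wrap the inputs and outputs in a `Model`.

```python
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(3,), name="my_input")
features = layers.Dense(64, activation="relu")(inputs)
outputs = layers.Dense(10, activation="softmax")(features)
model = keras.Model(inputs=inputs, outputs=outputs)
```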
Multi-input, multi-output models
A multi-input, multi-output Functional model
Training a multi-input, multi-output model
Training a model by providing lists of input & target arrays
Training a model by providing dicts of input & target arrays
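A sketch of a multi-input, multi-output model and the two calling conventions for training it, assuming tf.keras (input/output names and sizes are illustrative, and the data is random dummy data):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

vocabulary_size, num_tags, num_departments = 100, 10, 4

# Three named inputs feeding shared intermediate features...
title = keras.Input(shape=(vocabulary_size,), name="title")
text_body = keras.Input(shape=(vocabulary_size,), name="text_body")
tags = keras.Input(shape=(num_tags,), name="tags")

features = layers.Concatenate()([title, text_body, tags])
features = layers.Dense(64, activation="relu")(features)

# ...and two named output heads.
priority = layers.Dense(1, activation="sigmoid", name="priority")(features)
department = layers.Dense(num_departments, activation="softmax",
                          name="department")(features)

model = keras.Model(inputs=[title, text_body, tags],
                    outputs=[priority, department])

num_samples = 32
title_data = np.random.randint(0, 2, size=(num_samples, vocabulary_size))
text_body_data = np.random.randint(0, 2, size=(num_samples, vocabulary_size))
tags_data = np.random.randint(0, 2, size=(num_samples, num_tags))
priority_data = np.random.random(size=(num_samples, 1))
department_data = np.random.randint(0, 2, size=(num_samples, num_departments))

# Option 1: pass inputs and targets as lists, in declaration order.
model.compile(optimizer="rmsprop",
              loss=["mean_squared_error", "categorical_crossentropy"])
model.fit([title_data, text_body_data, tags_data],
          [priority_data, department_data], epochs=1, verbose=0)

# Option 2: pass dicts keyed by the input and output names.
model.compile(optimizer="rmsprop",
              loss={"priority": "mean_squared_error",
                    "department": "categorical_crossentropy"})
model.fit({"title": title_data, "text_body": text_body_data,
           "tags": tags_data},
          {"priority": priority_data, "department": department_data},
          epochs=1, verbose=0)
```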
The power of the Functional API: Access to layer connectivity
Retrieving the inputs or outputs of a layer in a Functional model
Creating a new model by reusing intermediate layer outputs
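A sketch of what layer-connectivity inspection might look like, assuming tf.keras: every layer in a Functional model exposes its symbolic input and output, and those intermediate tensors can seed a new model.

```python
from tensorflow import keras
from tensorflow.keras import layers

# A small Functional model to inspect.
inputs = keras.Input(shape=(3,))
features = layers.Dense(64, activation="relu")(inputs)
outputs = layers.Dense(10, activation="softmax")(features)
model = keras.Model(inputs=inputs, outputs=outputs)

# layers[0] is the InputLayer; layers[1] is the first Dense layer.
first_dense = model.layers[1]
print(first_dense.input)   # the symbolic tensor feeding this layer
print(first_dense.output)  # the symbolic tensor it produces

# Because connectivity is recorded, an intermediate output can be
# reused to build a new model, e.g. a feature extractor.
feature_extractor = keras.Model(inputs=model.input,
                                outputs=first_dense.output)
```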
Subclassing the Model class
Rewriting our previous example as a subclassed model
A simple subclassed model
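A sketch of Model subclassing, assuming tf.keras (`SimpleClassifier` is a hypothetical name): layers are created in `__init__`, and the forward pass is defined imperatively in `call()`.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

class SimpleClassifier(keras.Model):
    def __init__(self, num_classes=10):
        super().__init__()
        self.dense1 = layers.Dense(64, activation="relu")
        self.dense2 = layers.Dense(num_classes, activation="softmax")

    def call(self, inputs):
        features = self.dense1(inputs)
        return self.dense2(features)

model = SimpleClassifier(num_classes=10)
# Weights are created lazily, on the first call.
predictions = model(np.random.random((4, 3)).astype("float32"))
```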
Beware: What subclassed models don't support
Mixing and matching different components
Creating a Functional model that includes a subclassed model
Creating a subclassed model that includes a Functional model
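A sketch of both directions of mixing, assuming tf.keras (class names and sizes are illustrative): a subclassed model used as a layer inside a Functional model, and a Functional model called inside a subclassed model's forward pass.

```python
from tensorflow import keras
from tensorflow.keras import layers

# A subclassed model can be called on symbolic tensors, so it can sit
# inside a Functional graph like any layer.
class Classifier(keras.Model):
    def __init__(self, num_classes):
        super().__init__()
        activation = "softmax" if num_classes > 1 else "sigmoid"
        self.dense = layers.Dense(num_classes, activation=activation)

    def call(self, inputs):
        return self.dense(inputs)

inputs = keras.Input(shape=(3,))
features = layers.Dense(64, activation="relu")(inputs)
outputs = Classifier(num_classes=10)(features)
functional_model = keras.Model(inputs=inputs, outputs=outputs)

# Conversely, a Functional model is callable, so a subclassed model can
# use it as a component in call().
head_inputs = keras.Input(shape=(64,))
head_outputs = layers.Dense(1, activation="sigmoid")(head_inputs)
binary_head = keras.Model(head_inputs, head_outputs)

class MyModel(keras.Model):
    def __init__(self):
        super().__init__()
        self.dense = layers.Dense(64, activation="relu")
        self.classifier = binary_head

    def call(self, inputs):
        features = self.dense(inputs)
        return self.classifier(features)

subclassed_model = MyModel()
```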
Remember: Use the right tool for the job
Using built-in training and evaluation loops
The standard workflow: compile(), fit(), evaluate(), predict()
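The four steps of the standard workflow, sketched on random dummy data (assuming tf.keras; model and hyperparameters are illustrative):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(3,))
features = layers.Dense(16, activation="relu")(inputs)
outputs = layers.Dense(1, activation="sigmoid")(features)
model = keras.Model(inputs, outputs)

x = np.random.random((64, 3))
y = np.random.randint(0, 2, size=(64, 1))

# compile(): choose optimizer, loss, and metrics.
model.compile(optimizer="rmsprop",
              loss="binary_crossentropy",
              metrics=["accuracy"])
# fit(): run the training loop.
history = model.fit(x, y, epochs=2, batch_size=16,
                    validation_split=0.25, verbose=0)
# evaluate(): compute loss and metrics on data.
loss_and_metrics = model.evaluate(x, y, verbose=0)
# predict(): get raw model outputs.
predictions = model.predict(x, verbose=0)
```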
Writing your own metrics
Implementing a custom metric by subclassing the Metric class
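A sketch of a custom metric, assuming tf.keras: subclass `keras.metrics.Metric`, accumulate per-batch statistics in state variables via `update_state()`, compute the final value in `result()`, and clear the state in `reset_state()`. Root-mean-squared-error is used as the example here.

```python
import tensorflow as tf
from tensorflow import keras

class RootMeanSquaredError(keras.metrics.Metric):
    def __init__(self, name="rmse", **kwargs):
        super().__init__(name=name, **kwargs)
        # State variables, created with add_weight().
        self.mse_sum = self.add_weight(name="mse_sum", initializer="zeros")
        self.total_samples = self.add_weight(
            name="total_samples", initializer="zeros")

    def update_state(self, y_true, y_pred, sample_weight=None):
        # Accumulate the sum of squared errors and the sample count.
        y_true = tf.cast(y_true, y_pred.dtype)
        self.mse_sum.assign_add(tf.reduce_sum(tf.square(y_true - y_pred)))
        num_samples = tf.cast(tf.shape(y_pred)[0], self.mse_sum.dtype)
        self.total_samples.assign_add(num_samples)

    def result(self):
        return tf.sqrt(self.mse_sum / self.total_samples)

    def reset_state(self):
        self.mse_sum.assign(0.0)
        self.total_samples.assign(0.0)
```

An instance of this class can then be passed in the `metrics` list of `compile()` like any built-in metric.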
Using callbacks
The EarlyStopping and ModelCheckpoint callbacks
Using the callbacks argument in the fit() method
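A sketch of the two callbacks in use, assuming tf.keras (the checkpoint filename and the tiny dummy model are illustrative): `EarlyStopping` interrupts training once the monitored metric stops improving, and `ModelCheckpoint` saves the model, here only when it is the best seen so far.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

callbacks_list = [
    keras.callbacks.EarlyStopping(monitor="val_loss", patience=2),
    keras.callbacks.ModelCheckpoint(filepath="checkpoint.keras",
                                    monitor="val_loss",
                                    save_best_only=True),
]

inputs = keras.Input(shape=(3,))
outputs = layers.Dense(1)(inputs)
model = keras.Model(inputs, outputs)
model.compile(optimizer="rmsprop", loss="mean_squared_error")

x = np.random.random((64, 3))
y = np.random.random((64, 1))
# The callbacks list is passed straight to fit(); note that monitoring
# val_loss requires validation data.
history = model.fit(x, y, epochs=3, validation_split=0.25,
                    callbacks=callbacks_list, verbose=0)
```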
Writing your own callbacks
Creating a custom callback by subclassing the Callback class
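A sketch of a custom callback, assuming tf.keras: subclass `keras.callbacks.Callback` and override any of the `on_(train|epoch|batch)_(begin|end)` hooks; inside the hooks, `self.model` gives access to the model being trained. This illustrative one records per-batch loss values.

```python
from tensorflow import keras

class LossHistory(keras.callbacks.Callback):
    def on_train_begin(self, logs=None):
        # Called once when fit() starts: initialize the storage.
        self.per_batch_losses = []

    def on_train_batch_end(self, batch, logs=None):
        # Called after every batch: the logs dict holds current values.
        self.per_batch_losses.append(logs.get("loss"))
```

An instance goes into the same `callbacks` list of `fit()` as the built-in callbacks.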
Monitoring and visualization with TensorBoard
Writing your own training and evaluation loops
Training versus inference
Low-level usage of metrics
A complete training and evaluation loop
Writing a step-by-step training loop: the training step function
Writing a step-by-step training loop: resetting the metrics
Writing a step-by-step training loop: the loop itself
Writing a step-by-step evaluation loop
Make it fast with tf.function
Adding a tf.function decorator to our evaluation-step function
Leveraging fit() with a custom training loop
Implementing a custom training step to use with fit()
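A sketch of the hybrid approach, assuming tf.keras (`CustomModel` and the loss/metric setup are illustrative): overriding `train_step()` keeps all of `fit()`'s conveniences (callbacks, progress bars, tf.data support) while you control what happens on each batch.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

loss_fn = keras.losses.MeanSquaredError()
loss_tracker = keras.metrics.Mean(name="loss")

class CustomModel(keras.Model):
    def train_step(self, data):
        x, y = data
        with tf.GradientTape() as tape:
            predictions = self(x, training=True)
            loss = loss_fn(y, predictions)
        gradients = tape.gradient(loss, self.trainable_weights)
        self.optimizer.apply_gradients(zip(gradients, self.trainable_weights))
        loss_tracker.update_state(loss)
        # Whatever this dict returns is what fit() displays and logs.
        return {"loss": loss_tracker.result()}

    @property
    def metrics(self):
        # Metrics listed here are reset automatically at each epoch start.
        return [loss_tracker]

inputs = keras.Input(shape=(3,))
outputs = layers.Dense(1)(inputs)
model = CustomModel(inputs, outputs)
model.compile(optimizer=keras.optimizers.RMSprop())
history = model.fit(np.random.random((32, 3)),
                    np.random.random((32, 1)), epochs=2, verbose=0)
```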