Path: blob/master/chapter03_introduction-to-ml-frameworks.ipynb
Kernel: Python 3
This is a companion notebook for the book Deep Learning with Python, Third Edition. For readability, it only contains runnable code blocks and section titles, and omits everything else in the book: text paragraphs, figures, and pseudocode.
If you want to be able to follow what's going on, I recommend reading the notebook side by side with your copy of the book.
The book's contents are available online at deeplearningwithpython.io.
Introduction to TensorFlow, PyTorch, JAX, and Keras
A brief history of deep learning frameworks
How these frameworks relate to each other
Introduction to TensorFlow
First steps with TensorFlow
Tensors and variables in TensorFlow
Constant tensors
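The book's own cells are not reproduced here; a minimal sketch of constant-tensor creation in TensorFlow (illustrative values):

```python
import tensorflow as tf

# All-ones and all-zeros tensors, analogous to np.ones / np.zeros
x = tf.ones(shape=(2, 1))
y = tf.zeros(shape=(2, 1))

# A constant tensor built from explicit values
z = tf.constant([[1.0, 2.0], [3.0, 4.0]])
```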
Random tensors
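A sketch of the two most common random-tensor constructors (parameter values are illustrative):

```python
import tensorflow as tf

# Samples from a normal distribution with mean 0 and stddev 1
n = tf.random.normal(shape=(3, 1), mean=0.0, stddev=1.0)

# Samples from a uniform distribution over [0, 1)
u = tf.random.uniform(shape=(3, 1), minval=0.0, maxval=1.0)
```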
Tensor assignment and the Variable class
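One way to illustrate the point of this section: constant tensors cannot be written to, so mutable model state lives in `tf.Variable`, updated via the `assign` family of methods (a sketch, not the book's cells):

```python
import tensorflow as tf

# Constant tensors are immutable; mutable state goes in a tf.Variable
v = tf.Variable(initial_value=tf.random.normal(shape=(3, 1)))

v.assign(tf.ones((3, 1)))        # replace the full value in place
v.assign_add(tf.ones((3, 1)))    # in-place += equivalent
v[0, 0].assign(5.0)              # a single entry can be assigned too
```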
Tensor operations: Doing math in TensorFlow
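A few representative tensor operations, sketched with trivially checkable inputs:

```python
import tensorflow as tf

a = tf.ones((2, 2))
b = tf.square(a)        # element-wise square
c = tf.sqrt(a)          # element-wise square root
d = b + c               # element-wise addition
e = tf.matmul(a, b)     # matrix product
f = e * d               # element-wise multiplication
```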
Gradients in TensorFlow: A second look at the GradientTape API
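A minimal sketch of the `GradientTape` API: variables are tracked automatically inside the tape's scope, while constant tensors must be watched explicitly:

```python
import tensorflow as tf

# Variables are tracked automatically inside a GradientTape scope
x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x ** 2
grad = tape.gradient(y, x)   # dy/dx = 2x

# Constant tensors must be watched explicitly
c = tf.constant(3.0)
with tf.GradientTape() as tape:
    tape.watch(c)
    y = c ** 2
grad_c = tape.gradient(y, c)
```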
Making TensorFlow functions fast using compilation
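A sketch of graph compilation with the `@tf.function` decorator (the function name and shapes are illustrative):

```python
import tensorflow as tf

@tf.function  # traces the Python function into a reusable static graph
def dense(x, w, b):
    return tf.nn.relu(tf.matmul(x, w) + b)

# The first call triggers tracing; later calls with the same
# shapes and dtypes reuse the compiled graph
out = dense(tf.ones((1, 3)), tf.ones((3, 2)), tf.zeros((2,)))
```

Passing `jit_compile=True` to `tf.function` additionally compiles the graph with XLA.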
An end-to-end example: A linear classifier in pure TensorFlow
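One way the pieces above can be assembled into a linear classifier trained with plain gradient descent; the synthetic two-cloud dataset and hyperparameters here are illustrative stand-ins, not the book's exact cells:

```python
import numpy as np
import tensorflow as tf

# Two synthetic point clouds with labels 0 and 1
num = 200
neg = np.random.multivariate_normal(mean=[0, 3], cov=[[1, 0.5], [0.5, 1]], size=num)
pos = np.random.multivariate_normal(mean=[3, 0], cov=[[1, 0.5], [0.5, 1]], size=num)
inputs = np.vstack((neg, pos)).astype("float32")
targets = np.vstack((np.zeros((num, 1)), np.ones((num, 1)))).astype("float32")

# Model state: a weight matrix and a bias vector
W = tf.Variable(tf.random.uniform((2, 1)))
b = tf.Variable(tf.zeros((1,)))

def model(x):
    return tf.matmul(x, W) + b

learning_rate = 0.1
for step in range(40):
    with tf.GradientTape() as tape:
        # Mean squared error between targets and predictions
        loss = tf.reduce_mean(tf.square(targets - model(inputs)))
    dW, db = tape.gradient(loss, [W, b])
    W.assign_sub(learning_rate * dW)   # manual gradient-descent step
    b.assign_sub(learning_rate * db)
```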
What makes the TensorFlow approach unique
Introduction to PyTorch
First steps with PyTorch
Tensors and parameters in PyTorch
Constant tensors
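The PyTorch equivalents of the TensorFlow constant-tensor constructors (illustrative values):

```python
import torch

x = torch.ones(2, 1)
y = torch.zeros(2, 1)
z = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
```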
Random tensors
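A sketch of random-tensor creation in PyTorch:

```python
import torch

# Gaussian samples with per-element mean and standard deviation
n = torch.normal(mean=torch.zeros(3, 1), std=torch.ones(3, 1))

# Uniform samples over [0, 1)
u = torch.rand(3, 1)
```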
Tensor assignment and the Parameter class
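A sketch of the contrast this section draws with TensorFlow: PyTorch tensors are mutable, and `nn.Parameter` (rather than a `Variable` class) marks trainable state:

```python
import torch

# PyTorch tensors are mutable: plain indexed assignment works
t = torch.zeros(2, 2)
t[0, 0] = 1.0

# nn.Parameter wraps a tensor as trainable state for a Module
p = torch.nn.Parameter(torch.randn(3, 1))
```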
Tensor operations: Doing math in PyTorch
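The same representative math operations, in PyTorch:

```python
import torch

a = torch.ones(2, 2)
b = torch.square(a)      # element-wise square
c = torch.sqrt(a)        # element-wise square root
d = b + c                # element-wise addition
e = torch.matmul(a, b)   # matrix product
f = e * d                # element-wise multiplication
```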
Computing gradients with PyTorch
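A minimal sketch of gradient computation in PyTorch, where tensors opt in to tracking via `requires_grad` and `backward()` populates `.grad`:

```python
import torch

# requires_grad=True opts the tensor in to gradient tracking
x = torch.tensor(3.0, requires_grad=True)
y = x ** 2
y.backward()   # populates x.grad with dy/dx = 2x
```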
An end-to-end example: A linear classifier in pure PyTorch
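The same kind of end-to-end linear classifier, sketched in pure PyTorch (synthetic data and hyperparameters are illustrative). Note the two PyTorch-specific details: updates happen under `torch.no_grad()`, and gradients must be zeroed manually because they accumulate:

```python
import numpy as np
import torch

num = 200
neg = np.random.multivariate_normal([0, 3], [[1, 0.5], [0.5, 1]], size=num)
pos = np.random.multivariate_normal([3, 0], [[1, 0.5], [0.5, 1]], size=num)
inputs = torch.tensor(np.vstack((neg, pos)), dtype=torch.float32)
targets = torch.tensor(
    np.vstack((np.zeros((num, 1)), np.ones((num, 1)))), dtype=torch.float32)

W = torch.rand(2, 1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

learning_rate = 0.1
for step in range(40):
    preds = inputs @ W + b
    loss = ((targets - preds) ** 2).mean()   # mean squared error
    loss.backward()
    with torch.no_grad():          # parameter updates must not be tracked
        W -= learning_rate * W.grad
        b -= learning_rate * b.grad
        W.grad.zero_()             # grads accumulate: reset them each step
        b.grad.zero_()
```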
Packaging state and computation with the Module class
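A sketch of packaging the classifier's state and computation in an `nn.Module` subclass (the class name is illustrative):

```python
import torch
from torch import nn

class LinearModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Parameters assigned here are auto-registered with the module
        self.W = nn.Parameter(torch.rand(2, 1))
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        return x @ self.W + self.b

model = LinearModel()
out = model(torch.ones(4, 2))   # __call__ dispatches to forward()
```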
Making PyTorch modules fast using compilation
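A minimal sketch of module compilation via `torch.compile` (requires a backend compiler toolchain to be available on the machine):

```python
import torch
from torch import nn

model = nn.Linear(2, 1)
compiled = torch.compile(model)    # JIT-compiles on first call
out = compiled(torch.ones(4, 2))
```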
What makes the PyTorch approach unique
Introduction to JAX
First steps with JAX
Tensors in JAX
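A sketch of tensor creation with `jax.numpy`, which mirrors the NumPy API; unlike NumPy arrays, JAX arrays are immutable:

```python
import jax.numpy as jnp

# jax.numpy mirrors the NumPy API
x = jnp.ones((2, 1))
z = jnp.array([[1.0, 2.0], [3.0, 4.0]])

# JAX arrays are immutable: `z[0, 0] = 10.0` would raise a TypeError
```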
Random number generation in JAX
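A sketch of JAX's stateless random number generation: every sampling call takes an explicit key, the same key reproduces the same values, and keys are split to get fresh randomness (the seed value is arbitrary):

```python
import jax

# JAX random ops are stateless: each takes an explicit key
key = jax.random.PRNGKey(1337)
a = jax.random.normal(key, shape=(3,))
b = jax.random.normal(key, shape=(3,))   # same key -> identical values

# Split the key to obtain fresh, independent randomness
key, subkey = jax.random.split(key)
c = jax.random.normal(subkey, shape=(3,))
```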
Tensor assignment
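Because JAX arrays are immutable, "assignment" goes through the `.at` syntax, which returns a new array:

```python
import jax.numpy as jnp

x = jnp.zeros((2, 2))
# No in-place assignment; .at[...].set(...) returns a new array instead
y = x.at[0, 0].set(1.0)
```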
Tensor operations: Doing math in JAX
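The same representative math operations, via `jax.numpy`:

```python
import jax.numpy as jnp

a = jnp.ones((2, 2))
b = jnp.square(a)       # element-wise square
c = jnp.sqrt(a)         # element-wise square root
d = b + c               # element-wise addition
e = jnp.matmul(a, b)    # matrix product
```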
Computing gradients with JAX
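A minimal sketch of `jax.grad`, which transforms a function into a new function that computes its gradient:

```python
import jax

def f(x):
    return x ** 2

grad_f = jax.grad(f)   # a new function computing df/dx
g = grad_f(3.0)        # 2 * 3.0
```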
JAX gradient-computation best practices
Returning the loss value
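A sketch of getting the loss value back alongside the gradient, via `jax.value_and_grad`:

```python
import jax

def loss_fn(x):
    return x ** 2

# value_and_grad returns the loss and its gradient in a single pass
loss_val, grad_val = jax.value_and_grad(loss_fn)(3.0)
```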
Getting gradients for a complex function
Returning auxiliary outputs
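A sketch of returning auxiliary (non-differentiated) outputs with `has_aux=True`; the aux dictionary's contents here are illustrative:

```python
import jax

def loss_fn(x):
    loss = x ** 2
    aux = {"inputs_value": x}   # extra output, excluded from differentiation
    return loss, aux

(loss_val, aux), grad_val = jax.value_and_grad(loss_fn, has_aux=True)(3.0)
```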
Making JAX functions fast with @jax.jit
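A minimal sketch of `@jax.jit` (function name and shapes are illustrative):

```python
import jax
import jax.numpy as jnp

@jax.jit   # traced once per input shape/dtype, then runs compiled XLA code
def dense(x, w, b):
    return jax.nn.relu(jnp.matmul(x, w) + b)

out = dense(jnp.ones((1, 3)), jnp.ones((3, 2)), jnp.zeros((2,)))
```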
An end-to-end example: A linear classifier in pure JAX
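The linear classifier again, sketched in the functional JAX style: no mutable state, so the parameters are threaded explicitly through a jitted training step (synthetic data and hyperparameters are illustrative):

```python
import numpy as np
import jax
import jax.numpy as jnp

num = 200
neg = np.random.multivariate_normal([0, 3], [[1, 0.5], [0.5, 1]], size=num)
pos = np.random.multivariate_normal([3, 0], [[1, 0.5], [0.5, 1]], size=num)
inputs = jnp.array(np.vstack((neg, pos)), dtype="float32")
targets = jnp.array(
    np.vstack((np.zeros((num, 1)), np.ones((num, 1)))), dtype="float32")

def loss_fn(params, inputs, targets):
    W, b = params
    preds = jnp.matmul(inputs, W) + b
    return jnp.mean(jnp.square(targets - preds))   # mean squared error

@jax.jit
def train_step(params, inputs, targets):
    # State is explicit: gradients come in, updated parameters go out
    dW, db = jax.grad(loss_fn)(params, inputs, targets)
    W, b = params
    learning_rate = 0.1
    return W - learning_rate * dW, b - learning_rate * db

params = (jnp.zeros((2, 1)), jnp.zeros((1,)))
for step in range(40):
    params = train_step(params, inputs, targets)
```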
What makes the JAX approach unique
Introduction to Keras
First steps with Keras
Picking a backend framework
Layers: The building blocks of deep learning
The base Layer class in Keras
Automatic shape inference: Building layers on the fly
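A sketch of automatic shape inference: no input sizes are declared up front, because each layer builds its weights the first time it sees data (assumes the JAX backend is available):

```python
import os
os.environ.setdefault("KERAS_BACKEND", "jax")

import keras
import numpy as np

# Each layer infers its input shape (and builds its weights)
# from the first inputs it receives
model = keras.Sequential([
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
out = model(np.ones((1, 784), dtype="float32"))
```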
From layers to models
The "compile" step: Configuring the learning process
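A minimal sketch of the compile step; the optimizer, loss, and metrics shown are illustrative choices (assumes the JAX backend is available):

```python
import os
os.environ.setdefault("KERAS_BACKEND", "jax")

import keras

model = keras.Sequential([keras.layers.Dense(1)])
# compile() configures the learning process: optimizer, loss, metrics
model.compile(optimizer="rmsprop",
              loss="mean_squared_error",
              metrics=["accuracy"])
```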
Picking a loss function
Understanding the fit method
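A sketch of the `fit()` method on random illustrative data; the returned `History` object records per-epoch metrics (assumes the JAX backend is available):

```python
import os
os.environ.setdefault("KERAS_BACKEND", "jax")

import keras
import numpy as np

model = keras.Sequential([keras.layers.Dense(1)])
model.compile(optimizer="rmsprop", loss="mean_squared_error")

inputs = np.random.rand(128, 2).astype("float32")
targets = np.random.rand(128, 1).astype("float32")

# fit() runs the training loop over the data in mini-batches
history = model.fit(inputs, targets, epochs=3, batch_size=32, verbose=0)
```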
Monitoring loss and metrics on validation data
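A sketch of monitoring held-out data via `validation_data`, which is only evaluated, never trained on (illustrative random data; assumes the JAX backend is available):

```python
import os
os.environ.setdefault("KERAS_BACKEND", "jax")

import keras
import numpy as np

model = keras.Sequential([keras.layers.Dense(1)])
model.compile(optimizer="rmsprop", loss="mean_squared_error")

x = np.random.rand(128, 2).astype("float32")
y = np.random.rand(128, 1).astype("float32")

# validation_data is evaluated at the end of each epoch as val_loss
history = model.fit(x[:96], y[:96], epochs=2, batch_size=16,
                    validation_data=(x[96:], y[96:]), verbose=0)
```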
Inference: Using a model after training
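A sketch of inference with `predict()`, which iterates over the data in small batches and returns NumPy arrays (illustrative random data; assumes the JAX backend is available):

```python
import os
os.environ.setdefault("KERAS_BACKEND", "jax")

import keras
import numpy as np

model = keras.Sequential([keras.layers.Dense(1)])
model.compile(optimizer="rmsprop", loss="mean_squared_error")
x = np.random.rand(64, 2).astype("float32")
model.fit(x, np.random.rand(64, 1).astype("float32"), epochs=1, verbose=0)

# predict() processes the data batch by batch and returns NumPy arrays
preds = model.predict(x, batch_size=32, verbose=0)
```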