Path: blob/master/second_edition/chapter10_dl-for-timeseries.ipynb
This is a companion notebook for the book Deep Learning with Python, Second Edition. For readability, it only contains runnable code blocks and section titles, and omits everything else in the book: text paragraphs, figures, and pseudocode.
To follow what's going on, I recommend reading the notebook side by side with your copy of the book.
This notebook was generated for TensorFlow 2.6.
Deep learning for timeseries
Different kinds of timeseries tasks
A temperature-forecasting example
Inspecting the data of the Jena weather dataset
Parsing the data
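As a sketch of the parsing step, assuming the Jena CSV layout (a timestamp column followed by numeric weather columns, with temperature as the second numeric column) — the three rows below are a hypothetical excerpt, not the real file:

```python
import numpy as np

# Hypothetical excerpt in the Jena CSV layout: timestamp first, then
# numeric weather measurements; "T (degC)" is the second data column.
lines = [
    "01.01.2009 00:10:00,996.52,-8.02,265.40",
    "01.01.2009 00:20:00,996.57,-8.41,265.01",
    "01.01.2009 00:30:00,996.53,-8.51,264.91",
]

temperature = np.zeros((len(lines),))
raw_data = np.zeros((len(lines), len(lines[0].split(",")) - 1))
for i, line in enumerate(lines):
    values = [float(x) for x in line.split(",")[1:]]  # drop the timestamp
    temperature[i] = values[1]  # keep the temperature column separately
    raw_data[i, :] = values     # keep every numeric column as a feature
```

The temperature is stored both on its own (it's the prediction target) and inside `raw_data` (it's also an input feature).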
Plotting the temperature timeseries
Plotting the first 10 days of the temperature timeseries
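Plotting a slice can be sketched as follows; 1440 points corresponds to 10 days of 10-minute readings, and the series here is synthetic rather than the real one:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, so this runs without a display
import matplotlib.pyplot as plt
import numpy as np

# Synthetic stand-in with a daily cycle (144 readings per day).
temperature = 10 + 8 * np.sin(np.arange(5000) * 2 * np.pi / 144)

points_per_10_days = 1440  # 6 readings/hour * 24 hours * 10 days
fig, ax = plt.subplots()
ax.plot(range(points_per_10_days), temperature[:points_per_10_days])
ax.set_xlabel("Timestep")
ax.set_ylabel("Temperature (degC)")
fig.savefig("first_10_days.png")
```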
Computing the number of samples we'll use for each data split
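A chronological 50/25/25 split can be computed as below; the total row count is only a stand-in, not the real dataset size:

```python
num_samples = 100000  # hypothetical stand-in for the number of rows
num_train_samples = int(0.5 * num_samples)   # first 50%: training
num_val_samples = int(0.25 * num_samples)    # next 25%: validation
# Whatever remains is the test split, so the three splits always
# cover every row exactly once.
num_test_samples = num_samples - num_train_samples - num_val_samples
```

Keeping the splits in chronological order matters for forecasting: the model must be validated on data that comes *after* its training data.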
Preparing the data
Normalizing the data
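Normalization can be sketched as follows: compute the per-feature mean and standard deviation on the training rows only (to avoid leaking validation/test information), then apply them to the whole array. The synthetic `raw_data` stands in for the parsed weather array:

```python
import numpy as np

rng = np.random.default_rng(0)
raw_data = rng.normal(loc=10.0, scale=5.0, size=(1000, 4))  # synthetic stand-in
num_train_samples = 500

# Statistics come from the training split only.
mean = raw_data[:num_train_samples].mean(axis=0)
raw_data -= mean
std = raw_data[:num_train_samples].std(axis=0)
raw_data /= std
```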
Instantiating datasets for training, validation, and testing
Inspecting the output of one of our datasets
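As a sketch of what such a dataset yields, here is `timeseries_dataset_from_array` on a toy integer sequence (assuming TensorFlow 2.6, where the utility lives under `keras.utils`): each sample is a window of 3 consecutive values, and its target is the value that follows the window.

```python
import numpy as np
from tensorflow import keras

int_sequence = np.arange(10)
dataset = keras.utils.timeseries_dataset_from_array(
    data=int_sequence[:-3],    # windows are drawn from [0..6]
    targets=int_sequence[3:],  # target = value 3 steps after a window start
    sequence_length=3,
    batch_size=2,
)
first_inputs, first_targets = next(iter(dataset))
print(first_inputs.numpy())   # windows such as [0 1 2] and [1 2 3]
print(first_targets.numpy())  # matching targets 3 and 4
```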
A common-sense, non-machine-learning baseline
Computing the common-sense baseline MAE
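The common-sense baseline — predict that the temperature `delay` steps from now equals the temperature right now — reduces to a few lines of NumPy; the random-walk series below is a synthetic stand-in for the real temperature data:

```python
import numpy as np

rng = np.random.default_rng(42)
temperature = np.cumsum(rng.normal(0.0, 0.1, size=2000))  # synthetic random walk
delay = 144  # e.g. 24 hours of 10-minute readings

# Common-sense prediction: temperature[t + delay] == temperature[t].
preds = temperature[:-delay]
targets = temperature[delay:]
baseline_mae = np.mean(np.abs(preds - targets))
print(f"Baseline MAE: {baseline_mae:.4f}")
```

Any learned model is only interesting if it beats this number.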
Let's try a basic machine-learning model
Training and evaluating a densely connected model
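A minimal densely connected model can be sketched as below, assuming 120 input timesteps of 14 features each (the shapes used throughout this chapter are hypothetical here). Flattening discards the time axis, which is exactly the limitation this baseline probes:

```python
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(120, 14))       # 120 timesteps, 14 features
x = layers.Flatten()(inputs)                # discard the temporal structure
x = layers.Dense(16, activation="relu")(x)
outputs = layers.Dense(1)(x)                # predict one temperature value
model = keras.Model(inputs, outputs)
model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"])
# model.fit(train_dataset, validation_data=val_dataset, epochs=10)
```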
Plotting results
Let's try a 1D convolutional model
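A 1D convolutional alternative can be sketched like this (layer widths and kernel sizes are illustrative choices, not prescriptions): stacked `Conv1D`/`MaxPooling1D` blocks shrink the time axis, and global average pooling collapses what remains.

```python
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(120, 14))
x = layers.Conv1D(8, 24, activation="relu")(inputs)  # wide kernel: ~one day
x = layers.MaxPooling1D(2)(x)
x = layers.Conv1D(8, 12, activation="relu")(x)
x = layers.MaxPooling1D(2)(x)
x = layers.Conv1D(8, 6, activation="relu")(x)
x = layers.GlobalAveragePooling1D()(x)               # collapse the time axis
outputs = layers.Dense(1)(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"])
```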
A first recurrent baseline
A simple LSTM-based model
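The recurrent baseline can be sketched as a single small LSTM followed by a regression head (again assuming the hypothetical 120-timestep, 14-feature inputs):

```python
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(120, 14))
x = layers.LSTM(16)(inputs)   # processes the sequence in order
outputs = layers.Dense(1)(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"])
```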
Understanding recurrent neural networks
NumPy implementation of a simple RNN
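The forward pass of a simple RNN can be sketched in plain NumPy: at each timestep, the output is a function of the current input and the previous output (the "state"). The random weights below stand in for learned parameters:

```python
import numpy as np

timesteps = 100       # length of the input sequence
input_features = 32   # dimensionality of each input timestep
output_features = 64  # dimensionality of the output/state

inputs = np.random.random((timesteps, input_features))
state_t = np.zeros((output_features,))  # initial state: all zeros

# Random weights stand in for learned parameters.
W = np.random.random((output_features, input_features))
U = np.random.random((output_features, output_features))
b = np.random.random((output_features,))

successive_outputs = []
for input_t in inputs:
    # Combine the current input with the previous state.
    output_t = np.tanh(np.dot(W, input_t) + np.dot(U, state_t) + b)
    successive_outputs.append(output_t)
    state_t = output_t  # this output becomes the state for the next step

final_output_sequence = np.stack(successive_outputs, axis=0)
```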
A recurrent layer in Keras
An RNN layer that can process sequences of any length
An RNN layer that returns only its last output step
An RNN layer that returns its full output sequence
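The three behaviors above can be sketched with `SimpleRNN` (the feature count is a hypothetical 14): `shape=(None, num_features)` accepts sequences of any length, the default returns only the last output step, and `return_sequences=True` returns the full output sequence.

```python
from tensorflow import keras
from tensorflow.keras import layers

num_features = 14  # hypothetical feature count

# shape=(None, num_features): the model accepts sequences of any length.
inputs = keras.Input(shape=(None, num_features))
last_step = layers.SimpleRNN(16)(inputs)                          # last step only
full_sequence = layers.SimpleRNN(16, return_sequences=True)(inputs)

last_model = keras.Model(inputs, last_step)
seq_model = keras.Model(inputs, full_sequence)
```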
Stacking RNN layers
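A stacking sketch: every intermediate recurrent layer must pass its full output sequence to the next one, so only the last layer keeps the default behavior.

```python
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(None, 14))
x = layers.SimpleRNN(16, return_sequences=True)(inputs)  # feed full sequences onward
x = layers.SimpleRNN(16, return_sequences=True)(x)
outputs = layers.SimpleRNN(16)(x)  # final layer returns only the last step
model = keras.Model(inputs, outputs)
```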
Advanced use of recurrent neural networks
Using recurrent dropout to fight overfitting
Training and evaluating a dropout-regularized LSTM
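A dropout-regularized variant can be sketched like this (the dropout rates are illustrative): `recurrent_dropout` applies a fixed dropout mask to the recurrent connections inside the LSTM, and a `Dropout` layer regularizes the dense head.

```python
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(120, 14))
x = layers.LSTM(32, recurrent_dropout=0.25)(inputs)  # dropout on recurrent connections
x = layers.Dropout(0.5)(x)                           # regularize the Dense head too
outputs = layers.Dense(1)(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"])
```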
Stacking recurrent layers
Training and evaluating a dropout-regularized, stacked GRU model
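Combining the two previous ideas, a stacked, dropout-regularized GRU model might look like the sketch below (unit counts and rates are illustrative); the first GRU returns full sequences so the second one has a sequence to consume:

```python
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(120, 14))
x = layers.GRU(32, recurrent_dropout=0.5, return_sequences=True)(inputs)
x = layers.GRU(32, recurrent_dropout=0.5)(x)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(1)(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"])
```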
Using bidirectional RNNs
Training and evaluating a bidirectional LSTM
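A bidirectional sketch: the `Bidirectional` wrapper runs one copy of the LSTM over the sequence in order and another in reverse, then concatenates their outputs.

```python
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(120, 14))
x = layers.Bidirectional(layers.LSTM(16))(inputs)  # one LSTM per direction, outputs concatenated
outputs = layers.Dense(1)(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"])
```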