Path: blob/master/second_edition/chapter04_getting-started-with-neural-networks.ipynb
This is a companion notebook for the book Deep Learning with Python, Second Edition. For readability, it only contains runnable code blocks and section titles, and omits everything else in the book: text paragraphs, figures, and pseudocode.
To be able to follow what's going on, I recommend reading the notebook side by side with your copy of the book.
This notebook was generated for TensorFlow 2.6.
Getting started with neural networks: Classification and regression
Classifying movie reviews: A binary classification example
The IMDB dataset
Loading the IMDB dataset
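Since the code cells were stripped from this outline, here is a sketch of the loading step, assuming the book's `num_words=10000` cap (keep only the 10,000 most frequent words):

```python
from tensorflow.keras.datasets import imdb

# Keep only the 10,000 most frequent words; rarer words are dropped.
(train_data, train_labels), (test_data, test_labels) = imdb.load_data(
    num_words=10000)

# Each review is a list of word indices; each label is 0 (negative)
# or 1 (positive).
print(len(train_data), len(test_data))
```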
Decoding reviews back to text
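The decoding logic can be sketched with a tiny hypothetical `word_index`; in the notebook the full mapping comes from `imdb.get_word_index()`. The key detail is the offset of 3, because indices 0, 1, and 2 are reserved:

```python
# Hypothetical miniature word_index for illustration only; the notebook
# uses the full mapping returned by imdb.get_word_index().
word_index = {"the": 1, "movie": 2, "was": 3, "great": 4}
reverse_word_index = {value: key for (key, value) in word_index.items()}

def decode_review(sequence):
    # Indices 0, 1, 2 are reserved for "padding", "start of sequence",
    # and "unknown", so stored word indices are offset by 3.
    return " ".join(reverse_word_index.get(i - 3, "?") for i in sequence)

encoded = [1, 4, 5, 6, 7]  # a start token followed by word indices
print(decode_review(encoded))  # → "? the movie was great"
```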
Preparing the data
Encoding the integer sequences via multi-hot encoding
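Multi-hot encoding turns each list of word indices into a fixed-size binary vector; a sketch of the `vectorize_sequences` helper the chapter uses (shown here with a small `dimension` for readability):

```python
import numpy as np

def vectorize_sequences(sequences, dimension=10000):
    # One row per sequence; set column j to 1 for every word index j
    # that appears in the sequence (duplicates collapse to a single 1).
    results = np.zeros((len(sequences), dimension))
    for i, sequence in enumerate(sequences):
        for j in sequence:
            results[i, j] = 1.
    return results

x = vectorize_sequences([[3, 5], [5, 9, 9]], dimension=10)
print(x[0])  # → [0. 0. 0. 1. 0. 1. 0. 0. 0. 0.]
```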
Building your model
Model definition
Compiling the model
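A sketch of the model definition and compilation steps, assuming the chapter's two 16-unit intermediate layers and single sigmoid output:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Two intermediate layers of 16 units each, and one sigmoid unit that
# outputs the probability that the review is positive.
model = keras.Sequential([
    layers.Dense(16, activation="relu"),
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])

# A probability output for binary classification pairs naturally with
# the binary_crossentropy loss.
model.compile(optimizer="rmsprop",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```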
Validating your approach
Setting aside a validation set
Training your model
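The two steps above can be sketched together: slice off a validation set, then pass it to `fit` via `validation_data`. Random stand-in data replaces the vectorized IMDB reviews so the snippet runs on its own:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Random stand-in data in place of the multi-hot-encoded reviews.
x_train = np.random.randint(0, 2, size=(200, 100)).astype("float32")
y_train = np.random.randint(0, 2, size=(200,)).astype("float32")

# Set aside the first samples as a validation set.
x_val, partial_x_train = x_train[:50], x_train[50:]
y_val, partial_y_train = y_train[:50], y_train[50:]

model = keras.Sequential([
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="rmsprop", loss="binary_crossentropy",
              metrics=["accuracy"])

# The History object records per-epoch training and validation metrics.
history = model.fit(partial_x_train, partial_y_train,
                    epochs=3, batch_size=32,
                    validation_data=(x_val, y_val),
                    verbose=0)
print(sorted(history.history.keys()))
```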
Plotting the training and validation loss
Plotting the training and validation accuracy
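The plotting cells read curves out of `history.history`; a sketch of the loss plot, with made-up stand-in values in place of real training results:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs headlessly
import matplotlib.pyplot as plt

# Stand-in values; in the notebook these come from
# history.history["loss"] and history.history["val_loss"].
loss = [0.55, 0.40, 0.32, 0.27]
val_loss = [0.50, 0.42, 0.38, 0.40]
epochs = range(1, len(loss) + 1)

plt.plot(epochs, loss, "bo", label="Training loss")       # dots
plt.plot(epochs, val_loss, "b", label="Validation loss")  # solid line
plt.title("Training and validation loss")
plt.xlabel("Epochs")
plt.ylabel("Loss")
plt.legend()
fig = plt.gcf()
```

The accuracy plot is the same pattern with `"accuracy"` and `"val_accuracy"`.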
Retraining a model from scratch
Using a trained model to generate predictions on new data
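`predict` returns one probability per review; sketched here with a small untrained model and random stand-in inputs so the shape and value range are visible:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([layers.Dense(16, activation="relu"),
                          layers.Dense(1, activation="sigmoid")])
model.compile(optimizer="rmsprop", loss="binary_crossentropy")

# Stand-in for the multi-hot-encoded test reviews.
x_test = np.random.randint(0, 2, size=(8, 100)).astype("float32")

# One sigmoid output per sample, each between 0 and 1.
predictions = model.predict(x_test, verbose=0)
print(predictions.shape)  # → (8, 1)
```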
Further experiments
Wrapping up
Classifying newswires: A multiclass classification example
The Reuters dataset
Loading the Reuters dataset
Decoding newswires back to text
Preparing the data
Encoding the input data
Encoding the labels
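One-hot encoding the 46 topic labels can be sketched as below; the notebook can equivalently use the built-in `tensorflow.keras.utils.to_categorical`:

```python
import numpy as np

def to_one_hot(labels, dimension=46):
    # One row per label, with a single 1 in the column of that
    # label's class index.
    results = np.zeros((len(labels), dimension))
    for i, label in enumerate(labels):
        results[i, label] = 1.
    return results

y = to_one_hot([0, 3, 45], dimension=46)
print(y.shape)  # → (3, 46)
```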
Building your model
Model definition
Compiling the model
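A sketch of the definition and compilation steps, assuming the chapter's 64-unit intermediate layers and 46-way softmax output:

```python
from tensorflow import keras
from tensorflow.keras import layers

# 46 mutually exclusive topics: wider intermediate layers (64 units)
# and a 46-way softmax output producing a probability distribution.
model = keras.Sequential([
    layers.Dense(64, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(46, activation="softmax"),
])

# categorical_crossentropy expects one-hot-encoded targets.
model.compile(optimizer="rmsprop",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```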
Validating your approach
Setting aside a validation set
Training the model
Plotting the training and validation loss
Plotting the training and validation accuracy
Retraining a model from scratch
Generating predictions on new data
A different way to handle the labels and the loss
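The alternative this section covers is keeping the labels as plain integers and switching the loss to `sparse_categorical_crossentropy`; a runnable sketch with random stand-in data:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Stand-in features and integer class labels (no one-hot encoding).
x = np.random.random((64, 100)).astype("float32")
y = np.random.randint(0, 46, size=(64,))

model = keras.Sequential([layers.Dense(64, activation="relu"),
                          layers.Dense(46, activation="softmax")])

# sparse_categorical_crossentropy consumes class indices directly;
# it is mathematically the same loss as categorical_crossentropy.
model.compile(optimizer="rmsprop",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
history = model.fit(x, y, epochs=1, batch_size=32, verbose=0)
```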
The importance of having sufficiently large intermediate layers
A model with an information bottleneck
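The bottleneck experiment squeezes the 46-way class information through an intermediate layer far narrower than the output; a sketch, assuming a 4-unit middle layer:

```python
from tensorflow import keras
from tensorflow.keras import layers

# A 4-dimensional intermediate layer cannot carry enough information
# to separate 46 classes; whatever is lost here, the final softmax
# layer can never recover, so validation accuracy drops.
model = keras.Sequential([
    layers.Dense(64, activation="relu"),
    layers.Dense(4, activation="relu"),   # the information bottleneck
    layers.Dense(46, activation="softmax"),
])
model.compile(optimizer="rmsprop",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```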
Further experiments
Wrapping up
Predicting house prices: A regression example
The Boston Housing Price dataset
Loading the Boston housing dataset
Preparing the data
Normalizing the data
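Feature-wise normalization subtracts the mean and divides by the standard deviation of each column, using statistics computed on the training data only; sketched with random stand-in arrays shaped like the Boston housing data (404 train and 102 test samples, 13 features):

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for the Boston housing features.
train_data = rng.normal(10.0, 5.0, size=(404, 13))
test_data = rng.normal(10.0, 5.0, size=(102, 13))

# Compute statistics on the training data only -- never on test data,
# which must not leak into the workflow.
mean = train_data.mean(axis=0)
std = train_data.std(axis=0)
train_data = (train_data - mean) / std
test_data = (test_data - mean) / std
```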
Building your model
Model definition
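A sketch of the regression model builder, assuming the chapter's two 64-unit layers; note the last layer has no activation, since the model predicts an unbounded scalar rather than a probability:

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_model():
    # A function, because K-fold validation instantiates a fresh
    # model for every fold.
    model = keras.Sequential([
        layers.Dense(64, activation="relu"),
        layers.Dense(64, activation="relu"),
        layers.Dense(1),  # linear output for the predicted price
    ])
    # Mean squared error loss; mean absolute error as a metric that
    # is easy to read in the target's own units.
    model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"])
    return model

model = build_model()
```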
Validating your approach using K-fold validation
K-fold validation
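The splitting logic at the core of the K-fold loop can be sketched with NumPy alone (model building and fitting elided; in the notebook a fresh model is trained on each fold's `partial_train_data`):

```python
import numpy as np

# Stand-in training set; in the notebook this is the normalized
# Boston housing data.
train_data = np.arange(400, dtype="float32").reshape(400, 1)

k = 4
num_val_samples = len(train_data) // k
fold_sizes = []
for i in range(k):
    # Fold i is the validation split; everything else is training data.
    val_data = train_data[i * num_val_samples: (i + 1) * num_val_samples]
    partial_train_data = np.concatenate(
        [train_data[:i * num_val_samples],
         train_data[(i + 1) * num_val_samples:]],
        axis=0)
    fold_sizes.append((len(val_data), len(partial_train_data)))
    # ...build a fresh model and fit it on partial_train_data here...
print(fold_sizes)  # → [(100, 300), (100, 300), (100, 300), (100, 300)]
```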
Saving the validation logs at each fold
Building the history of successive mean K-fold validation scores
Plotting validation scores
Plotting validation scores, excluding the first 10 data points
Training the final model