Path: blob/master/second_edition/chapter05_fundamentals-of-ml.ipynb
This is a companion notebook for the book Deep Learning with Python, Second Edition. For readability, it only contains runnable code blocks and section titles, and omits everything else in the book: text paragraphs, figures, and pseudocode.
If you want to be able to follow what's going on, I recommend reading the notebook side by side with your copy of the book.
This notebook was generated for TensorFlow 2.6.
Fundamentals of machine learning
Generalization: The goal of machine learning
Underfitting and overfitting
Noisy training data
Ambiguous features
Rare features and spurious correlations
Adding white-noise channels or all-zeros channels to MNIST
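The original listing isn't reproduced in this extract, so here is a minimal sketch of the idea: flatten MNIST and concatenate either 784 white-noise features or 784 all-zero features to each image. The variable names are my own.

```python
import numpy as np
from tensorflow.keras.datasets import mnist

(train_images, train_labels), _ = mnist.load_data()
train_images = train_images.reshape((60000, 28 * 28)).astype("float32") / 255

# Append 784 channels of white noise (or of zeros) to every flattened image.
train_images_with_noise_channels = np.concatenate(
    [train_images, np.random.random((len(train_images), 784))], axis=1)
train_images_with_zeros_channels = np.concatenate(
    [train_images, np.zeros((len(train_images), 784))], axis=1)
```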
Training the same model on MNIST data with noise channels or all-zero channels
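A sketch of the comparison experiment, assuming the two arrays built above: the same two-layer dense classifier is trained once on each dataset so their validation curves can be compared.

```python
from tensorflow import keras
from tensorflow.keras import layers

def get_model():
    model = keras.Sequential([
        layers.Dense(512, activation="relu"),
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="rmsprop",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = get_model()
history_noise = model.fit(
    train_images_with_noise_channels, train_labels,
    epochs=10, batch_size=128, validation_split=0.2)

model = get_model()
history_zeros = model.fit(
    train_images_with_zeros_channels, train_labels,
    epochs=10, batch_size=128, validation_split=0.2)
```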
Plotting a validation accuracy comparison
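A possible plotting cell, assuming the `history_noise` and `history_zeros` objects from the sketch above; it shows how the noise channels hurt validation accuracy relative to the harmless zero channels.

```python
import matplotlib.pyplot as plt

val_acc_noise = history_noise.history["val_accuracy"]
val_acc_zeros = history_zeros.history["val_accuracy"]
epochs = range(1, len(val_acc_noise) + 1)

plt.plot(epochs, val_acc_noise, "b-",
         label="Validation accuracy with noise channels")
plt.plot(epochs, val_acc_zeros, "b--",
         label="Validation accuracy with zeros channels")
plt.title("Effect of noise channels on validation accuracy")
plt.xlabel("Epochs")
plt.ylabel("Accuracy")
plt.legend()
plt.show()
```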
The nature of generalization in deep learning
Fitting an MNIST model with randomly shuffled labels
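A sketch of the shuffled-labels experiment: shuffling the targets destroys any real input-target relationship, yet a sufficiently large model can still drive training loss down by memorizing. The epoch count is illustrative.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

(train_images, train_labels), _ = keras.datasets.mnist.load_data()
train_images = train_images.reshape((60000, 28 * 28)).astype("float32") / 255

# Shuffle a copy of the labels so they no longer match the images.
random_train_labels = train_labels.copy()
np.random.shuffle(random_train_labels)

model = keras.Sequential([
    layers.Dense(512, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="rmsprop",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_images, random_train_labels,
          epochs=100, batch_size=128, validation_split=0.2)
```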
The manifold hypothesis
Interpolation as a source of generalization
Why deep learning works
Training data is paramount
Evaluating machine-learning models
Training, validation, and test sets
Simple hold-out validation
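The book presents hold-out validation as pseudocode; the following is a runnable sketch of the same idea, assuming `data`, `labels`, and a `get_model()` factory are already defined (for example, the flattened MNIST arrays and model factory from the sketches above).

```python
import numpy as np

num_validation_samples = 10000

# Shuffle inputs and targets together before splitting.
permutation = np.random.permutation(len(data))
data, labels = data[permutation], labels[permutation]

val_data, val_labels = data[:num_validation_samples], labels[:num_validation_samples]
train_data, train_labels = data[num_validation_samples:], labels[num_validation_samples:]

model = get_model()
model.fit(train_data, train_labels, epochs=5, batch_size=128)
val_loss, val_acc = model.evaluate(val_data, val_labels)

# After tuning hyperparameters, retrain a final model on all available data.
model = get_model()
model.fit(data, labels, epochs=5, batch_size=128)
```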
K-fold validation
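A sketch of K-fold validation under the same assumptions (`data`, `labels`, `get_model()` defined earlier): the data is split into K equal partitions, each partition serves once as the validation set, and the K scores are averaged.

```python
import numpy as np

k = 3
num_validation_samples = len(data) // k
validation_scores = []

for fold in range(k):
    start, stop = fold * num_validation_samples, (fold + 1) * num_validation_samples
    val_data, val_labels = data[start:stop], labels[start:stop]
    train_data = np.concatenate([data[:start], data[stop:]])
    train_labels = np.concatenate([labels[:start], labels[stop:]])

    model = get_model()
    model.fit(train_data, train_labels, epochs=5, batch_size=128, verbose=0)
    validation_scores.append(model.evaluate(val_data, val_labels, verbose=0)[1])

validation_score = np.mean(validation_scores)
```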
Iterated K-fold validation with shuffling
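Iterated K-fold simply repeats the procedure above P times, reshuffling the data before each run, and averages every fold score. A compact sketch, again assuming `data`, `labels`, and `get_model()`:

```python
import numpy as np

p, k = 3, 3
num_validation_samples = len(data) // k
all_scores = []

for _ in range(p):
    # Reshuffle before every K-fold pass.
    permutation = np.random.permutation(len(data))
    data, labels = data[permutation], labels[permutation]
    for fold in range(k):
        start, stop = fold * num_validation_samples, (fold + 1) * num_validation_samples
        val_data, val_labels = data[start:stop], labels[start:stop]
        train_data = np.concatenate([data[:start], data[stop:]])
        train_labels = np.concatenate([labels[:start], labels[stop:]])
        model = get_model()
        model.fit(train_data, train_labels, epochs=5, batch_size=128, verbose=0)
        all_scores.append(model.evaluate(val_data, val_labels, verbose=0)[1])

final_score = np.mean(all_scores)  # average of p * k validation accuracies
```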
Beating a common-sense baseline
Things to keep in mind about model evaluation
Improving model fit
Tuning key gradient descent parameters
Training an MNIST model with an incorrectly high learning rate
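A sketch of the failure case: with a learning rate of 1.0, RMSprop overshoots on every update and the model stalls at low accuracy. The preprocessing mirrors the earlier MNIST sketches.

```python
from tensorflow import keras
from tensorflow.keras import layers

(train_images, train_labels), _ = keras.datasets.mnist.load_data()
train_images = train_images.reshape((60000, 28 * 28)).astype("float32") / 255

model = keras.Sequential([
    layers.Dense(512, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
# Deliberately far too large a learning rate.
model.compile(optimizer=keras.optimizers.RMSprop(learning_rate=1.0),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_images, train_labels,
          epochs=10, batch_size=128, validation_split=0.2)
```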
The same model with a more appropriate learning rate
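The same architecture, recompiled with a more sensible learning rate (1e-2 here, as an illustrative value), converges normally.

```python
model = keras.Sequential([
    layers.Dense(512, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer=keras.optimizers.RMSprop(learning_rate=1e-2),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_images, train_labels,
          epochs=10, batch_size=128, validation_split=0.2)
```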
Leveraging better architecture priors
Increasing model capacity
A simple logistic regression on MNIST
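A sketch of the under-capacity baseline, assuming the flattened `train_images` and `train_labels` from the earlier MNIST sketches: a single softmax layer is just multinomial logistic regression, so training and validation loss plateau early rather than overfit.

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([layers.Dense(10, activation="softmax")])
model.compile(optimizer="rmsprop",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
history_small_model = model.fit(
    train_images, train_labels,
    epochs=20, batch_size=128, validation_split=0.2)
```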
Improving generalization
Dataset curation
Feature engineering
Using early stopping
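One common way to implement early stopping in Keras is the `EarlyStopping` callback; the sketch below assumes a compiled `model` and the MNIST arrays from the earlier sketches, and the `patience` value is illustrative.

```python
from tensorflow import keras

# Interrupt training once validation accuracy stops improving for 2 epochs,
# and restore the weights from the best epoch.
callbacks = [
    keras.callbacks.EarlyStopping(monitor="val_accuracy",
                                  patience=2,
                                  restore_best_weights=True),
]
model.fit(train_images, train_labels,
          epochs=100, batch_size=128, validation_split=0.2,
          callbacks=callbacks)
```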
Regularizing your model
Reducing the network's size
Original model
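A sketch of the baseline IMDB sentiment classifier used throughout this section: reviews are multi-hot encoded over the 10,000 most frequent words and fed to two 16-unit dense layers.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

(train_data, train_labels), _ = keras.datasets.imdb.load_data(num_words=10000)

def vectorize_sequences(sequences, dimension=10000):
    # Multi-hot encode each review as a 10,000-dimensional vector.
    results = np.zeros((len(sequences), dimension))
    for i, sequence in enumerate(sequences):
        results[i, sequence] = 1.
    return results

train_data = vectorize_sequences(train_data)

model = keras.Sequential([
    layers.Dense(16, activation="relu"),
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="rmsprop",
              loss="binary_crossentropy",
              metrics=["accuracy"])
history_original = model.fit(train_data, train_labels,
                             epochs=20, batch_size=512, validation_split=0.4)
```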
Version of the model with lower capacity
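The same setup with far fewer units per layer (4 instead of 16, as an illustrative value), reusing the vectorized `train_data` and `train_labels` from the sketch above; a smaller model starts overfitting later and more gently.

```python
model = keras.Sequential([
    layers.Dense(4, activation="relu"),
    layers.Dense(4, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="rmsprop",
              loss="binary_crossentropy",
              metrics=["accuracy"])
history_smaller_model = model.fit(train_data, train_labels,
                                  epochs=20, batch_size=512, validation_split=0.4)
```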
Version of the model with higher capacity
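And a much larger variant (512 units per layer, again an illustrative value), which overfits almost immediately; it reuses the same data as the two sketches above.

```python
model = keras.Sequential([
    layers.Dense(512, activation="relu"),
    layers.Dense(512, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="rmsprop",
              loss="binary_crossentropy",
              metrics=["accuracy"])
history_larger_model = model.fit(train_data, train_labels,
                                 epochs=20, batch_size=512, validation_split=0.4)
```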
Adding weight regularization
Adding L2 weight regularization to the model
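A sketch of the L2-regularized IMDB model, assuming the same vectorized data as above: each regularized layer adds `0.002 * sum(w ** 2)` to the loss, penalizing large weights (the 0.002 coefficient is illustrative).

```python
from tensorflow.keras import regularizers

model = keras.Sequential([
    layers.Dense(16, kernel_regularizer=regularizers.l2(0.002),
                 activation="relu"),
    layers.Dense(16, kernel_regularizer=regularizers.l2(0.002),
                 activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="rmsprop",
              loss="binary_crossentropy",
              metrics=["accuracy"])
history_l2_reg = model.fit(train_data, train_labels,
                           epochs=20, batch_size=512, validation_split=0.4)
```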
Different weight regularizers available in Keras
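For reference, Keras also ships L1 and combined L1+L2 regularizers, which can be passed to `kernel_regularizer` in the same way (coefficients here are illustrative):

```python
from tensorflow.keras import regularizers

regularizers.l1(0.001)                   # L1 penalty on the weights
regularizers.l1_l2(l1=0.001, l2=0.001)   # combined L1 and L2 penalties
```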
Adding dropout
Adding dropout to the IMDB model
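A sketch of the dropout variant, assuming the same vectorized IMDB data: a `Dropout` layer after each dense layer randomly zeroes half of its inputs during training (the 0.5 rate is illustrative).

```python
model = keras.Sequential([
    layers.Dense(16, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(16, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="rmsprop",
              loss="binary_crossentropy",
              metrics=["accuracy"])
history_dropout = model.fit(train_data, train_labels,
                            epochs=20, batch_size=512, validation_split=0.4)
```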