How to further train a pre-trained model
We will demonstrate how to freeze some or all of the layers of a pre-trained model and continue training using a new set of fully-connected layers and data in a different format.
Adapted from the TensorFlow 2.0 transfer learning tutorial.
Imports & Settings
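The code sketches below assume the following minimal set of imports covering TensorFlow 2, TensorFlow Datasets, and plotting (the exact settings in the original notebook may differ):

```python
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
import tensorflow_datasets as tfds
```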
Load TensorFlow Cats vs Dog Dataset
TensorFlow includes a large number of built-in datasets:
We will use a set of cat and dog images for binary classification.
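A minimal sketch of loading the data with tensorflow_datasets; the 80/10/10 train/validation/test split follows the TensorFlow tutorial and is an assumption here:

```python
# Load cats_vs_dogs as (image, label) pairs, split into
# 80% train, 10% validation, and 10% test
(raw_train, raw_validation, raw_test), metadata = tfds.load(
    'cats_vs_dogs',
    split=['train[:80%]', 'train[80%:90%]', 'train[90%:]'],
    with_info=True,
    as_supervised=True)
```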
Show sample images
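For example, we can plot a couple of training images with their labels, using the metadata object returned above to translate integer labels into class names:

```python
# Map integer labels back to class names ('cat'/'dog')
get_label_name = metadata.features['label'].int2str

for image, label in raw_train.take(2):
    plt.figure()
    plt.imshow(image)
    plt.title(get_label_name(label))
```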
Preprocessing
All images will be resized to 160x160:
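A sketch of the preprocessing pipeline: cast to float, scale pixel values to [-1, 1], resize to 160x160, then shuffle and batch (batch size and shuffle buffer are assumptions):

```python
IMG_SIZE = 160  # resize all images to 160x160

def format_example(image, label):
    image = tf.cast(image, tf.float32)
    image = (image / 127.5) - 1  # scale pixel values to [-1, 1]
    image = tf.image.resize(image, (IMG_SIZE, IMG_SIZE))
    return image, label

BATCH_SIZE = 32        # assumption
SHUFFLE_BUFFER = 1000  # assumption

train_batches = raw_train.map(format_example).shuffle(SHUFFLE_BUFFER).batch(BATCH_SIZE)
validation_batches = raw_validation.map(format_example).batch(BATCH_SIZE)
test_batches = raw_test.map(format_example).batch(BATCH_SIZE)
```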
Load the VGG-16 Bottleneck Features
We use the VGG16 weights, pre-trained on ImageNet, and exclude the top classification layers so we can attach our own. Note that we indicate the new input size upon import and set all layers to not trainable:
Freeze model layers
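A sketch of both steps, reusing the names defined above: we import VGG16 with ImageNet weights but without its dense classifier head, declare the new 160x160x3 input shape, and freeze the convolutional base:

```python
IMG_SHAPE = (IMG_SIZE, IMG_SIZE, 3)

# Import VGG16 with ImageNet weights, dropping the dense classifier head
vgg16 = tf.keras.applications.VGG16(input_shape=IMG_SHAPE,
                                    include_top=False,
                                    weights='imagenet')

# Freeze all convolutional layers so only the new head trains
vgg16.trainable = False
```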
Add new layers to model
Using the Sequential model API
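With the Sequential API, we can simply stack a pooling layer and a dense output on top of the frozen base; the single-unit output produces a logit for binary classification (a sketch; the head architecture is an assumption):

```python
from tensorflow.keras import Sequential
from tensorflow.keras.layers import GlobalAveragePooling2D, Dense

model = Sequential([
    vgg16,                     # frozen convolutional base
    GlobalAveragePooling2D(),  # pool the 5x5x512 feature maps into a 512-vector
    Dense(1)                   # single logit for cat vs. dog
])
```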
Using the Functional model API
Alternatively, we can use Keras’ functional API: we feed the vgg16 output into a new set of fully-connected layers, define a new model in terms of its inputs and outputs, and proceed from there as before:
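A minimal sketch under the same assumptions as the Sequential version:

```python
from tensorflow.keras import Model
from tensorflow.keras.layers import GlobalAveragePooling2D, Dense

# Attach the new head to the frozen base's output tensor
x = GlobalAveragePooling2D()(vgg16.output)
output = Dense(1)(x)

# Define the new model in terms of inputs and outputs
model = Model(inputs=vgg16.input, outputs=output)
```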
Compute baseline metrics
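Compiling the model and evaluating it on the validation set before any training should yield roughly 50% accuracy, since the new head's weights are still random (optimizer and learning rate are assumptions):

```python
base_learning_rate = 1e-4  # assumption

model.compile(optimizer=tf.keras.optimizers.RMSprop(learning_rate=base_learning_rate),
              loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
              metrics=['accuracy'])

# Baseline performance of the untrained head
loss0, accuracy0 = model.evaluate(validation_batches)
```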
Train VGG16 transfer model
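A sketch of the initial training run; the number of epochs is an assumption:

```python
initial_epochs = 10  # assumption

history = model.fit(train_batches,
                    epochs=initial_epochs,
                    validation_data=validation_batches)
```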
Plot Learning Curves
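For example, we can plot training and validation accuracy and loss from the history object:

```python
fig, axes = plt.subplots(ncols=2, figsize=(10, 4))

axes[0].plot(history.history['accuracy'], label='Training')
axes[0].plot(history.history['val_accuracy'], label='Validation')
axes[0].set_title('Accuracy')

axes[1].plot(history.history['loss'], label='Training')
axes[1].plot(history.history['val_loss'], label='Validation')
axes[1].set_title('Loss')

for ax in axes:
    ax.set_xlabel('Epoch')
    ax.legend()
```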
Fine-tune VGG16 weights
Unfreeze selected layers
Let's check how many layers are in the base model:
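A sketch that prints the layer count and then unfreezes only the top convolutional block; the cutoff index fine_tune_at is an assumption:

```python
print(len(vgg16.layers))  # 19 layers when include_top=False

# Unfreeze the base model, then re-freeze everything below the cutoff
vgg16.trainable = True
fine_tune_at = 15  # assumption: train only the last convolutional block
for layer in vgg16.layers[:fine_tune_at]:
    layer.trainable = False
```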
Define callbacks
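For example, early stopping and checkpointing callbacks might look as follows (the patience value and the filename vgg16_transfer.weights.h5 are hypothetical):

```python
# Stop when validation loss stalls and keep the best weights seen
early_stopping = tf.keras.callbacks.EarlyStopping(
    monitor='val_loss',
    patience=5,  # assumption
    restore_best_weights=True)

# Persist the best weights to disk during training
checkpoint = tf.keras.callbacks.ModelCheckpoint(
    'vgg16_transfer.weights.h5',  # hypothetical filename
    save_best_only=True,
    save_weights_only=True)
```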
Continue Training
And now we proceed to train the model:
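Since we changed the trainable attribute, the model must be recompiled; a lower learning rate helps avoid destroying the pre-trained weights. A sketch continuing from the earlier history object (epoch counts are assumptions):

```python
# Recompile after changing `trainable`, with a 10x lower learning rate
model.compile(optimizer=tf.keras.optimizers.RMSprop(learning_rate=base_learning_rate / 10),
              loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
              metrics=['accuracy'])

fine_tune_epochs = 10  # assumption
history_fine = model.fit(train_batches,
                         epochs=initial_epochs + fine_tune_epochs,
                         initial_epoch=history.epoch[-1],  # resume the epoch count
                         validation_data=validation_batches,
                         callbacks=[early_stopping, checkpoint])
```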