Path: blob/master/18_transfer_learning/cnn_transfer_learning.ipynb
Kernel: Python 3
Transfer learning in image classification
In this notebook we will use transfer learning: we take a pre-trained model from Google's TensorFlow Hub and re-train it on a flowers dataset. Using a pre-trained model saves a lot of time and computational budget for the new classification problem at hand.
In [2]:
In [3]:
Make predictions using a ready-made model (without any training)
In [4]:
In [5]:
Out[5]:
In [6]:
Out[6]:
(224, 224, 3)
In [7]:
Out[7]:
array([[[[0.28235294, 0.33333333, 0.07058824],
[0.31372549, 0.37254902, 0.09019608],
[0.34901961, 0.41960784, 0.11764706],
...,
[0.32941176, 0.39215686, 0.00392157],
[0.32156863, 0.38431373, 0.00392157],
[0.30980392, 0.36862745, 0. ]],
[[0.28627451, 0.33333333, 0.08235294],
[0.3254902 , 0.38039216, 0.10980392],
[0.35294118, 0.42352941, 0.12941176],
...,
[0.32156863, 0.38039216, 0.00392157],
[0.31372549, 0.37254902, 0.00392157],
[0.30196078, 0.36078431, 0. ]],
[[0.28627451, 0.33333333, 0.08627451],
[0.31372549, 0.36862745, 0.10196078],
[0.34509804, 0.41568627, 0.12941176],
...,
[0.31764706, 0.37647059, 0.00392157],
[0.30980392, 0.36862745, 0.00784314],
[0.29803922, 0.35686275, 0.00392157]],
...,
[[0.05490196, 0.10980392, 0.01568627],
[0.05098039, 0.11372549, 0.01960784],
[0.05098039, 0.12156863, 0.02352941],
...,
[0.15686275, 0.21960784, 0.03921569],
[0.15686275, 0.22352941, 0.03529412],
[0.16078431, 0.22352941, 0.03137255]],
[[0.0627451 , 0.1254902 , 0.01568627],
[0.05882353, 0.13333333, 0.01960784],
[0.05490196, 0.1372549 , 0.01960784],
...,
[0.1372549 , 0.20392157, 0.04705882],
[0.14117647, 0.20784314, 0.04313725],
[0.14117647, 0.20784314, 0.03529412]],
[[0.06666667, 0.14509804, 0.01176471],
[0.07058824, 0.15294118, 0.01960784],
[0.05490196, 0.14901961, 0.01176471],
...,
[0.11372549, 0.18039216, 0.04313725],
[0.11764706, 0.18431373, 0.03921569],
[0.11764706, 0.18823529, 0.03529412]]]])
In [8]:
Out[8]:
(1, 1001)
In [9]:
Out[9]:
2
In [10]:
Out[10]:
['background', 'tench', 'goldfish', 'great white shark', 'tiger shark']
In [11]:
Out[11]:
'goldfish'
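The code cells above are not included in this export; the following is a minimal sketch consistent with the outputs, assuming the MobileNet-v2 ImageNet classifier from TF Hub and the standard ImageNet label file (the exact model URL and the image file name `goldfish.jpg` are assumptions). The model outputs 1001 scores per image (index 0 is `background`), and `np.argmax` picks the winning class, here index 2, `goldfish`.

```python
# Sketch of the elided cells; model URL and image file name are assumptions.
import numpy as np

IMAGE_SHAPE = (224, 224)

def decode_prediction(logits, labels):
    """Return the label with the highest score for a batch of one image."""
    return labels[np.argmax(logits[0])]

def main():
    import PIL.Image
    import tensorflow as tf
    import tensorflow_hub as hub

    classifier = tf.keras.Sequential([
        hub.KerasLayer(
            "https://tfhub.dev/google/tf2-preview/mobilenet_v2/classification/4",
            input_shape=IMAGE_SHAPE + (3,))
    ])

    # Scale pixels to [0, 1] and add a batch dimension.
    img = PIL.Image.open("goldfish.jpg").resize(IMAGE_SHAPE)
    img = np.array(img) / 255.0
    result = classifier.predict(img[np.newaxis, ...])   # shape (1, 1001)

    labels_path = tf.keras.utils.get_file(
        "ImageNetLabels.txt",
        "https://storage.googleapis.com/download.tensorflow.org/data/ImageNetLabels.txt")
    imagenet_labels = open(labels_path).read().splitlines()  # 1001 entries
    print(decode_prediction(result, imagenet_labels))

if __name__ == "__main__":
    main()
```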
Load flowers dataset
In [79]:
In [80]:
Out[80]:
'.\\datasets\\flower_photos'
In [81]:
Out[81]:
WindowsPath('datasets/flower_photos')
In [82]:
Out[82]:
[WindowsPath('datasets/flower_photos/daisy/100080576_f52e8ee070_n.jpg'),
WindowsPath('datasets/flower_photos/daisy/10140303196_b88d3d6cec.jpg'),
WindowsPath('datasets/flower_photos/daisy/10172379554_b296050f82_n.jpg'),
WindowsPath('datasets/flower_photos/daisy/10172567486_2748826a8b.jpg'),
WindowsPath('datasets/flower_photos/daisy/10172636503_21bededa75_n.jpg')]
In [83]:
Out[83]:
3670
In [84]:
Out[84]:
[WindowsPath('datasets/flower_photos/roses/10090824183_d02c613f10_m.jpg'),
WindowsPath('datasets/flower_photos/roses/102501987_3cdb8e5394_n.jpg'),
WindowsPath('datasets/flower_photos/roses/10503217854_e66a804309.jpg'),
WindowsPath('datasets/flower_photos/roses/10894627425_ec76bbc757_n.jpg'),
WindowsPath('datasets/flower_photos/roses/110472418_87b6a3aa98_m.jpg')]
In [85]:
Out[85]:
In [86]:
Out[86]:
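The loading cells are elided in this export; a sketch consistent with the outputs, assuming the standard `flower_photos` archive hosted by TensorFlow (3670 images across 5 classes). `tf.keras.utils.get_file` downloads and untars the archive, and `pathlib` globs enumerate the per-class image files.

```python
# Sketch of the dataset-loading cells; the archive URL is the standard
# TensorFlow example dataset assumed from the directory listing above.
import pathlib

def count_images(data_dir):
    """Count all .jpg files one directory level below data_dir."""
    return len(list(pathlib.Path(data_dir).glob("*/*.jpg")))

def main():
    import tensorflow as tf
    dataset_url = ("https://storage.googleapis.com/download.tensorflow.org/"
                   "example_images/flower_photos.tgz")
    data_dir = tf.keras.utils.get_file(
        "flower_photos", origin=dataset_url, cache_dir=".", untar=True)
    data_dir = pathlib.Path(data_dir)

    print(count_images(data_dir))              # 3670 in the stock archive
    print(list(data_dir.glob("roses/*"))[:5])  # first few rose images

if __name__ == "__main__":
    main()
```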
Read flower images from disk into a numpy array using OpenCV
In [87]:
In [88]:
In [89]:
Out[89]:
[WindowsPath('datasets/flower_photos/roses/10090824183_d02c613f10_m.jpg'),
WindowsPath('datasets/flower_photos/roses/102501987_3cdb8e5394_n.jpg'),
WindowsPath('datasets/flower_photos/roses/10503217854_e66a804309.jpg'),
WindowsPath('datasets/flower_photos/roses/10894627425_ec76bbc757_n.jpg'),
WindowsPath('datasets/flower_photos/roses/110472418_87b6a3aa98_m.jpg')]
In [90]:
Out[90]:
'datasets\\flower_photos\\roses\\10090824183_d02c613f10_m.jpg'
In [91]:
In [92]:
Out[92]:
(240, 179, 3)
In [93]:
Out[93]:
(224, 224, 3)
In [94]:
In [95]:
Train test split
In [96]:
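The split cell presumably uses scikit-learn's `train_test_split`; the `random_state=0` argument is an assumption (the elided cell's exact arguments are unknown).

```python
# Sketch of the train/test split cell (default 75/25 split).
import numpy as np
from sklearn.model_selection import train_test_split

def split_dataset(X, y, random_state=0):
    """Shuffle reproducibly and hold out 25% of the images for testing."""
    return train_test_split(X, y, random_state=random_state)
```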
Preprocessing: scale images
In [97]:
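The preprocessing cell is elided; the idea is that `cv2.imread` yields `uint8` pixels in `[0, 255]`, while the TF Hub MobileNet models expect floats in `[0, 1]` (the array names `X_train_scaled` / `X_test_scaled` in the notebook are assumed).

```python
# Sketch of the scaling cell: divide pixel values by 255.
import numpy as np

def scale_images(images):
    """Convert uint8 pixel values [0, 255] to float32 in [0, 1]."""
    return np.asarray(images, dtype=np.float32) / 255.0

# e.g. X_train_scaled = scale_images(X_train)  (names assumed)
```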
Make predictions using the pre-trained model on the new flowers dataset
In [41]:
Out[41]:
(180, 180, 3)
In [42]:
Out[42]:
(224, 224, 3)
In [60]:
In [61]:
Out[61]:
<matplotlib.image.AxesImage at 0x1e7aec49cd0>
In [63]:
Out[63]:
<matplotlib.image.AxesImage at 0x1eec795f610>
In [64]:
Out[64]:
<matplotlib.image.AxesImage at 0x1eec7e39e20>
In [72]:
Out[72]:
array([795, 795, 795], dtype=int64)
In [73]:
Out[73]:
'shower curtain'
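Applied directly to the flower images, the ImageNet classifier has no matching flower class and predicts id 795 (`shower curtain`) for all three roses, which is why retraining is needed. A sketch of the prediction cells (the helper and variable names are assumptions; `classifier` is the TF Hub model loaded earlier):

```python
# Sketch of the misprediction cells: batch-predict and decode class ids.
import numpy as np

def predict_labels(classifier, images, imagenet_labels):
    """Map the classifier's argmax class ids to ImageNet label names."""
    ids = np.argmax(classifier.predict(images), axis=1)  # e.g. [795, 795, 795]
    return [imagenet_labels[i] for i in ids]

# Usage sketch (names assumed): resize three flower images to (224, 224),
# scale to [0, 1], then:
#   predict_labels(classifier, np.stack([img1, img2, img3]), imagenet_labels)
```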
Now take the pre-trained model and retrain it on the flower images
In [75]:
In [98]:
Out[98]:
Model: "sequential_4"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
keras_layer_3 (KerasLayer) (None, 1280) 2257984
_________________________________________________________________
dense_1 (Dense) (None, 5) 6405
=================================================================
Total params: 2,264,389
Trainable params: 6,405
Non-trainable params: 2,257,984
_________________________________________________________________
In [99]:
Out[99]:
Epoch 1/5
1/86 [..............................] - ETA: 0s - loss: 1.9482 - acc: 0.2188WARNING:tensorflow:Callbacks method `on_train_batch_end` is slow compared to the batch time (batch time: 0.0080s vs `on_train_batch_end` time: 0.0130s). Check your callbacks.
86/86 [==============================] - 2s 19ms/step - loss: 0.7985 - acc: 0.7028
Epoch 2/5
86/86 [==============================] - 2s 19ms/step - loss: 0.4163 - acc: 0.8517
Epoch 3/5
86/86 [==============================] - 2s 19ms/step - loss: 0.3264 - acc: 0.8895
Epoch 4/5
86/86 [==============================] - 2s 19ms/step - loss: 0.2682 - acc: 0.9106
Epoch 5/5
86/86 [==============================] - 2s 19ms/step - loss: 0.2305 - acc: 0.9266
<tensorflow.python.keras.callbacks.History at 0x1e7fc8112b0>
In [100]:
Out[100]:
29/29 [==============================] - 1s 23ms/step - loss: 0.3703 - acc: 0.8682
[0.37029528617858887, 0.8681917190551758]
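The model-building and training cells are elided; the following sketch is consistent with the summary and logs above, assuming the MobileNet-v2 *feature-vector* model from TF Hub (the exact URL is an assumption). The frozen feature extractor contributes the 2,257,984 non-trainable parameters; only the final `Dense(5)` head, with 1280 × 5 + 5 = 6,405 parameters, is trained for 5 epochs.

```python
# Sketch of the retraining cells; model URL and data array names are assumptions.
def dense_param_count(n_in, n_out):
    """Trainable params of the final Dense layer: weights + biases."""
    return n_in * n_out + n_out  # 1280 * 5 + 5 = 6,405, as in the summary

def build_model(num_classes=5):
    import tensorflow as tf
    import tensorflow_hub as hub
    feature_extractor = hub.KerasLayer(
        "https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4",
        input_shape=(224, 224, 3),
        trainable=False)  # freeze the 2,257,984 pre-trained parameters
    model = tf.keras.Sequential([
        feature_extractor,
        tf.keras.layers.Dense(num_classes)  # only these 6,405 params train
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["acc"])
    return model

if __name__ == "__main__":
    model = build_model()
    model.summary()
    # Assumed array names from the earlier preprocessing cells:
    # model.fit(X_train_scaled, y_train, epochs=5)
    # model.evaluate(X_test_scaled, y_test)
```

Freezing the backbone is why training reaches ~93% train accuracy in a few seconds per epoch: only the small classification head is updated, while the pre-trained convolutional features are reused as-is.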