GitHub Repository: packtpublishing/machine-learning-for-algorithmic-trading-second-edition
Path: blob/master/18_convolutional_neural_nets/09_bottleneck_features.ipynb
Kernel: Python 3

How to extract bottleneck features

Modern CNNs can take weeks to train on ImageNet even with multiple GPUs, but fortunately many researchers share their final weights. Keras, for example, contains pre-trained models for several of the reference architectures discussed above, namely VGG16 and VGG19, ResNet50, InceptionV3, InceptionResNetV2, MobileNet, MobileNetV2, DenseNet, and NASNet.

This notebook illustrates how to download a pre-trained VGG16 model, either with the final layers to generate predictions, or without the final layers, as illustrated in the figure below, to extract the bottleneck features.

Imports

import numpy as np
from pathlib import Path

import tensorflow as tf
from tensorflow.keras.applications.vgg19 import VGG19, preprocess_input
from tensorflow.keras.applications.vgg16 import VGG16
from tensorflow.keras.applications.inception_v3 import InceptionV3
from tensorflow.keras.applications.resnet50 import ResNet50
from tensorflow.keras.preprocessing import image
import tensorflow.keras.backend as K
gpu_devices = tf.config.experimental.list_physical_devices('GPU')
if gpu_devices:
    print('Using GPU')
    tf.config.experimental.set_memory_growth(gpu_devices[0], True)
else:
    print('Using CPU')
Using CPU

Load and Preprocess Sample Images

Before an image can be supplied to a pre-trained network in Keras, a few preprocessing steps are required.

We have imported a very small dataset of 7 images and stored the preprocessed image input as img_input. Note that the dimensionality of this array is (7, 224, 224, 3): each of the 7 images is a 3D tensor with shape (224, 224, 3).

img_paths = Path('images/img_input').glob('*.jpg')
def path_to_tensor(img_path):
    # loads RGB image as PIL.Image.Image type
    img = image.load_img(img_path, target_size=(224, 224))
    # convert PIL.Image.Image type to 3D tensor with shape (224, 224, 3)
    x = image.img_to_array(img)
    # convert 3D tensor to 4D tensor with shape (1, 224, 224, 3) and return 4D tensor
    return np.expand_dims(x, axis=0)
def paths_to_tensor(img_paths):
    list_of_tensors = [path_to_tensor(img_path) for img_path in img_paths]
    return np.vstack(list_of_tensors)
# calculate the image input
img_input = preprocess_input(paths_to_tensor(img_paths))
img_input.shape
(7, 224, 224, 3)

Import Pre-Trained VGG-16

Import the VGG-16 network (including the final classification layer) that has been pre-trained on ImageNet.

Keras makes it very straightforward to download and use pre-trained models:

vgg16 = VGG16()
vgg16.summary()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/vgg16/vgg16_weights_tf_dim_ordering_tf_kernels.h5 553467904/553467096 [==============================] - 5s 0us/step Model: "vgg16" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= input_1 (InputLayer) [(None, 224, 224, 3)] 0 _________________________________________________________________ block1_conv1 (Conv2D) (None, 224, 224, 64) 1792 _________________________________________________________________ block1_conv2 (Conv2D) (None, 224, 224, 64) 36928 _________________________________________________________________ block1_pool (MaxPooling2D) (None, 112, 112, 64) 0 _________________________________________________________________ block2_conv1 (Conv2D) (None, 112, 112, 128) 73856 _________________________________________________________________ block2_conv2 (Conv2D) (None, 112, 112, 128) 147584 _________________________________________________________________ block2_pool (MaxPooling2D) (None, 56, 56, 128) 0 _________________________________________________________________ block3_conv1 (Conv2D) (None, 56, 56, 256) 295168 _________________________________________________________________ block3_conv2 (Conv2D) (None, 56, 56, 256) 590080 _________________________________________________________________ block3_conv3 (Conv2D) (None, 56, 56, 256) 590080 _________________________________________________________________ block3_pool (MaxPooling2D) (None, 28, 28, 256) 0 _________________________________________________________________ block4_conv1 (Conv2D) (None, 28, 28, 512) 1180160 _________________________________________________________________ block4_conv2 (Conv2D) (None, 28, 28, 512) 2359808 _________________________________________________________________ block4_conv3 (Conv2D) (None, 28, 28, 512) 2359808 _________________________________________________________________ block4_pool (MaxPooling2D) (None, 14, 14, 512) 0 _________________________________________________________________ block5_conv1 (Conv2D) (None, 14, 14, 512) 2359808 _________________________________________________________________ block5_conv2 (Conv2D) (None, 14, 14, 512) 2359808 _________________________________________________________________ block5_conv3 (Conv2D) (None, 14, 14, 512) 2359808 _________________________________________________________________ block5_pool (MaxPooling2D) (None, 7, 7, 512) 0 _________________________________________________________________ flatten (Flatten) (None, 25088) 0 _________________________________________________________________ fc1 (Dense) (None, 4096) 102764544 _________________________________________________________________ fc2 (Dense) (None, 4096) 16781312 _________________________________________________________________ predictions (Dense) (None, 1000) 4097000 ================================================================= Total params: 138,357,544 Trainable params: 138,357,544 Non-trainable params: 0 _________________________________________________________________

For this network, vgg16.predict returns a 1000-dimensional vector containing the predicted probability that an image belongs to each of the 1000 ImageNet categories. The dimensionality of the output obtained from passing img_input through the model is (7, 1000); the first value of 7 merely denotes that 7 images were passed through the network.

y_pred = vgg16.predict(img_input)
y_pred.shape
(7, 1000)
np.argmax(y_pred, axis=1)
array([206, 208, 221, 205, 218, 215, 209])
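
Keras also provides a decode_predictions helper that maps these class indices to human-readable ImageNet labels. The following is a minimal sketch; printing the top three labels per image is an illustrative choice, not part of the original notebook.

from tensorflow.keras.applications.vgg16 import decode_predictions

# decode_predictions takes the full (n_samples, 1000) probability array and returns,
# for each image, a list of (class_id, class_name, probability) tuples
for i, preds in enumerate(decode_predictions(y_pred, top=3)):
    print(f'image {i}:')
    for class_id, class_name, prob in preds:
        print(f'  {class_name:<25} {prob:.2%}')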

Import the VGG-16 Model, with the Final Fully-Connected Layers Removed

When performing transfer learning, we need to remove the final layers of the network, as they are too specific to the ImageNet database. This is accomplished in the code cell below.

VGG-16 model for transfer learning

You can use this model like any other Keras model to generate predictions. To exclude the fully-connected layers, just pass the keyword argument include_top=False; when an image is fed to the CNN, the model then returns the output of the final max pooling layer instead of class predictions.

vgg16 = VGG16(include_top=False)
vgg16.summary()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/vgg16/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5 58892288/58889256 [==============================] - 1s 0us/step Model: "vgg16" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= input_2 (InputLayer) [(None, None, None, 3)] 0 _________________________________________________________________ block1_conv1 (Conv2D) (None, None, None, 64) 1792 _________________________________________________________________ block1_conv2 (Conv2D) (None, None, None, 64) 36928 _________________________________________________________________ block1_pool (MaxPooling2D) (None, None, None, 64) 0 _________________________________________________________________ block2_conv1 (Conv2D) (None, None, None, 128) 73856 _________________________________________________________________ block2_conv2 (Conv2D) (None, None, None, 128) 147584 _________________________________________________________________ block2_pool (MaxPooling2D) (None, None, None, 128) 0 _________________________________________________________________ block3_conv1 (Conv2D) (None, None, None, 256) 295168 _________________________________________________________________ block3_conv2 (Conv2D) (None, None, None, 256) 590080 _________________________________________________________________ block3_conv3 (Conv2D) (None, None, None, 256) 590080 _________________________________________________________________ block3_pool (MaxPooling2D) (None, None, None, 256) 0 _________________________________________________________________ block4_conv1 (Conv2D) (None, None, None, 512) 1180160 _________________________________________________________________ block4_conv2 (Conv2D) (None, None, None, 512) 2359808 _________________________________________________________________ block4_conv3 (Conv2D) (None, None, None, 512) 2359808 _________________________________________________________________ block4_pool (MaxPooling2D) (None, None, None, 512) 0 _________________________________________________________________ block5_conv1 (Conv2D) (None, None, None, 512) 2359808 _________________________________________________________________ block5_conv2 (Conv2D) (None, None, None, 512) 2359808 _________________________________________________________________ block5_conv3 (Conv2D) (None, None, None, 512) 2359808 _________________________________________________________________ block5_pool (MaxPooling2D) (None, None, None, 512) 0 ================================================================= Total params: 14,714,688 Trainable params: 14,714,688 Non-trainable params: 0 _________________________________________________________________

By omitting the fully-connected layers, we are no longer forced to use a fixed input size for the model (224x224, the original ImageNet format). By only keeping the convolutional modules, our model can be adapted to arbitrary input sizes.
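
As a quick check of this flexibility, the truncated model can be applied to inputs whose spatial dimensions differ from 224x224. The batch of random 240x320 "images" below is purely illustrative; only the resulting feature-map shape matters.

# hypothetical batch of 2 random images with a non-standard 240x320 resolution
dummy_batch = np.random.uniform(0, 255, size=(2, 240, 320, 3)).astype('float32')
dummy_features = vgg16.predict(preprocess_input(dummy_batch))
# the five 2x2 max-pooling layers shrink each spatial dimension by a factor of 32
# (with flooring), so the expected output shape is (2, 7, 10, 512)
dummy_features.shape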

Extract Output of Final Max Pooling Layer

Now, the network stored in vgg16 is a truncated version of the VGG-16 network, where the final three fully-connected layers have been removed. In this case, vgg16.predict returns a 3D array with dimensions 7×7×512 for each image, corresponding to the output of the final max pooling layer of VGG-16. The dimensionality of the output obtained from passing img_input through the model is (7, 7, 7, 512); the first value of 7 merely denotes that 7 images were passed through the network.

vgg16.predict(img_input).shape
(7, 7, 7, 512)

This is exactly how we calculate bottleneck features for a transfer learning project!
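
In a transfer-learning workflow, these bottleneck features typically serve as the input to a small, trainable classification head. The sketch below is only an illustration under assumed placeholders: the two-class labels, the network head, and the training settings are hypothetical and not part of the original notebook.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import GlobalAveragePooling2D, Dense

# bottleneck features for the 7 sample images: shape (7, 7, 7, 512)
bottleneck_features = vgg16.predict(img_input)

# hypothetical integer labels for a toy two-class problem
toy_labels = np.array([0, 1, 0, 1, 0, 1, 0])

# small classification head trained on top of the frozen VGG16 features
head = Sequential([
    GlobalAveragePooling2D(input_shape=bottleneck_features.shape[1:]),
    Dense(2, activation='softmax')
])
head.compile(optimizer='adam',
             loss='sparse_categorical_crossentropy',
             metrics=['accuracy'])
head.fit(bottleneck_features, toy_labels, epochs=5, verbose=0)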

Import ResNet50

With final layer

resnet = ResNet50()
resnet.summary()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/resnet/resnet50_weights_tf_dim_ordering_tf_kernels.h5 102973440/102967424 [==============================] - 1s 0us/step Model: "resnet50" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_3 (InputLayer) [(None, 224, 224, 3) 0 __________________________________________________________________________________________________ conv1_pad (ZeroPadding2D) (None, 230, 230, 3) 0 input_3[0][0] __________________________________________________________________________________________________ conv1_conv (Conv2D) (None, 112, 112, 64) 9472 conv1_pad[0][0] __________________________________________________________________________________________________ conv1_bn (BatchNormalization) (None, 112, 112, 64) 256 conv1_conv[0][0] __________________________________________________________________________________________________ conv1_relu (Activation) (None, 112, 112, 64) 0 conv1_bn[0][0] __________________________________________________________________________________________________ pool1_pad (ZeroPadding2D) (None, 114, 114, 64) 0 conv1_relu[0][0] __________________________________________________________________________________________________ pool1_pool (MaxPooling2D) (None, 56, 56, 64) 0 pool1_pad[0][0] __________________________________________________________________________________________________ conv2_block1_1_conv (Conv2D) (None, 56, 56, 64) 4160 pool1_pool[0][0] __________________________________________________________________________________________________ conv2_block1_1_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block1_1_conv[0][0] __________________________________________________________________________________________________ conv2_block1_1_relu (Activation (None, 56, 56, 64) 0 conv2_block1_1_bn[0][0] __________________________________________________________________________________________________ conv2_block1_2_conv (Conv2D) (None, 56, 56, 64) 36928 conv2_block1_1_relu[0][0] __________________________________________________________________________________________________ conv2_block1_2_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block1_2_conv[0][0] __________________________________________________________________________________________________ conv2_block1_2_relu (Activation (None, 56, 56, 64) 0 conv2_block1_2_bn[0][0] __________________________________________________________________________________________________ conv2_block1_0_conv (Conv2D) (None, 56, 56, 256) 16640 pool1_pool[0][0] __________________________________________________________________________________________________ conv2_block1_3_conv (Conv2D) (None, 56, 56, 256) 16640 conv2_block1_2_relu[0][0] __________________________________________________________________________________________________ conv2_block1_0_bn (BatchNormali (None, 56, 56, 256) 1024 conv2_block1_0_conv[0][0] __________________________________________________________________________________________________ conv2_block1_3_bn (BatchNormali (None, 56, 56, 256) 1024 conv2_block1_3_conv[0][0] __________________________________________________________________________________________________ conv2_block1_add (Add) (None, 56, 56, 256) 0 conv2_block1_0_bn[0][0] conv2_block1_3_bn[0][0] 
__________________________________________________________________________________________________ conv2_block1_out (Activation) (None, 56, 56, 256) 0 conv2_block1_add[0][0] __________________________________________________________________________________________________ conv2_block2_1_conv (Conv2D) (None, 56, 56, 64) 16448 conv2_block1_out[0][0] __________________________________________________________________________________________________ conv2_block2_1_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block2_1_conv[0][0] __________________________________________________________________________________________________ conv2_block2_1_relu (Activation (None, 56, 56, 64) 0 conv2_block2_1_bn[0][0] __________________________________________________________________________________________________ conv2_block2_2_conv (Conv2D) (None, 56, 56, 64) 36928 conv2_block2_1_relu[0][0] __________________________________________________________________________________________________ conv2_block2_2_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block2_2_conv[0][0] __________________________________________________________________________________________________ conv2_block2_2_relu (Activation (None, 56, 56, 64) 0 conv2_block2_2_bn[0][0] __________________________________________________________________________________________________ conv2_block2_3_conv (Conv2D) (None, 56, 56, 256) 16640 conv2_block2_2_relu[0][0] __________________________________________________________________________________________________ conv2_block2_3_bn (BatchNormali (None, 56, 56, 256) 1024 conv2_block2_3_conv[0][0] __________________________________________________________________________________________________ conv2_block2_add (Add) (None, 56, 56, 256) 0 conv2_block1_out[0][0] conv2_block2_3_bn[0][0] __________________________________________________________________________________________________ conv2_block2_out (Activation) (None, 56, 56, 256) 0 conv2_block2_add[0][0] __________________________________________________________________________________________________ conv2_block3_1_conv (Conv2D) (None, 56, 56, 64) 16448 conv2_block2_out[0][0] __________________________________________________________________________________________________ conv2_block3_1_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block3_1_conv[0][0] __________________________________________________________________________________________________ conv2_block3_1_relu (Activation (None, 56, 56, 64) 0 conv2_block3_1_bn[0][0] __________________________________________________________________________________________________ conv2_block3_2_conv (Conv2D) (None, 56, 56, 64) 36928 conv2_block3_1_relu[0][0] __________________________________________________________________________________________________ conv2_block3_2_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block3_2_conv[0][0] __________________________________________________________________________________________________ conv2_block3_2_relu (Activation (None, 56, 56, 64) 0 conv2_block3_2_bn[0][0] __________________________________________________________________________________________________ conv2_block3_3_conv (Conv2D) (None, 56, 56, 256) 16640 conv2_block3_2_relu[0][0] __________________________________________________________________________________________________ conv2_block3_3_bn (BatchNormali (None, 56, 56, 256) 1024 conv2_block3_3_conv[0][0] __________________________________________________________________________________________________ conv2_block3_add (Add) (None, 56, 56, 256) 0 
conv2_block2_out[0][0] conv2_block3_3_bn[0][0] __________________________________________________________________________________________________ conv2_block3_out (Activation) (None, 56, 56, 256) 0 conv2_block3_add[0][0] __________________________________________________________________________________________________ conv3_block1_1_conv (Conv2D) (None, 28, 28, 128) 32896 conv2_block3_out[0][0] __________________________________________________________________________________________________ conv3_block1_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block1_1_conv[0][0] __________________________________________________________________________________________________ conv3_block1_1_relu (Activation (None, 28, 28, 128) 0 conv3_block1_1_bn[0][0] __________________________________________________________________________________________________ conv3_block1_2_conv (Conv2D) (None, 28, 28, 128) 147584 conv3_block1_1_relu[0][0] __________________________________________________________________________________________________ conv3_block1_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block1_2_conv[0][0] __________________________________________________________________________________________________ conv3_block1_2_relu (Activation (None, 28, 28, 128) 0 conv3_block1_2_bn[0][0] __________________________________________________________________________________________________ conv3_block1_0_conv (Conv2D) (None, 28, 28, 512) 131584 conv2_block3_out[0][0] __________________________________________________________________________________________________ conv3_block1_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block1_2_relu[0][0] __________________________________________________________________________________________________ conv3_block1_0_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block1_0_conv[0][0] __________________________________________________________________________________________________ conv3_block1_3_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block1_3_conv[0][0] __________________________________________________________________________________________________ conv3_block1_add (Add) (None, 28, 28, 512) 0 conv3_block1_0_bn[0][0] conv3_block1_3_bn[0][0] __________________________________________________________________________________________________ conv3_block1_out (Activation) (None, 28, 28, 512) 0 conv3_block1_add[0][0] __________________________________________________________________________________________________ conv3_block2_1_conv (Conv2D) (None, 28, 28, 128) 65664 conv3_block1_out[0][0] __________________________________________________________________________________________________ conv3_block2_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block2_1_conv[0][0] __________________________________________________________________________________________________ conv3_block2_1_relu (Activation (None, 28, 28, 128) 0 conv3_block2_1_bn[0][0] __________________________________________________________________________________________________ conv3_block2_2_conv (Conv2D) (None, 28, 28, 128) 147584 conv3_block2_1_relu[0][0] __________________________________________________________________________________________________ conv3_block2_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block2_2_conv[0][0] __________________________________________________________________________________________________ conv3_block2_2_relu (Activation (None, 28, 28, 128) 0 conv3_block2_2_bn[0][0] 
__________________________________________________________________________________________________ conv3_block2_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block2_2_relu[0][0] __________________________________________________________________________________________________ conv3_block2_3_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block2_3_conv[0][0] __________________________________________________________________________________________________ conv3_block2_add (Add) (None, 28, 28, 512) 0 conv3_block1_out[0][0] conv3_block2_3_bn[0][0] __________________________________________________________________________________________________ conv3_block2_out (Activation) (None, 28, 28, 512) 0 conv3_block2_add[0][0] __________________________________________________________________________________________________ conv3_block3_1_conv (Conv2D) (None, 28, 28, 128) 65664 conv3_block2_out[0][0] __________________________________________________________________________________________________ conv3_block3_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block3_1_conv[0][0] __________________________________________________________________________________________________ conv3_block3_1_relu (Activation (None, 28, 28, 128) 0 conv3_block3_1_bn[0][0] __________________________________________________________________________________________________ conv3_block3_2_conv (Conv2D) (None, 28, 28, 128) 147584 conv3_block3_1_relu[0][0] __________________________________________________________________________________________________ conv3_block3_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block3_2_conv[0][0] __________________________________________________________________________________________________ conv3_block3_2_relu (Activation (None, 28, 28, 128) 0 conv3_block3_2_bn[0][0] __________________________________________________________________________________________________ conv3_block3_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block3_2_relu[0][0] __________________________________________________________________________________________________ conv3_block3_3_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block3_3_conv[0][0] __________________________________________________________________________________________________ conv3_block3_add (Add) (None, 28, 28, 512) 0 conv3_block2_out[0][0] conv3_block3_3_bn[0][0] __________________________________________________________________________________________________ conv3_block3_out (Activation) (None, 28, 28, 512) 0 conv3_block3_add[0][0] __________________________________________________________________________________________________ conv3_block4_1_conv (Conv2D) (None, 28, 28, 128) 65664 conv3_block3_out[0][0] __________________________________________________________________________________________________ conv3_block4_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block4_1_conv[0][0] __________________________________________________________________________________________________ conv3_block4_1_relu (Activation (None, 28, 28, 128) 0 conv3_block4_1_bn[0][0] __________________________________________________________________________________________________ conv3_block4_2_conv (Conv2D) (None, 28, 28, 128) 147584 conv3_block4_1_relu[0][0] __________________________________________________________________________________________________ conv3_block4_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block4_2_conv[0][0] __________________________________________________________________________________________________ conv3_block4_2_relu 
(Activation (None, 28, 28, 128) 0 conv3_block4_2_bn[0][0] __________________________________________________________________________________________________ conv3_block4_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block4_2_relu[0][0] __________________________________________________________________________________________________ conv3_block4_3_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block4_3_conv[0][0] __________________________________________________________________________________________________ conv3_block4_add (Add) (None, 28, 28, 512) 0 conv3_block3_out[0][0] conv3_block4_3_bn[0][0] __________________________________________________________________________________________________ conv3_block4_out (Activation) (None, 28, 28, 512) 0 conv3_block4_add[0][0] __________________________________________________________________________________________________ conv4_block1_1_conv (Conv2D) (None, 14, 14, 256) 131328 conv3_block4_out[0][0] __________________________________________________________________________________________________ conv4_block1_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block1_1_conv[0][0] __________________________________________________________________________________________________ conv4_block1_1_relu (Activation (None, 14, 14, 256) 0 conv4_block1_1_bn[0][0] __________________________________________________________________________________________________ conv4_block1_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block1_1_relu[0][0] __________________________________________________________________________________________________ conv4_block1_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block1_2_conv[0][0] __________________________________________________________________________________________________ conv4_block1_2_relu (Activation (None, 14, 14, 256) 0 conv4_block1_2_bn[0][0] __________________________________________________________________________________________________ conv4_block1_0_conv (Conv2D) (None, 14, 14, 1024) 525312 conv3_block4_out[0][0] __________________________________________________________________________________________________ conv4_block1_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block1_2_relu[0][0] __________________________________________________________________________________________________ conv4_block1_0_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block1_0_conv[0][0] __________________________________________________________________________________________________ conv4_block1_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block1_3_conv[0][0] __________________________________________________________________________________________________ conv4_block1_add (Add) (None, 14, 14, 1024) 0 conv4_block1_0_bn[0][0] conv4_block1_3_bn[0][0] __________________________________________________________________________________________________ conv4_block1_out (Activation) (None, 14, 14, 1024) 0 conv4_block1_add[0][0] __________________________________________________________________________________________________ conv4_block2_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block1_out[0][0] __________________________________________________________________________________________________ conv4_block2_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block2_1_conv[0][0] __________________________________________________________________________________________________ conv4_block2_1_relu (Activation (None, 14, 14, 256) 0 conv4_block2_1_bn[0][0] 
__________________________________________________________________________________________________ conv4_block2_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block2_1_relu[0][0] __________________________________________________________________________________________________ conv4_block2_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block2_2_conv[0][0] __________________________________________________________________________________________________ conv4_block2_2_relu (Activation (None, 14, 14, 256) 0 conv4_block2_2_bn[0][0] __________________________________________________________________________________________________ conv4_block2_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block2_2_relu[0][0] __________________________________________________________________________________________________ conv4_block2_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block2_3_conv[0][0] __________________________________________________________________________________________________ conv4_block2_add (Add) (None, 14, 14, 1024) 0 conv4_block1_out[0][0] conv4_block2_3_bn[0][0] __________________________________________________________________________________________________ conv4_block2_out (Activation) (None, 14, 14, 1024) 0 conv4_block2_add[0][0] __________________________________________________________________________________________________ conv4_block3_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block2_out[0][0] __________________________________________________________________________________________________ conv4_block3_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block3_1_conv[0][0] __________________________________________________________________________________________________ conv4_block3_1_relu (Activation (None, 14, 14, 256) 0 conv4_block3_1_bn[0][0] __________________________________________________________________________________________________ conv4_block3_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block3_1_relu[0][0] __________________________________________________________________________________________________ conv4_block3_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block3_2_conv[0][0] __________________________________________________________________________________________________ conv4_block3_2_relu (Activation (None, 14, 14, 256) 0 conv4_block3_2_bn[0][0] __________________________________________________________________________________________________ conv4_block3_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block3_2_relu[0][0] __________________________________________________________________________________________________ conv4_block3_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block3_3_conv[0][0] __________________________________________________________________________________________________ conv4_block3_add (Add) (None, 14, 14, 1024) 0 conv4_block2_out[0][0] conv4_block3_3_bn[0][0] __________________________________________________________________________________________________ conv4_block3_out (Activation) (None, 14, 14, 1024) 0 conv4_block3_add[0][0] __________________________________________________________________________________________________ conv4_block4_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block3_out[0][0] __________________________________________________________________________________________________ conv4_block4_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block4_1_conv[0][0] __________________________________________________________________________________________________ 
conv4_block4_1_relu (Activation (None, 14, 14, 256) 0 conv4_block4_1_bn[0][0] __________________________________________________________________________________________________ conv4_block4_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block4_1_relu[0][0] __________________________________________________________________________________________________ conv4_block4_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block4_2_conv[0][0] __________________________________________________________________________________________________ conv4_block4_2_relu (Activation (None, 14, 14, 256) 0 conv4_block4_2_bn[0][0] __________________________________________________________________________________________________ conv4_block4_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block4_2_relu[0][0] __________________________________________________________________________________________________ conv4_block4_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block4_3_conv[0][0] __________________________________________________________________________________________________ conv4_block4_add (Add) (None, 14, 14, 1024) 0 conv4_block3_out[0][0] conv4_block4_3_bn[0][0] __________________________________________________________________________________________________ conv4_block4_out (Activation) (None, 14, 14, 1024) 0 conv4_block4_add[0][0] __________________________________________________________________________________________________ conv4_block5_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block4_out[0][0] __________________________________________________________________________________________________ conv4_block5_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block5_1_conv[0][0] __________________________________________________________________________________________________ conv4_block5_1_relu (Activation (None, 14, 14, 256) 0 conv4_block5_1_bn[0][0] __________________________________________________________________________________________________ conv4_block5_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block5_1_relu[0][0] __________________________________________________________________________________________________ conv4_block5_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block5_2_conv[0][0] __________________________________________________________________________________________________ conv4_block5_2_relu (Activation (None, 14, 14, 256) 0 conv4_block5_2_bn[0][0] __________________________________________________________________________________________________ conv4_block5_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block5_2_relu[0][0] __________________________________________________________________________________________________ conv4_block5_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block5_3_conv[0][0] __________________________________________________________________________________________________ conv4_block5_add (Add) (None, 14, 14, 1024) 0 conv4_block4_out[0][0] conv4_block5_3_bn[0][0] __________________________________________________________________________________________________ conv4_block5_out (Activation) (None, 14, 14, 1024) 0 conv4_block5_add[0][0] __________________________________________________________________________________________________ conv4_block6_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block5_out[0][0] __________________________________________________________________________________________________ conv4_block6_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block6_1_conv[0][0] 
__________________________________________________________________________________________________ conv4_block6_1_relu (Activation (None, 14, 14, 256) 0 conv4_block6_1_bn[0][0] __________________________________________________________________________________________________ conv4_block6_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block6_1_relu[0][0] __________________________________________________________________________________________________ conv4_block6_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block6_2_conv[0][0] __________________________________________________________________________________________________ conv4_block6_2_relu (Activation (None, 14, 14, 256) 0 conv4_block6_2_bn[0][0] __________________________________________________________________________________________________ conv4_block6_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block6_2_relu[0][0] __________________________________________________________________________________________________ conv4_block6_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block6_3_conv[0][0] __________________________________________________________________________________________________ conv4_block6_add (Add) (None, 14, 14, 1024) 0 conv4_block5_out[0][0] conv4_block6_3_bn[0][0] __________________________________________________________________________________________________ conv4_block6_out (Activation) (None, 14, 14, 1024) 0 conv4_block6_add[0][0] __________________________________________________________________________________________________ conv5_block1_1_conv (Conv2D) (None, 7, 7, 512) 524800 conv4_block6_out[0][0] __________________________________________________________________________________________________ conv5_block1_1_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block1_1_conv[0][0] __________________________________________________________________________________________________ conv5_block1_1_relu (Activation (None, 7, 7, 512) 0 conv5_block1_1_bn[0][0] __________________________________________________________________________________________________ conv5_block1_2_conv (Conv2D) (None, 7, 7, 512) 2359808 conv5_block1_1_relu[0][0] __________________________________________________________________________________________________ conv5_block1_2_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block1_2_conv[0][0] __________________________________________________________________________________________________ conv5_block1_2_relu (Activation (None, 7, 7, 512) 0 conv5_block1_2_bn[0][0] __________________________________________________________________________________________________ conv5_block1_0_conv (Conv2D) (None, 7, 7, 2048) 2099200 conv4_block6_out[0][0] __________________________________________________________________________________________________ conv5_block1_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 conv5_block1_2_relu[0][0] __________________________________________________________________________________________________ conv5_block1_0_bn (BatchNormali (None, 7, 7, 2048) 8192 conv5_block1_0_conv[0][0] __________________________________________________________________________________________________ conv5_block1_3_bn (BatchNormali (None, 7, 7, 2048) 8192 conv5_block1_3_conv[0][0] __________________________________________________________________________________________________ conv5_block1_add (Add) (None, 7, 7, 2048) 0 conv5_block1_0_bn[0][0] conv5_block1_3_bn[0][0] __________________________________________________________________________________________________ conv5_block1_out 
(Activation) (None, 7, 7, 2048) 0 conv5_block1_add[0][0] __________________________________________________________________________________________________ conv5_block2_1_conv (Conv2D) (None, 7, 7, 512) 1049088 conv5_block1_out[0][0] __________________________________________________________________________________________________ conv5_block2_1_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block2_1_conv[0][0] __________________________________________________________________________________________________ conv5_block2_1_relu (Activation (None, 7, 7, 512) 0 conv5_block2_1_bn[0][0] __________________________________________________________________________________________________ conv5_block2_2_conv (Conv2D) (None, 7, 7, 512) 2359808 conv5_block2_1_relu[0][0] __________________________________________________________________________________________________ conv5_block2_2_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block2_2_conv[0][0] __________________________________________________________________________________________________ conv5_block2_2_relu (Activation (None, 7, 7, 512) 0 conv5_block2_2_bn[0][0] __________________________________________________________________________________________________ conv5_block2_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 conv5_block2_2_relu[0][0] __________________________________________________________________________________________________ conv5_block2_3_bn (BatchNormali (None, 7, 7, 2048) 8192 conv5_block2_3_conv[0][0] __________________________________________________________________________________________________ conv5_block2_add (Add) (None, 7, 7, 2048) 0 conv5_block1_out[0][0] conv5_block2_3_bn[0][0] __________________________________________________________________________________________________ conv5_block2_out (Activation) (None, 7, 7, 2048) 0 conv5_block2_add[0][0] __________________________________________________________________________________________________ conv5_block3_1_conv (Conv2D) (None, 7, 7, 512) 1049088 conv5_block2_out[0][0] __________________________________________________________________________________________________ conv5_block3_1_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block3_1_conv[0][0] __________________________________________________________________________________________________ conv5_block3_1_relu (Activation (None, 7, 7, 512) 0 conv5_block3_1_bn[0][0] __________________________________________________________________________________________________ conv5_block3_2_conv (Conv2D) (None, 7, 7, 512) 2359808 conv5_block3_1_relu[0][0] __________________________________________________________________________________________________ conv5_block3_2_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block3_2_conv[0][0] __________________________________________________________________________________________________ conv5_block3_2_relu (Activation (None, 7, 7, 512) 0 conv5_block3_2_bn[0][0] __________________________________________________________________________________________________ conv5_block3_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 conv5_block3_2_relu[0][0] __________________________________________________________________________________________________ conv5_block3_3_bn (BatchNormali (None, 7, 7, 2048) 8192 conv5_block3_3_conv[0][0] __________________________________________________________________________________________________ conv5_block3_add (Add) (None, 7, 7, 2048) 0 conv5_block2_out[0][0] conv5_block3_3_bn[0][0] 
__________________________________________________________________________________________________ conv5_block3_out (Activation) (None, 7, 7, 2048) 0 conv5_block3_add[0][0] __________________________________________________________________________________________________ avg_pool (GlobalAveragePooling2 (None, 2048) 0 conv5_block3_out[0][0] __________________________________________________________________________________________________ predictions (Dense) (None, 1000) 2049000 avg_pool[0][0] ================================================================================================== Total params: 25,636,712 Trainable params: 25,583,592 Non-trainable params: 53,120 __________________________________________________________________________________________________
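
The full ResNet50 model can generate class predictions just like VGG16 above. Note that each Keras application ships its own preprocess_input; the sketch below therefore re-preprocesses the raw image tensors with the resnet50 variant instead of reusing img_input, re-globbing the same illustrative image directory because the earlier img_paths generator has already been consumed.

from tensorflow.keras.applications.resnet50 import preprocess_input as resnet_preprocess
from tensorflow.keras.applications.resnet50 import decode_predictions as resnet_decode

# re-collect the sample images and apply the ResNet50-specific preprocessing
resnet_input = resnet_preprocess(paths_to_tensor(Path('images/img_input').glob('*.jpg')))
resnet_preds = resnet.predict(resnet_input)

# top-1 ImageNet label and probability for each image
for i, preds in enumerate(resnet_decode(resnet_preds, top=1)):
    print(i, preds[0][1], f'{preds[0][2]:.2%}')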

Without final layer

resnet = ResNet50(include_top=False)
resnet.summary()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/resnet/resnet50_weights_tf_dim_ordering_tf_kernels_notop.h5 94773248/94765736 [==============================] - 1s 0us/step Model: "resnet50" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_4 (InputLayer) [(None, None, None, 0 __________________________________________________________________________________________________ conv1_pad (ZeroPadding2D) (None, None, None, 3 0 input_4[0][0] __________________________________________________________________________________________________ conv1_conv (Conv2D) (None, None, None, 6 9472 conv1_pad[0][0] __________________________________________________________________________________________________ conv1_bn (BatchNormalization) (None, None, None, 6 256 conv1_conv[0][0] __________________________________________________________________________________________________ conv1_relu (Activation) (None, None, None, 6 0 conv1_bn[0][0] __________________________________________________________________________________________________ pool1_pad (ZeroPadding2D) (None, None, None, 6 0 conv1_relu[0][0] __________________________________________________________________________________________________ pool1_pool (MaxPooling2D) (None, None, None, 6 0 pool1_pad[0][0] __________________________________________________________________________________________________ conv2_block1_1_conv (Conv2D) (None, None, None, 6 4160 pool1_pool[0][0] __________________________________________________________________________________________________ conv2_block1_1_bn (BatchNormali (None, None, None, 6 256 conv2_block1_1_conv[0][0] __________________________________________________________________________________________________ conv2_block1_1_relu (Activation (None, None, None, 6 0 conv2_block1_1_bn[0][0] __________________________________________________________________________________________________ conv2_block1_2_conv (Conv2D) (None, None, None, 6 36928 conv2_block1_1_relu[0][0] __________________________________________________________________________________________________ conv2_block1_2_bn (BatchNormali (None, None, None, 6 256 conv2_block1_2_conv[0][0] __________________________________________________________________________________________________ conv2_block1_2_relu (Activation (None, None, None, 6 0 conv2_block1_2_bn[0][0] __________________________________________________________________________________________________ conv2_block1_0_conv (Conv2D) (None, None, None, 2 16640 pool1_pool[0][0] __________________________________________________________________________________________________ conv2_block1_3_conv (Conv2D) (None, None, None, 2 16640 conv2_block1_2_relu[0][0] __________________________________________________________________________________________________ conv2_block1_0_bn (BatchNormali (None, None, None, 2 1024 conv2_block1_0_conv[0][0] __________________________________________________________________________________________________ conv2_block1_3_bn (BatchNormali (None, None, None, 2 1024 conv2_block1_3_conv[0][0] __________________________________________________________________________________________________ conv2_block1_add (Add) (None, None, None, 2 0 conv2_block1_0_bn[0][0] conv2_block1_3_bn[0][0] 
__________________________________________________________________________________________________ conv2_block1_out (Activation) (None, None, None, 2 0 conv2_block1_add[0][0] __________________________________________________________________________________________________ conv2_block2_1_conv (Conv2D) (None, None, None, 6 16448 conv2_block1_out[0][0] __________________________________________________________________________________________________ conv2_block2_1_bn (BatchNormali (None, None, None, 6 256 conv2_block2_1_conv[0][0] __________________________________________________________________________________________________ conv2_block2_1_relu (Activation (None, None, None, 6 0 conv2_block2_1_bn[0][0] __________________________________________________________________________________________________ conv2_block2_2_conv (Conv2D) (None, None, None, 6 36928 conv2_block2_1_relu[0][0] __________________________________________________________________________________________________ conv2_block2_2_bn (BatchNormali (None, None, None, 6 256 conv2_block2_2_conv[0][0] __________________________________________________________________________________________________ conv2_block2_2_relu (Activation (None, None, None, 6 0 conv2_block2_2_bn[0][0] __________________________________________________________________________________________________ conv2_block2_3_conv (Conv2D) (None, None, None, 2 16640 conv2_block2_2_relu[0][0] __________________________________________________________________________________________________ conv2_block2_3_bn (BatchNormali (None, None, None, 2 1024 conv2_block2_3_conv[0][0] __________________________________________________________________________________________________ conv2_block2_add (Add) (None, None, None, 2 0 conv2_block1_out[0][0] conv2_block2_3_bn[0][0] __________________________________________________________________________________________________ conv2_block2_out (Activation) (None, None, None, 2 0 conv2_block2_add[0][0] __________________________________________________________________________________________________ conv2_block3_1_conv (Conv2D) (None, None, None, 6 16448 conv2_block2_out[0][0] __________________________________________________________________________________________________ conv2_block3_1_bn (BatchNormali (None, None, None, 6 256 conv2_block3_1_conv[0][0] __________________________________________________________________________________________________ conv2_block3_1_relu (Activation (None, None, None, 6 0 conv2_block3_1_bn[0][0] __________________________________________________________________________________________________ conv2_block3_2_conv (Conv2D) (None, None, None, 6 36928 conv2_block3_1_relu[0][0] __________________________________________________________________________________________________ conv2_block3_2_bn (BatchNormali (None, None, None, 6 256 conv2_block3_2_conv[0][0] __________________________________________________________________________________________________ conv2_block3_2_relu (Activation (None, None, None, 6 0 conv2_block3_2_bn[0][0] __________________________________________________________________________________________________ conv2_block3_3_conv (Conv2D) (None, None, None, 2 16640 conv2_block3_2_relu[0][0] __________________________________________________________________________________________________ conv2_block3_3_bn (BatchNormali (None, None, None, 2 1024 conv2_block3_3_conv[0][0] __________________________________________________________________________________________________ conv2_block3_add (Add) 
(None, None, None, 2 0 conv2_block2_out[0][0] conv2_block3_3_bn[0][0] __________________________________________________________________________________________________ conv2_block3_out (Activation) (None, None, None, 2 0 conv2_block3_add[0][0] __________________________________________________________________________________________________ conv3_block1_1_conv (Conv2D) (None, None, None, 1 32896 conv2_block3_out[0][0] __________________________________________________________________________________________________ conv3_block1_1_bn (BatchNormali (None, None, None, 1 512 conv3_block1_1_conv[0][0] __________________________________________________________________________________________________ conv3_block1_1_relu (Activation (None, None, None, 1 0 conv3_block1_1_bn[0][0] __________________________________________________________________________________________________ conv3_block1_2_conv (Conv2D) (None, None, None, 1 147584 conv3_block1_1_relu[0][0] __________________________________________________________________________________________________ conv3_block1_2_bn (BatchNormali (None, None, None, 1 512 conv3_block1_2_conv[0][0] __________________________________________________________________________________________________ conv3_block1_2_relu (Activation (None, None, None, 1 0 conv3_block1_2_bn[0][0] __________________________________________________________________________________________________ conv3_block1_0_conv (Conv2D) (None, None, None, 5 131584 conv2_block3_out[0][0] __________________________________________________________________________________________________ conv3_block1_3_conv (Conv2D) (None, None, None, 5 66048 conv3_block1_2_relu[0][0] __________________________________________________________________________________________________ conv3_block1_0_bn (BatchNormali (None, None, None, 5 2048 conv3_block1_0_conv[0][0] __________________________________________________________________________________________________ conv3_block1_3_bn (BatchNormali (None, None, None, 5 2048 conv3_block1_3_conv[0][0] __________________________________________________________________________________________________ conv3_block1_add (Add) (None, None, None, 5 0 conv3_block1_0_bn[0][0] conv3_block1_3_bn[0][0] __________________________________________________________________________________________________ conv3_block1_out (Activation) (None, None, None, 5 0 conv3_block1_add[0][0] __________________________________________________________________________________________________ conv3_block2_1_conv (Conv2D) (None, None, None, 1 65664 conv3_block1_out[0][0] __________________________________________________________________________________________________ conv3_block2_1_bn (BatchNormali (None, None, None, 1 512 conv3_block2_1_conv[0][0] __________________________________________________________________________________________________ conv3_block2_1_relu (Activation (None, None, None, 1 0 conv3_block2_1_bn[0][0] __________________________________________________________________________________________________ conv3_block2_2_conv (Conv2D) (None, None, None, 1 147584 conv3_block2_1_relu[0][0] __________________________________________________________________________________________________ conv3_block2_2_bn (BatchNormali (None, None, None, 1 512 conv3_block2_2_conv[0][0] __________________________________________________________________________________________________ conv3_block2_2_relu (Activation (None, None, None, 1 0 conv3_block2_2_bn[0][0] 
__________________________________________________________________________________________________ conv3_block2_3_conv (Conv2D) (None, None, None, 5 66048 conv3_block2_2_relu[0][0] __________________________________________________________________________________________________ conv3_block2_3_bn (BatchNormali (None, None, None, 5 2048 conv3_block2_3_conv[0][0] __________________________________________________________________________________________________ conv3_block2_add (Add) (None, None, None, 5 0 conv3_block1_out[0][0] conv3_block2_3_bn[0][0] __________________________________________________________________________________________________ conv3_block2_out (Activation) (None, None, None, 5 0 conv3_block2_add[0][0] __________________________________________________________________________________________________ conv3_block3_1_conv (Conv2D) (None, None, None, 1 65664 conv3_block2_out[0][0] __________________________________________________________________________________________________ conv3_block3_1_bn (BatchNormali (None, None, None, 1 512 conv3_block3_1_conv[0][0] __________________________________________________________________________________________________ conv3_block3_1_relu (Activation (None, None, None, 1 0 conv3_block3_1_bn[0][0] __________________________________________________________________________________________________ conv3_block3_2_conv (Conv2D) (None, None, None, 1 147584 conv3_block3_1_relu[0][0] __________________________________________________________________________________________________ conv3_block3_2_bn (BatchNormali (None, None, None, 1 512 conv3_block3_2_conv[0][0] __________________________________________________________________________________________________ conv3_block3_2_relu (Activation (None, None, None, 1 0 conv3_block3_2_bn[0][0] __________________________________________________________________________________________________ conv3_block3_3_conv (Conv2D) (None, None, None, 5 66048 conv3_block3_2_relu[0][0] __________________________________________________________________________________________________ conv3_block3_3_bn (BatchNormali (None, None, None, 5 2048 conv3_block3_3_conv[0][0] __________________________________________________________________________________________________ conv3_block3_add (Add) (None, None, None, 5 0 conv3_block2_out[0][0] conv3_block3_3_bn[0][0] __________________________________________________________________________________________________ conv3_block3_out (Activation) (None, None, None, 5 0 conv3_block3_add[0][0] __________________________________________________________________________________________________ conv3_block4_1_conv (Conv2D) (None, None, None, 1 65664 conv3_block3_out[0][0] __________________________________________________________________________________________________ conv3_block4_1_bn (BatchNormali (None, None, None, 1 512 conv3_block4_1_conv[0][0] __________________________________________________________________________________________________ conv3_block4_1_relu (Activation (None, None, None, 1 0 conv3_block4_1_bn[0][0] __________________________________________________________________________________________________ conv3_block4_2_conv (Conv2D) (None, None, None, 1 147584 conv3_block4_1_relu[0][0] __________________________________________________________________________________________________ conv3_block4_2_bn (BatchNormali (None, None, None, 1 512 conv3_block4_2_conv[0][0] __________________________________________________________________________________________________ 
conv3_block4_2_relu (Activation (None, None, None, 1 0 conv3_block4_2_bn[0][0] __________________________________________________________________________________________________ conv3_block4_3_conv (Conv2D) (None, None, None, 5 66048 conv3_block4_2_relu[0][0] __________________________________________________________________________________________________ conv3_block4_3_bn (BatchNormali (None, None, None, 5 2048 conv3_block4_3_conv[0][0] __________________________________________________________________________________________________ conv3_block4_add (Add) (None, None, None, 5 0 conv3_block3_out[0][0] conv3_block4_3_bn[0][0] __________________________________________________________________________________________________ conv3_block4_out (Activation) (None, None, None, 5 0 conv3_block4_add[0][0] __________________________________________________________________________________________________ conv4_block1_1_conv (Conv2D) (None, None, None, 2 131328 conv3_block4_out[0][0] __________________________________________________________________________________________________ conv4_block1_1_bn (BatchNormali (None, None, None, 2 1024 conv4_block1_1_conv[0][0] __________________________________________________________________________________________________ conv4_block1_1_relu (Activation (None, None, None, 2 0 conv4_block1_1_bn[0][0] __________________________________________________________________________________________________ conv4_block1_2_conv (Conv2D) (None, None, None, 2 590080 conv4_block1_1_relu[0][0] __________________________________________________________________________________________________ conv4_block1_2_bn (BatchNormali (None, None, None, 2 1024 conv4_block1_2_conv[0][0] __________________________________________________________________________________________________ conv4_block1_2_relu (Activation (None, None, None, 2 0 conv4_block1_2_bn[0][0] __________________________________________________________________________________________________ conv4_block1_0_conv (Conv2D) (None, None, None, 1 525312 conv3_block4_out[0][0] __________________________________________________________________________________________________ conv4_block1_3_conv (Conv2D) (None, None, None, 1 263168 conv4_block1_2_relu[0][0] __________________________________________________________________________________________________ conv4_block1_0_bn (BatchNormali (None, None, None, 1 4096 conv4_block1_0_conv[0][0] __________________________________________________________________________________________________ conv4_block1_3_bn (BatchNormali (None, None, None, 1 4096 conv4_block1_3_conv[0][0] __________________________________________________________________________________________________ conv4_block1_add (Add) (None, None, None, 1 0 conv4_block1_0_bn[0][0] conv4_block1_3_bn[0][0] __________________________________________________________________________________________________ conv4_block1_out (Activation) (None, None, None, 1 0 conv4_block1_add[0][0] __________________________________________________________________________________________________ conv4_block2_1_conv (Conv2D) (None, None, None, 2 262400 conv4_block1_out[0][0] __________________________________________________________________________________________________ conv4_block2_1_bn (BatchNormali (None, None, None, 2 1024 conv4_block2_1_conv[0][0] __________________________________________________________________________________________________ conv4_block2_1_relu (Activation (None, None, None, 2 0 conv4_block2_1_bn[0][0] 
__________________________________________________________________________________________________ conv4_block2_2_conv (Conv2D) (None, None, None, 2 590080 conv4_block2_1_relu[0][0] __________________________________________________________________________________________________ conv4_block2_2_bn (BatchNormali (None, None, None, 2 1024 conv4_block2_2_conv[0][0] __________________________________________________________________________________________________ conv4_block2_2_relu (Activation (None, None, None, 2 0 conv4_block2_2_bn[0][0] __________________________________________________________________________________________________ conv4_block2_3_conv (Conv2D) (None, None, None, 1 263168 conv4_block2_2_relu[0][0] __________________________________________________________________________________________________ conv4_block2_3_bn (BatchNormali (None, None, None, 1 4096 conv4_block2_3_conv[0][0] __________________________________________________________________________________________________ conv4_block2_add (Add) (None, None, None, 1 0 conv4_block1_out[0][0] conv4_block2_3_bn[0][0] __________________________________________________________________________________________________ conv4_block2_out (Activation) (None, None, None, 1 0 conv4_block2_add[0][0] __________________________________________________________________________________________________ conv4_block3_1_conv (Conv2D) (None, None, None, 2 262400 conv4_block2_out[0][0] __________________________________________________________________________________________________ conv4_block3_1_bn (BatchNormali (None, None, None, 2 1024 conv4_block3_1_conv[0][0] __________________________________________________________________________________________________ conv4_block3_1_relu (Activation (None, None, None, 2 0 conv4_block3_1_bn[0][0] __________________________________________________________________________________________________ conv4_block3_2_conv (Conv2D) (None, None, None, 2 590080 conv4_block3_1_relu[0][0] __________________________________________________________________________________________________ conv4_block3_2_bn (BatchNormali (None, None, None, 2 1024 conv4_block3_2_conv[0][0] __________________________________________________________________________________________________ conv4_block3_2_relu (Activation (None, None, None, 2 0 conv4_block3_2_bn[0][0] __________________________________________________________________________________________________ conv4_block3_3_conv (Conv2D) (None, None, None, 1 263168 conv4_block3_2_relu[0][0] __________________________________________________________________________________________________ conv4_block3_3_bn (BatchNormali (None, None, None, 1 4096 conv4_block3_3_conv[0][0] __________________________________________________________________________________________________ conv4_block3_add (Add) (None, None, None, 1 0 conv4_block2_out[0][0] conv4_block3_3_bn[0][0] __________________________________________________________________________________________________ conv4_block3_out (Activation) (None, None, None, 1 0 conv4_block3_add[0][0] __________________________________________________________________________________________________ conv4_block4_1_conv (Conv2D) (None, None, None, 2 262400 conv4_block3_out[0][0] __________________________________________________________________________________________________ conv4_block4_1_bn (BatchNormali (None, None, None, 2 1024 conv4_block4_1_conv[0][0] __________________________________________________________________________________________________ 
conv4_block4_1_relu (Activation (None, None, None, 2 0 conv4_block4_1_bn[0][0] __________________________________________________________________________________________________ conv4_block4_2_conv (Conv2D) (None, None, None, 2 590080 conv4_block4_1_relu[0][0] __________________________________________________________________________________________________ conv4_block4_2_bn (BatchNormali (None, None, None, 2 1024 conv4_block4_2_conv[0][0] __________________________________________________________________________________________________ conv4_block4_2_relu (Activation (None, None, None, 2 0 conv4_block4_2_bn[0][0] __________________________________________________________________________________________________ conv4_block4_3_conv (Conv2D) (None, None, None, 1 263168 conv4_block4_2_relu[0][0] __________________________________________________________________________________________________ conv4_block4_3_bn (BatchNormali (None, None, None, 1 4096 conv4_block4_3_conv[0][0] __________________________________________________________________________________________________ conv4_block4_add (Add) (None, None, None, 1 0 conv4_block3_out[0][0] conv4_block4_3_bn[0][0] __________________________________________________________________________________________________ conv4_block4_out (Activation) (None, None, None, 1 0 conv4_block4_add[0][0] __________________________________________________________________________________________________ conv4_block5_1_conv (Conv2D) (None, None, None, 2 262400 conv4_block4_out[0][0] __________________________________________________________________________________________________ conv4_block5_1_bn (BatchNormali (None, None, None, 2 1024 conv4_block5_1_conv[0][0] __________________________________________________________________________________________________ conv4_block5_1_relu (Activation (None, None, None, 2 0 conv4_block5_1_bn[0][0] __________________________________________________________________________________________________ conv4_block5_2_conv (Conv2D) (None, None, None, 2 590080 conv4_block5_1_relu[0][0] __________________________________________________________________________________________________ conv4_block5_2_bn (BatchNormali (None, None, None, 2 1024 conv4_block5_2_conv[0][0] __________________________________________________________________________________________________ conv4_block5_2_relu (Activation (None, None, None, 2 0 conv4_block5_2_bn[0][0] __________________________________________________________________________________________________ conv4_block5_3_conv (Conv2D) (None, None, None, 1 263168 conv4_block5_2_relu[0][0] __________________________________________________________________________________________________ conv4_block5_3_bn (BatchNormali (None, None, None, 1 4096 conv4_block5_3_conv[0][0] __________________________________________________________________________________________________ conv4_block5_add (Add) (None, None, None, 1 0 conv4_block4_out[0][0] conv4_block5_3_bn[0][0] __________________________________________________________________________________________________ conv4_block5_out (Activation) (None, None, None, 1 0 conv4_block5_add[0][0] __________________________________________________________________________________________________ conv4_block6_1_conv (Conv2D) (None, None, None, 2 262400 conv4_block5_out[0][0] __________________________________________________________________________________________________ conv4_block6_1_bn (BatchNormali (None, None, None, 2 1024 conv4_block6_1_conv[0][0] 
__________________________________________________________________________________________________ conv4_block6_1_relu (Activation (None, None, None, 2 0 conv4_block6_1_bn[0][0] __________________________________________________________________________________________________ conv4_block6_2_conv (Conv2D) (None, None, None, 2 590080 conv4_block6_1_relu[0][0] __________________________________________________________________________________________________ conv4_block6_2_bn (BatchNormali (None, None, None, 2 1024 conv4_block6_2_conv[0][0] __________________________________________________________________________________________________ conv4_block6_2_relu (Activation (None, None, None, 2 0 conv4_block6_2_bn[0][0] __________________________________________________________________________________________________ conv4_block6_3_conv (Conv2D) (None, None, None, 1 263168 conv4_block6_2_relu[0][0] __________________________________________________________________________________________________ conv4_block6_3_bn (BatchNormali (None, None, None, 1 4096 conv4_block6_3_conv[0][0] __________________________________________________________________________________________________ conv4_block6_add (Add) (None, None, None, 1 0 conv4_block5_out[0][0] conv4_block6_3_bn[0][0] __________________________________________________________________________________________________ conv4_block6_out (Activation) (None, None, None, 1 0 conv4_block6_add[0][0] __________________________________________________________________________________________________ conv5_block1_1_conv (Conv2D) (None, None, None, 5 524800 conv4_block6_out[0][0] __________________________________________________________________________________________________ conv5_block1_1_bn (BatchNormali (None, None, None, 5 2048 conv5_block1_1_conv[0][0] __________________________________________________________________________________________________ conv5_block1_1_relu (Activation (None, None, None, 5 0 conv5_block1_1_bn[0][0] __________________________________________________________________________________________________ conv5_block1_2_conv (Conv2D) (None, None, None, 5 2359808 conv5_block1_1_relu[0][0] __________________________________________________________________________________________________ conv5_block1_2_bn (BatchNormali (None, None, None, 5 2048 conv5_block1_2_conv[0][0] __________________________________________________________________________________________________ conv5_block1_2_relu (Activation (None, None, None, 5 0 conv5_block1_2_bn[0][0] __________________________________________________________________________________________________ conv5_block1_0_conv (Conv2D) (None, None, None, 2 2099200 conv4_block6_out[0][0] __________________________________________________________________________________________________ conv5_block1_3_conv (Conv2D) (None, None, None, 2 1050624 conv5_block1_2_relu[0][0] __________________________________________________________________________________________________ conv5_block1_0_bn (BatchNormali (None, None, None, 2 8192 conv5_block1_0_conv[0][0] __________________________________________________________________________________________________ conv5_block1_3_bn (BatchNormali (None, None, None, 2 8192 conv5_block1_3_conv[0][0] __________________________________________________________________________________________________ conv5_block1_add (Add) (None, None, None, 2 0 conv5_block1_0_bn[0][0] conv5_block1_3_bn[0][0] 
__________________________________________________________________________________________________ conv5_block1_out (Activation) (None, None, None, 2 0 conv5_block1_add[0][0] __________________________________________________________________________________________________ conv5_block2_1_conv (Conv2D) (None, None, None, 5 1049088 conv5_block1_out[0][0] __________________________________________________________________________________________________ conv5_block2_1_bn (BatchNormali (None, None, None, 5 2048 conv5_block2_1_conv[0][0] __________________________________________________________________________________________________ conv5_block2_1_relu (Activation (None, None, None, 5 0 conv5_block2_1_bn[0][0] __________________________________________________________________________________________________ conv5_block2_2_conv (Conv2D) (None, None, None, 5 2359808 conv5_block2_1_relu[0][0] __________________________________________________________________________________________________ conv5_block2_2_bn (BatchNormali (None, None, None, 5 2048 conv5_block2_2_conv[0][0] __________________________________________________________________________________________________ conv5_block2_2_relu (Activation (None, None, None, 5 0 conv5_block2_2_bn[0][0] __________________________________________________________________________________________________ conv5_block2_3_conv (Conv2D) (None, None, None, 2 1050624 conv5_block2_2_relu[0][0] __________________________________________________________________________________________________ conv5_block2_3_bn (BatchNormali (None, None, None, 2 8192 conv5_block2_3_conv[0][0] __________________________________________________________________________________________________ conv5_block2_add (Add) (None, None, None, 2 0 conv5_block1_out[0][0] conv5_block2_3_bn[0][0] __________________________________________________________________________________________________ conv5_block2_out (Activation) (None, None, None, 2 0 conv5_block2_add[0][0] __________________________________________________________________________________________________ conv5_block3_1_conv (Conv2D) (None, None, None, 5 1049088 conv5_block2_out[0][0] __________________________________________________________________________________________________ conv5_block3_1_bn (BatchNormali (None, None, None, 5 2048 conv5_block3_1_conv[0][0] __________________________________________________________________________________________________ conv5_block3_1_relu (Activation (None, None, None, 5 0 conv5_block3_1_bn[0][0] __________________________________________________________________________________________________ conv5_block3_2_conv (Conv2D) (None, None, None, 5 2359808 conv5_block3_1_relu[0][0] __________________________________________________________________________________________________ conv5_block3_2_bn (BatchNormali (None, None, None, 5 2048 conv5_block3_2_conv[0][0] __________________________________________________________________________________________________ conv5_block3_2_relu (Activation (None, None, None, 5 0 conv5_block3_2_bn[0][0] __________________________________________________________________________________________________ conv5_block3_3_conv (Conv2D) (None, None, None, 2 1050624 conv5_block3_2_relu[0][0] __________________________________________________________________________________________________ conv5_block3_3_bn (BatchNormali (None, None, None, 2 8192 conv5_block3_3_conv[0][0] __________________________________________________________________________________________________ 
conv5_block3_add (Add) (None, None, None, 2 0 conv5_block2_out[0][0] conv5_block3_3_bn[0][0] __________________________________________________________________________________________________ conv5_block3_out (Activation) (None, None, None, 2 0 conv5_block3_add[0][0] ================================================================================================== Total params: 23,587,712 Trainable params: 23,534,592 Non-trainable params: 53,120 __________________________________________________________________________________________________
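
The ResNet50 listing above ends at conv5_block3_out, a 2,048-channel activation, and the total parameter count (23,587,712) matches ResNet50 without its dense classification head, so this final activation is the network's bottleneck output. The sketch below turns it into one fixed-length feature vector per image; it is a minimal sketch, assuming ImageNet weights, global average pooling (the pooling='avg' argument is an illustrative addition, not taken from the notebook), and the paths_to_tensor helper and images/img_input folder defined above.

from pathlib import Path
from tensorflow.keras.applications.resnet50 import ResNet50, preprocess_input as resnet_preprocess

# ResNet50 without its classification head; global average pooling collapses the
# final 7x7x2048 activation (for 224x224 inputs) into one 2048-dim vector per image
resnet_bottleneck = ResNet50(weights='imagenet', include_top=False, pooling='avg')

# ResNet50 expects its own preprocessing, so reload the raw image tensors and apply
# the resnet50 variant of preprocess_input rather than the VGG version used earlier
resnet_input = resnet_preprocess(paths_to_tensor(Path('images/img_input').glob('*.jpg')))
resnet_features = resnet_bottleneck.predict(resnet_input)
resnet_features.shape  # (7, 2048) for the seven sample images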

Import Pre-Trained Inception V3

With the final classification layer
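
Unlike VGG16, which works with 224×224 inputs, InceptionV3 defaults to 299×299 images and ships with its own preprocess_input that scales pixel values to the range [-1, 1]. The sample images therefore need to be re-loaded at the larger size before they can be passed to the model created in the next cell. The following is a minimal sketch under those assumptions; the inception_paths_to_tensor helper is hypothetical, adapted from the paths_to_tensor function defined above.

import numpy as np
from pathlib import Path
from tensorflow.keras.preprocessing import image
from tensorflow.keras.applications.inception_v3 import preprocess_input as inception_preprocess

def inception_paths_to_tensor(img_paths, target_size=(299, 299)):
    # load each image at InceptionV3's default 299x299 resolution and stack the
    # resulting arrays into a single (n_images, 299, 299, 3) batch
    tensors = [image.img_to_array(image.load_img(p, target_size=target_size))
               for p in img_paths]
    return np.stack(tensors)

inception_input = inception_preprocess(
    inception_paths_to_tensor(Path('images/img_input').glob('*.jpg')))
inception_input.shape  # (7, 299, 299, 3) for the seven sample images

Once the model below has been instantiated, inception.predict(inception_input) returns a (7, 1000) array of ImageNet class probabilities that can be decoded with tensorflow.keras.applications.inception_v3.decode_predictions.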

inception = InceptionV3()
inception.summary()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/inception_v3/inception_v3_weights_tf_dim_ordering_tf_kernels.h5 96116736/96112376 [==============================] - 1s 0us/step Model: "inception_v3" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_5 (InputLayer) [(None, 299, 299, 3) 0 __________________________________________________________________________________________________ conv2d (Conv2D) (None, 149, 149, 32) 864 input_5[0][0] __________________________________________________________________________________________________ batch_normalization (BatchNorma (None, 149, 149, 32) 96 conv2d[0][0] __________________________________________________________________________________________________ activation (Activation) (None, 149, 149, 32) 0 batch_normalization[0][0] __________________________________________________________________________________________________ conv2d_1 (Conv2D) (None, 147, 147, 32) 9216 activation[0][0] __________________________________________________________________________________________________ batch_normalization_1 (BatchNor (None, 147, 147, 32) 96 conv2d_1[0][0] __________________________________________________________________________________________________ activation_1 (Activation) (None, 147, 147, 32) 0 batch_normalization_1[0][0] __________________________________________________________________________________________________ conv2d_2 (Conv2D) (None, 147, 147, 64) 18432 activation_1[0][0] __________________________________________________________________________________________________ batch_normalization_2 (BatchNor (None, 147, 147, 64) 192 conv2d_2[0][0] __________________________________________________________________________________________________ activation_2 (Activation) (None, 147, 147, 64) 0 batch_normalization_2[0][0] __________________________________________________________________________________________________ max_pooling2d (MaxPooling2D) (None, 73, 73, 64) 0 activation_2[0][0] __________________________________________________________________________________________________ conv2d_3 (Conv2D) (None, 73, 73, 80) 5120 max_pooling2d[0][0] __________________________________________________________________________________________________ batch_normalization_3 (BatchNor (None, 73, 73, 80) 240 conv2d_3[0][0] __________________________________________________________________________________________________ activation_3 (Activation) (None, 73, 73, 80) 0 batch_normalization_3[0][0] __________________________________________________________________________________________________ conv2d_4 (Conv2D) (None, 71, 71, 192) 138240 activation_3[0][0] __________________________________________________________________________________________________ batch_normalization_4 (BatchNor (None, 71, 71, 192) 576 conv2d_4[0][0] __________________________________________________________________________________________________ activation_4 (Activation) (None, 71, 71, 192) 0 batch_normalization_4[0][0] __________________________________________________________________________________________________ max_pooling2d_1 (MaxPooling2D) (None, 35, 35, 192) 0 activation_4[0][0] __________________________________________________________________________________________________ conv2d_8 (Conv2D) (None, 35, 35, 64) 12288 max_pooling2d_1[0][0] 
__________________________________________________________________________________________________ batch_normalization_8 (BatchNor (None, 35, 35, 64) 192 conv2d_8[0][0] __________________________________________________________________________________________________ activation_8 (Activation) (None, 35, 35, 64) 0 batch_normalization_8[0][0] __________________________________________________________________________________________________ conv2d_6 (Conv2D) (None, 35, 35, 48) 9216 max_pooling2d_1[0][0] __________________________________________________________________________________________________ conv2d_9 (Conv2D) (None, 35, 35, 96) 55296 activation_8[0][0] __________________________________________________________________________________________________ batch_normalization_6 (BatchNor (None, 35, 35, 48) 144 conv2d_6[0][0] __________________________________________________________________________________________________ batch_normalization_9 (BatchNor (None, 35, 35, 96) 288 conv2d_9[0][0] __________________________________________________________________________________________________ activation_6 (Activation) (None, 35, 35, 48) 0 batch_normalization_6[0][0] __________________________________________________________________________________________________ activation_9 (Activation) (None, 35, 35, 96) 0 batch_normalization_9[0][0] __________________________________________________________________________________________________ average_pooling2d (AveragePooli (None, 35, 35, 192) 0 max_pooling2d_1[0][0] __________________________________________________________________________________________________ conv2d_5 (Conv2D) (None, 35, 35, 64) 12288 max_pooling2d_1[0][0] __________________________________________________________________________________________________ conv2d_7 (Conv2D) (None, 35, 35, 64) 76800 activation_6[0][0] __________________________________________________________________________________________________ conv2d_10 (Conv2D) (None, 35, 35, 96) 82944 activation_9[0][0] __________________________________________________________________________________________________ conv2d_11 (Conv2D) (None, 35, 35, 32) 6144 average_pooling2d[0][0] __________________________________________________________________________________________________ batch_normalization_5 (BatchNor (None, 35, 35, 64) 192 conv2d_5[0][0] __________________________________________________________________________________________________ batch_normalization_7 (BatchNor (None, 35, 35, 64) 192 conv2d_7[0][0] __________________________________________________________________________________________________ batch_normalization_10 (BatchNo (None, 35, 35, 96) 288 conv2d_10[0][0] __________________________________________________________________________________________________ batch_normalization_11 (BatchNo (None, 35, 35, 32) 96 conv2d_11[0][0] __________________________________________________________________________________________________ activation_5 (Activation) (None, 35, 35, 64) 0 batch_normalization_5[0][0] __________________________________________________________________________________________________ activation_7 (Activation) (None, 35, 35, 64) 0 batch_normalization_7[0][0] __________________________________________________________________________________________________ activation_10 (Activation) (None, 35, 35, 96) 0 batch_normalization_10[0][0] __________________________________________________________________________________________________ activation_11 (Activation) (None, 35, 35, 32) 0 
batch_normalization_11[0][0] __________________________________________________________________________________________________ mixed0 (Concatenate) (None, 35, 35, 256) 0 activation_5[0][0] activation_7[0][0] activation_10[0][0] activation_11[0][0] __________________________________________________________________________________________________ conv2d_15 (Conv2D) (None, 35, 35, 64) 16384 mixed0[0][0] __________________________________________________________________________________________________ batch_normalization_15 (BatchNo (None, 35, 35, 64) 192 conv2d_15[0][0] __________________________________________________________________________________________________ activation_15 (Activation) (None, 35, 35, 64) 0 batch_normalization_15[0][0] __________________________________________________________________________________________________ conv2d_13 (Conv2D) (None, 35, 35, 48) 12288 mixed0[0][0] __________________________________________________________________________________________________ conv2d_16 (Conv2D) (None, 35, 35, 96) 55296 activation_15[0][0] __________________________________________________________________________________________________ batch_normalization_13 (BatchNo (None, 35, 35, 48) 144 conv2d_13[0][0] __________________________________________________________________________________________________ batch_normalization_16 (BatchNo (None, 35, 35, 96) 288 conv2d_16[0][0] __________________________________________________________________________________________________ activation_13 (Activation) (None, 35, 35, 48) 0 batch_normalization_13[0][0] __________________________________________________________________________________________________ activation_16 (Activation) (None, 35, 35, 96) 0 batch_normalization_16[0][0] __________________________________________________________________________________________________ average_pooling2d_1 (AveragePoo (None, 35, 35, 256) 0 mixed0[0][0] __________________________________________________________________________________________________ conv2d_12 (Conv2D) (None, 35, 35, 64) 16384 mixed0[0][0] __________________________________________________________________________________________________ conv2d_14 (Conv2D) (None, 35, 35, 64) 76800 activation_13[0][0] __________________________________________________________________________________________________ conv2d_17 (Conv2D) (None, 35, 35, 96) 82944 activation_16[0][0] __________________________________________________________________________________________________ conv2d_18 (Conv2D) (None, 35, 35, 64) 16384 average_pooling2d_1[0][0] __________________________________________________________________________________________________ batch_normalization_12 (BatchNo (None, 35, 35, 64) 192 conv2d_12[0][0] __________________________________________________________________________________________________ batch_normalization_14 (BatchNo (None, 35, 35, 64) 192 conv2d_14[0][0] __________________________________________________________________________________________________ batch_normalization_17 (BatchNo (None, 35, 35, 96) 288 conv2d_17[0][0] __________________________________________________________________________________________________ batch_normalization_18 (BatchNo (None, 35, 35, 64) 192 conv2d_18[0][0] __________________________________________________________________________________________________ activation_12 (Activation) (None, 35, 35, 64) 0 batch_normalization_12[0][0] __________________________________________________________________________________________________ activation_14 
(Activation) (None, 35, 35, 64) 0 batch_normalization_14[0][0] __________________________________________________________________________________________________ activation_17 (Activation) (None, 35, 35, 96) 0 batch_normalization_17[0][0] __________________________________________________________________________________________________ activation_18 (Activation) (None, 35, 35, 64) 0 batch_normalization_18[0][0] __________________________________________________________________________________________________ mixed1 (Concatenate) (None, 35, 35, 288) 0 activation_12[0][0] activation_14[0][0] activation_17[0][0] activation_18[0][0] __________________________________________________________________________________________________ conv2d_22 (Conv2D) (None, 35, 35, 64) 18432 mixed1[0][0] __________________________________________________________________________________________________ batch_normalization_22 (BatchNo (None, 35, 35, 64) 192 conv2d_22[0][0] __________________________________________________________________________________________________ activation_22 (Activation) (None, 35, 35, 64) 0 batch_normalization_22[0][0] __________________________________________________________________________________________________ conv2d_20 (Conv2D) (None, 35, 35, 48) 13824 mixed1[0][0] __________________________________________________________________________________________________ conv2d_23 (Conv2D) (None, 35, 35, 96) 55296 activation_22[0][0] __________________________________________________________________________________________________ batch_normalization_20 (BatchNo (None, 35, 35, 48) 144 conv2d_20[0][0] __________________________________________________________________________________________________ batch_normalization_23 (BatchNo (None, 35, 35, 96) 288 conv2d_23[0][0] __________________________________________________________________________________________________ activation_20 (Activation) (None, 35, 35, 48) 0 batch_normalization_20[0][0] __________________________________________________________________________________________________ activation_23 (Activation) (None, 35, 35, 96) 0 batch_normalization_23[0][0] __________________________________________________________________________________________________ average_pooling2d_2 (AveragePoo (None, 35, 35, 288) 0 mixed1[0][0] __________________________________________________________________________________________________ conv2d_19 (Conv2D) (None, 35, 35, 64) 18432 mixed1[0][0] __________________________________________________________________________________________________ conv2d_21 (Conv2D) (None, 35, 35, 64) 76800 activation_20[0][0] __________________________________________________________________________________________________ conv2d_24 (Conv2D) (None, 35, 35, 96) 82944 activation_23[0][0] __________________________________________________________________________________________________ conv2d_25 (Conv2D) (None, 35, 35, 64) 18432 average_pooling2d_2[0][0] __________________________________________________________________________________________________ batch_normalization_19 (BatchNo (None, 35, 35, 64) 192 conv2d_19[0][0] __________________________________________________________________________________________________ batch_normalization_21 (BatchNo (None, 35, 35, 64) 192 conv2d_21[0][0] __________________________________________________________________________________________________ batch_normalization_24 (BatchNo (None, 35, 35, 96) 288 conv2d_24[0][0] 
__________________________________________________________________________________________________ batch_normalization_25 (BatchNo (None, 35, 35, 64) 192 conv2d_25[0][0] __________________________________________________________________________________________________ activation_19 (Activation) (None, 35, 35, 64) 0 batch_normalization_19[0][0] __________________________________________________________________________________________________ activation_21 (Activation) (None, 35, 35, 64) 0 batch_normalization_21[0][0] __________________________________________________________________________________________________ activation_24 (Activation) (None, 35, 35, 96) 0 batch_normalization_24[0][0] __________________________________________________________________________________________________ activation_25 (Activation) (None, 35, 35, 64) 0 batch_normalization_25[0][0] __________________________________________________________________________________________________ mixed2 (Concatenate) (None, 35, 35, 288) 0 activation_19[0][0] activation_21[0][0] activation_24[0][0] activation_25[0][0] __________________________________________________________________________________________________ conv2d_27 (Conv2D) (None, 35, 35, 64) 18432 mixed2[0][0] __________________________________________________________________________________________________ batch_normalization_27 (BatchNo (None, 35, 35, 64) 192 conv2d_27[0][0] __________________________________________________________________________________________________ activation_27 (Activation) (None, 35, 35, 64) 0 batch_normalization_27[0][0] __________________________________________________________________________________________________ conv2d_28 (Conv2D) (None, 35, 35, 96) 55296 activation_27[0][0] __________________________________________________________________________________________________ batch_normalization_28 (BatchNo (None, 35, 35, 96) 288 conv2d_28[0][0] __________________________________________________________________________________________________ activation_28 (Activation) (None, 35, 35, 96) 0 batch_normalization_28[0][0] __________________________________________________________________________________________________ conv2d_26 (Conv2D) (None, 17, 17, 384) 995328 mixed2[0][0] __________________________________________________________________________________________________ conv2d_29 (Conv2D) (None, 17, 17, 96) 82944 activation_28[0][0] __________________________________________________________________________________________________ batch_normalization_26 (BatchNo (None, 17, 17, 384) 1152 conv2d_26[0][0] __________________________________________________________________________________________________ batch_normalization_29 (BatchNo (None, 17, 17, 96) 288 conv2d_29[0][0] __________________________________________________________________________________________________ activation_26 (Activation) (None, 17, 17, 384) 0 batch_normalization_26[0][0] __________________________________________________________________________________________________ activation_29 (Activation) (None, 17, 17, 96) 0 batch_normalization_29[0][0] __________________________________________________________________________________________________ max_pooling2d_2 (MaxPooling2D) (None, 17, 17, 288) 0 mixed2[0][0] __________________________________________________________________________________________________ mixed3 (Concatenate) (None, 17, 17, 768) 0 activation_26[0][0] activation_29[0][0] max_pooling2d_2[0][0] 
__________________________________________________________________________________________________ conv2d_34 (Conv2D) (None, 17, 17, 128) 98304 mixed3[0][0] __________________________________________________________________________________________________ batch_normalization_34 (BatchNo (None, 17, 17, 128) 384 conv2d_34[0][0] __________________________________________________________________________________________________ activation_34 (Activation) (None, 17, 17, 128) 0 batch_normalization_34[0][0] __________________________________________________________________________________________________ conv2d_35 (Conv2D) (None, 17, 17, 128) 114688 activation_34[0][0] __________________________________________________________________________________________________ batch_normalization_35 (BatchNo (None, 17, 17, 128) 384 conv2d_35[0][0] __________________________________________________________________________________________________ activation_35 (Activation) (None, 17, 17, 128) 0 batch_normalization_35[0][0] __________________________________________________________________________________________________ conv2d_31 (Conv2D) (None, 17, 17, 128) 98304 mixed3[0][0] __________________________________________________________________________________________________ conv2d_36 (Conv2D) (None, 17, 17, 128) 114688 activation_35[0][0] __________________________________________________________________________________________________ batch_normalization_31 (BatchNo (None, 17, 17, 128) 384 conv2d_31[0][0] __________________________________________________________________________________________________ batch_normalization_36 (BatchNo (None, 17, 17, 128) 384 conv2d_36[0][0] __________________________________________________________________________________________________ activation_31 (Activation) (None, 17, 17, 128) 0 batch_normalization_31[0][0] __________________________________________________________________________________________________ activation_36 (Activation) (None, 17, 17, 128) 0 batch_normalization_36[0][0] __________________________________________________________________________________________________ conv2d_32 (Conv2D) (None, 17, 17, 128) 114688 activation_31[0][0] __________________________________________________________________________________________________ conv2d_37 (Conv2D) (None, 17, 17, 128) 114688 activation_36[0][0] __________________________________________________________________________________________________ batch_normalization_32 (BatchNo (None, 17, 17, 128) 384 conv2d_32[0][0] __________________________________________________________________________________________________ batch_normalization_37 (BatchNo (None, 17, 17, 128) 384 conv2d_37[0][0] __________________________________________________________________________________________________ activation_32 (Activation) (None, 17, 17, 128) 0 batch_normalization_32[0][0] __________________________________________________________________________________________________ activation_37 (Activation) (None, 17, 17, 128) 0 batch_normalization_37[0][0] __________________________________________________________________________________________________ average_pooling2d_3 (AveragePoo (None, 17, 17, 768) 0 mixed3[0][0] __________________________________________________________________________________________________ conv2d_30 (Conv2D) (None, 17, 17, 192) 147456 mixed3[0][0] __________________________________________________________________________________________________ conv2d_33 (Conv2D) (None, 17, 17, 192) 172032 activation_32[0][0] 
__________________________________________________________________________________________________ conv2d_38 (Conv2D) (None, 17, 17, 192) 172032 activation_37[0][0] __________________________________________________________________________________________________ conv2d_39 (Conv2D) (None, 17, 17, 192) 147456 average_pooling2d_3[0][0] __________________________________________________________________________________________________ batch_normalization_30 (BatchNo (None, 17, 17, 192) 576 conv2d_30[0][0] __________________________________________________________________________________________________ batch_normalization_33 (BatchNo (None, 17, 17, 192) 576 conv2d_33[0][0] __________________________________________________________________________________________________ batch_normalization_38 (BatchNo (None, 17, 17, 192) 576 conv2d_38[0][0] __________________________________________________________________________________________________ batch_normalization_39 (BatchNo (None, 17, 17, 192) 576 conv2d_39[0][0] __________________________________________________________________________________________________ activation_30 (Activation) (None, 17, 17, 192) 0 batch_normalization_30[0][0] __________________________________________________________________________________________________ activation_33 (Activation) (None, 17, 17, 192) 0 batch_normalization_33[0][0] __________________________________________________________________________________________________ activation_38 (Activation) (None, 17, 17, 192) 0 batch_normalization_38[0][0] __________________________________________________________________________________________________ activation_39 (Activation) (None, 17, 17, 192) 0 batch_normalization_39[0][0] __________________________________________________________________________________________________ mixed4 (Concatenate) (None, 17, 17, 768) 0 activation_30[0][0] activation_33[0][0] activation_38[0][0] activation_39[0][0] __________________________________________________________________________________________________ conv2d_44 (Conv2D) (None, 17, 17, 160) 122880 mixed4[0][0] __________________________________________________________________________________________________ batch_normalization_44 (BatchNo (None, 17, 17, 160) 480 conv2d_44[0][0] __________________________________________________________________________________________________ activation_44 (Activation) (None, 17, 17, 160) 0 batch_normalization_44[0][0] __________________________________________________________________________________________________ conv2d_45 (Conv2D) (None, 17, 17, 160) 179200 activation_44[0][0] __________________________________________________________________________________________________ batch_normalization_45 (BatchNo (None, 17, 17, 160) 480 conv2d_45[0][0] __________________________________________________________________________________________________ activation_45 (Activation) (None, 17, 17, 160) 0 batch_normalization_45[0][0] __________________________________________________________________________________________________ conv2d_41 (Conv2D) (None, 17, 17, 160) 122880 mixed4[0][0] __________________________________________________________________________________________________ conv2d_46 (Conv2D) (None, 17, 17, 160) 179200 activation_45[0][0] __________________________________________________________________________________________________ batch_normalization_41 (BatchNo (None, 17, 17, 160) 480 conv2d_41[0][0] 
__________________________________________________________________________________________________ batch_normalization_46 (BatchNo (None, 17, 17, 160) 480 conv2d_46[0][0] __________________________________________________________________________________________________ activation_41 (Activation) (None, 17, 17, 160) 0 batch_normalization_41[0][0] __________________________________________________________________________________________________ activation_46 (Activation) (None, 17, 17, 160) 0 batch_normalization_46[0][0] __________________________________________________________________________________________________ conv2d_42 (Conv2D) (None, 17, 17, 160) 179200 activation_41[0][0] __________________________________________________________________________________________________ conv2d_47 (Conv2D) (None, 17, 17, 160) 179200 activation_46[0][0] __________________________________________________________________________________________________ batch_normalization_42 (BatchNo (None, 17, 17, 160) 480 conv2d_42[0][0] __________________________________________________________________________________________________ batch_normalization_47 (BatchNo (None, 17, 17, 160) 480 conv2d_47[0][0] __________________________________________________________________________________________________ activation_42 (Activation) (None, 17, 17, 160) 0 batch_normalization_42[0][0] __________________________________________________________________________________________________ activation_47 (Activation) (None, 17, 17, 160) 0 batch_normalization_47[0][0] __________________________________________________________________________________________________ average_pooling2d_4 (AveragePoo (None, 17, 17, 768) 0 mixed4[0][0] __________________________________________________________________________________________________ conv2d_40 (Conv2D) (None, 17, 17, 192) 147456 mixed4[0][0] __________________________________________________________________________________________________ conv2d_43 (Conv2D) (None, 17, 17, 192) 215040 activation_42[0][0] __________________________________________________________________________________________________ conv2d_48 (Conv2D) (None, 17, 17, 192) 215040 activation_47[0][0] __________________________________________________________________________________________________ conv2d_49 (Conv2D) (None, 17, 17, 192) 147456 average_pooling2d_4[0][0] __________________________________________________________________________________________________ batch_normalization_40 (BatchNo (None, 17, 17, 192) 576 conv2d_40[0][0] __________________________________________________________________________________________________ batch_normalization_43 (BatchNo (None, 17, 17, 192) 576 conv2d_43[0][0] __________________________________________________________________________________________________ batch_normalization_48 (BatchNo (None, 17, 17, 192) 576 conv2d_48[0][0] __________________________________________________________________________________________________ batch_normalization_49 (BatchNo (None, 17, 17, 192) 576 conv2d_49[0][0] __________________________________________________________________________________________________ activation_40 (Activation) (None, 17, 17, 192) 0 batch_normalization_40[0][0] __________________________________________________________________________________________________ activation_43 (Activation) (None, 17, 17, 192) 0 batch_normalization_43[0][0] __________________________________________________________________________________________________ activation_48 (Activation) (None, 17, 17, 
192) 0 batch_normalization_48[0][0] __________________________________________________________________________________________________ activation_49 (Activation) (None, 17, 17, 192) 0 batch_normalization_49[0][0] __________________________________________________________________________________________________ mixed5 (Concatenate) (None, 17, 17, 768) 0 activation_40[0][0] activation_43[0][0] activation_48[0][0] activation_49[0][0] __________________________________________________________________________________________________ conv2d_54 (Conv2D) (None, 17, 17, 160) 122880 mixed5[0][0] __________________________________________________________________________________________________ batch_normalization_54 (BatchNo (None, 17, 17, 160) 480 conv2d_54[0][0] __________________________________________________________________________________________________ activation_54 (Activation) (None, 17, 17, 160) 0 batch_normalization_54[0][0] __________________________________________________________________________________________________ conv2d_55 (Conv2D) (None, 17, 17, 160) 179200 activation_54[0][0] __________________________________________________________________________________________________ batch_normalization_55 (BatchNo (None, 17, 17, 160) 480 conv2d_55[0][0] __________________________________________________________________________________________________ activation_55 (Activation) (None, 17, 17, 160) 0 batch_normalization_55[0][0] __________________________________________________________________________________________________ conv2d_51 (Conv2D) (None, 17, 17, 160) 122880 mixed5[0][0] __________________________________________________________________________________________________ conv2d_56 (Conv2D) (None, 17, 17, 160) 179200 activation_55[0][0] __________________________________________________________________________________________________ batch_normalization_51 (BatchNo (None, 17, 17, 160) 480 conv2d_51[0][0] __________________________________________________________________________________________________ batch_normalization_56 (BatchNo (None, 17, 17, 160) 480 conv2d_56[0][0] __________________________________________________________________________________________________ activation_51 (Activation) (None, 17, 17, 160) 0 batch_normalization_51[0][0] __________________________________________________________________________________________________ activation_56 (Activation) (None, 17, 17, 160) 0 batch_normalization_56[0][0] __________________________________________________________________________________________________ conv2d_52 (Conv2D) (None, 17, 17, 160) 179200 activation_51[0][0] __________________________________________________________________________________________________ conv2d_57 (Conv2D) (None, 17, 17, 160) 179200 activation_56[0][0] __________________________________________________________________________________________________ batch_normalization_52 (BatchNo (None, 17, 17, 160) 480 conv2d_52[0][0] __________________________________________________________________________________________________ batch_normalization_57 (BatchNo (None, 17, 17, 160) 480 conv2d_57[0][0] __________________________________________________________________________________________________ activation_52 (Activation) (None, 17, 17, 160) 0 batch_normalization_52[0][0] __________________________________________________________________________________________________ activation_57 (Activation) (None, 17, 17, 160) 0 batch_normalization_57[0][0] 
__________________________________________________________________________________________________ average_pooling2d_5 (AveragePoo (None, 17, 17, 768) 0 mixed5[0][0] __________________________________________________________________________________________________ conv2d_50 (Conv2D) (None, 17, 17, 192) 147456 mixed5[0][0] __________________________________________________________________________________________________ conv2d_53 (Conv2D) (None, 17, 17, 192) 215040 activation_52[0][0] __________________________________________________________________________________________________ conv2d_58 (Conv2D) (None, 17, 17, 192) 215040 activation_57[0][0] __________________________________________________________________________________________________ conv2d_59 (Conv2D) (None, 17, 17, 192) 147456 average_pooling2d_5[0][0] __________________________________________________________________________________________________ batch_normalization_50 (BatchNo (None, 17, 17, 192) 576 conv2d_50[0][0] __________________________________________________________________________________________________ batch_normalization_53 (BatchNo (None, 17, 17, 192) 576 conv2d_53[0][0] __________________________________________________________________________________________________ batch_normalization_58 (BatchNo (None, 17, 17, 192) 576 conv2d_58[0][0] __________________________________________________________________________________________________ batch_normalization_59 (BatchNo (None, 17, 17, 192) 576 conv2d_59[0][0] __________________________________________________________________________________________________ activation_50 (Activation) (None, 17, 17, 192) 0 batch_normalization_50[0][0] __________________________________________________________________________________________________ activation_53 (Activation) (None, 17, 17, 192) 0 batch_normalization_53[0][0] __________________________________________________________________________________________________ activation_58 (Activation) (None, 17, 17, 192) 0 batch_normalization_58[0][0] __________________________________________________________________________________________________ activation_59 (Activation) (None, 17, 17, 192) 0 batch_normalization_59[0][0] __________________________________________________________________________________________________ mixed6 (Concatenate) (None, 17, 17, 768) 0 activation_50[0][0] activation_53[0][0] activation_58[0][0] activation_59[0][0] __________________________________________________________________________________________________ conv2d_64 (Conv2D) (None, 17, 17, 192) 147456 mixed6[0][0] __________________________________________________________________________________________________ batch_normalization_64 (BatchNo (None, 17, 17, 192) 576 conv2d_64[0][0] __________________________________________________________________________________________________ activation_64 (Activation) (None, 17, 17, 192) 0 batch_normalization_64[0][0] __________________________________________________________________________________________________ conv2d_65 (Conv2D) (None, 17, 17, 192) 258048 activation_64[0][0] __________________________________________________________________________________________________ batch_normalization_65 (BatchNo (None, 17, 17, 192) 576 conv2d_65[0][0] __________________________________________________________________________________________________ activation_65 (Activation) (None, 17, 17, 192) 0 batch_normalization_65[0][0] 
__________________________________________________________________________________________________ conv2d_61 (Conv2D) (None, 17, 17, 192) 147456 mixed6[0][0] __________________________________________________________________________________________________ conv2d_66 (Conv2D) (None, 17, 17, 192) 258048 activation_65[0][0] __________________________________________________________________________________________________ batch_normalization_61 (BatchNo (None, 17, 17, 192) 576 conv2d_61[0][0] __________________________________________________________________________________________________ batch_normalization_66 (BatchNo (None, 17, 17, 192) 576 conv2d_66[0][0] __________________________________________________________________________________________________ activation_61 (Activation) (None, 17, 17, 192) 0 batch_normalization_61[0][0] __________________________________________________________________________________________________ activation_66 (Activation) (None, 17, 17, 192) 0 batch_normalization_66[0][0] __________________________________________________________________________________________________ conv2d_62 (Conv2D) (None, 17, 17, 192) 258048 activation_61[0][0] __________________________________________________________________________________________________ conv2d_67 (Conv2D) (None, 17, 17, 192) 258048 activation_66[0][0] __________________________________________________________________________________________________ batch_normalization_62 (BatchNo (None, 17, 17, 192) 576 conv2d_62[0][0] __________________________________________________________________________________________________ batch_normalization_67 (BatchNo (None, 17, 17, 192) 576 conv2d_67[0][0] __________________________________________________________________________________________________ activation_62 (Activation) (None, 17, 17, 192) 0 batch_normalization_62[0][0] __________________________________________________________________________________________________ activation_67 (Activation) (None, 17, 17, 192) 0 batch_normalization_67[0][0] __________________________________________________________________________________________________ average_pooling2d_6 (AveragePoo (None, 17, 17, 768) 0 mixed6[0][0] __________________________________________________________________________________________________ conv2d_60 (Conv2D) (None, 17, 17, 192) 147456 mixed6[0][0] __________________________________________________________________________________________________ conv2d_63 (Conv2D) (None, 17, 17, 192) 258048 activation_62[0][0] __________________________________________________________________________________________________ conv2d_68 (Conv2D) (None, 17, 17, 192) 258048 activation_67[0][0] __________________________________________________________________________________________________ conv2d_69 (Conv2D) (None, 17, 17, 192) 147456 average_pooling2d_6[0][0] __________________________________________________________________________________________________ batch_normalization_60 (BatchNo (None, 17, 17, 192) 576 conv2d_60[0][0] __________________________________________________________________________________________________ batch_normalization_63 (BatchNo (None, 17, 17, 192) 576 conv2d_63[0][0] __________________________________________________________________________________________________ batch_normalization_68 (BatchNo (None, 17, 17, 192) 576 conv2d_68[0][0] __________________________________________________________________________________________________ batch_normalization_69 (BatchNo (None, 17, 17, 192) 576 conv2d_69[0][0] 
__________________________________________________________________________________________________ activation_60 (Activation) (None, 17, 17, 192) 0 batch_normalization_60[0][0] __________________________________________________________________________________________________ activation_63 (Activation) (None, 17, 17, 192) 0 batch_normalization_63[0][0] __________________________________________________________________________________________________ activation_68 (Activation) (None, 17, 17, 192) 0 batch_normalization_68[0][0] __________________________________________________________________________________________________ activation_69 (Activation) (None, 17, 17, 192) 0 batch_normalization_69[0][0] __________________________________________________________________________________________________ mixed7 (Concatenate) (None, 17, 17, 768) 0 activation_60[0][0] activation_63[0][0] activation_68[0][0] activation_69[0][0] __________________________________________________________________________________________________ conv2d_72 (Conv2D) (None, 17, 17, 192) 147456 mixed7[0][0] __________________________________________________________________________________________________ batch_normalization_72 (BatchNo (None, 17, 17, 192) 576 conv2d_72[0][0] __________________________________________________________________________________________________ activation_72 (Activation) (None, 17, 17, 192) 0 batch_normalization_72[0][0] __________________________________________________________________________________________________ conv2d_73 (Conv2D) (None, 17, 17, 192) 258048 activation_72[0][0] __________________________________________________________________________________________________ batch_normalization_73 (BatchNo (None, 17, 17, 192) 576 conv2d_73[0][0] __________________________________________________________________________________________________ activation_73 (Activation) (None, 17, 17, 192) 0 batch_normalization_73[0][0] __________________________________________________________________________________________________ conv2d_70 (Conv2D) (None, 17, 17, 192) 147456 mixed7[0][0] __________________________________________________________________________________________________ conv2d_74 (Conv2D) (None, 17, 17, 192) 258048 activation_73[0][0] __________________________________________________________________________________________________ batch_normalization_70 (BatchNo (None, 17, 17, 192) 576 conv2d_70[0][0] __________________________________________________________________________________________________ batch_normalization_74 (BatchNo (None, 17, 17, 192) 576 conv2d_74[0][0] __________________________________________________________________________________________________ activation_70 (Activation) (None, 17, 17, 192) 0 batch_normalization_70[0][0] __________________________________________________________________________________________________ activation_74 (Activation) (None, 17, 17, 192) 0 batch_normalization_74[0][0] __________________________________________________________________________________________________ conv2d_71 (Conv2D) (None, 8, 8, 320) 552960 activation_70[0][0] __________________________________________________________________________________________________ conv2d_75 (Conv2D) (None, 8, 8, 192) 331776 activation_74[0][0] __________________________________________________________________________________________________ batch_normalization_71 (BatchNo (None, 8, 8, 320) 960 conv2d_71[0][0] 
__________________________________________________________________________________________________ batch_normalization_75 (BatchNo (None, 8, 8, 192) 576 conv2d_75[0][0] __________________________________________________________________________________________________ activation_71 (Activation) (None, 8, 8, 320) 0 batch_normalization_71[0][0] __________________________________________________________________________________________________ activation_75 (Activation) (None, 8, 8, 192) 0 batch_normalization_75[0][0] __________________________________________________________________________________________________ max_pooling2d_3 (MaxPooling2D) (None, 8, 8, 768) 0 mixed7[0][0] __________________________________________________________________________________________________ mixed8 (Concatenate) (None, 8, 8, 1280) 0 activation_71[0][0] activation_75[0][0] max_pooling2d_3[0][0] __________________________________________________________________________________________________ conv2d_80 (Conv2D) (None, 8, 8, 448) 573440 mixed8[0][0] __________________________________________________________________________________________________ batch_normalization_80 (BatchNo (None, 8, 8, 448) 1344 conv2d_80[0][0] __________________________________________________________________________________________________ activation_80 (Activation) (None, 8, 8, 448) 0 batch_normalization_80[0][0] __________________________________________________________________________________________________ conv2d_77 (Conv2D) (None, 8, 8, 384) 491520 mixed8[0][0] __________________________________________________________________________________________________ conv2d_81 (Conv2D) (None, 8, 8, 384) 1548288 activation_80[0][0] __________________________________________________________________________________________________ batch_normalization_77 (BatchNo (None, 8, 8, 384) 1152 conv2d_77[0][0] __________________________________________________________________________________________________ batch_normalization_81 (BatchNo (None, 8, 8, 384) 1152 conv2d_81[0][0] __________________________________________________________________________________________________ activation_77 (Activation) (None, 8, 8, 384) 0 batch_normalization_77[0][0] __________________________________________________________________________________________________ activation_81 (Activation) (None, 8, 8, 384) 0 batch_normalization_81[0][0] __________________________________________________________________________________________________ conv2d_78 (Conv2D) (None, 8, 8, 384) 442368 activation_77[0][0] __________________________________________________________________________________________________ conv2d_79 (Conv2D) (None, 8, 8, 384) 442368 activation_77[0][0] __________________________________________________________________________________________________ conv2d_82 (Conv2D) (None, 8, 8, 384) 442368 activation_81[0][0] __________________________________________________________________________________________________ conv2d_83 (Conv2D) (None, 8, 8, 384) 442368 activation_81[0][0] __________________________________________________________________________________________________ average_pooling2d_7 (AveragePoo (None, 8, 8, 1280) 0 mixed8[0][0] __________________________________________________________________________________________________ conv2d_76 (Conv2D) (None, 8, 8, 320) 409600 mixed8[0][0] __________________________________________________________________________________________________ batch_normalization_78 (BatchNo (None, 8, 8, 384) 1152 conv2d_78[0][0] 
__________________________________________________________________________________________________ batch_normalization_79 (BatchNo (None, 8, 8, 384) 1152 conv2d_79[0][0] __________________________________________________________________________________________________ batch_normalization_82 (BatchNo (None, 8, 8, 384) 1152 conv2d_82[0][0] __________________________________________________________________________________________________ batch_normalization_83 (BatchNo (None, 8, 8, 384) 1152 conv2d_83[0][0] __________________________________________________________________________________________________ conv2d_84 (Conv2D) (None, 8, 8, 192) 245760 average_pooling2d_7[0][0] __________________________________________________________________________________________________ batch_normalization_76 (BatchNo (None, 8, 8, 320) 960 conv2d_76[0][0] __________________________________________________________________________________________________ activation_78 (Activation) (None, 8, 8, 384) 0 batch_normalization_78[0][0] __________________________________________________________________________________________________ activation_79 (Activation) (None, 8, 8, 384) 0 batch_normalization_79[0][0] __________________________________________________________________________________________________ activation_82 (Activation) (None, 8, 8, 384) 0 batch_normalization_82[0][0] __________________________________________________________________________________________________ activation_83 (Activation) (None, 8, 8, 384) 0 batch_normalization_83[0][0] __________________________________________________________________________________________________ batch_normalization_84 (BatchNo (None, 8, 8, 192) 576 conv2d_84[0][0] __________________________________________________________________________________________________ activation_76 (Activation) (None, 8, 8, 320) 0 batch_normalization_76[0][0] __________________________________________________________________________________________________ mixed9_0 (Concatenate) (None, 8, 8, 768) 0 activation_78[0][0] activation_79[0][0] __________________________________________________________________________________________________ concatenate (Concatenate) (None, 8, 8, 768) 0 activation_82[0][0] activation_83[0][0] __________________________________________________________________________________________________ activation_84 (Activation) (None, 8, 8, 192) 0 batch_normalization_84[0][0] __________________________________________________________________________________________________ mixed9 (Concatenate) (None, 8, 8, 2048) 0 activation_76[0][0] mixed9_0[0][0] concatenate[0][0] activation_84[0][0] __________________________________________________________________________________________________ conv2d_89 (Conv2D) (None, 8, 8, 448) 917504 mixed9[0][0] __________________________________________________________________________________________________ batch_normalization_89 (BatchNo (None, 8, 8, 448) 1344 conv2d_89[0][0] __________________________________________________________________________________________________ activation_89 (Activation) (None, 8, 8, 448) 0 batch_normalization_89[0][0] __________________________________________________________________________________________________ conv2d_86 (Conv2D) (None, 8, 8, 384) 786432 mixed9[0][0] __________________________________________________________________________________________________ conv2d_90 (Conv2D) (None, 8, 8, 384) 1548288 activation_89[0][0] 
__________________________________________________________________________________________________ batch_normalization_86 (BatchNo (None, 8, 8, 384) 1152 conv2d_86[0][0] __________________________________________________________________________________________________ batch_normalization_90 (BatchNo (None, 8, 8, 384) 1152 conv2d_90[0][0] __________________________________________________________________________________________________ activation_86 (Activation) (None, 8, 8, 384) 0 batch_normalization_86[0][0] __________________________________________________________________________________________________ activation_90 (Activation) (None, 8, 8, 384) 0 batch_normalization_90[0][0] __________________________________________________________________________________________________ conv2d_87 (Conv2D) (None, 8, 8, 384) 442368 activation_86[0][0] __________________________________________________________________________________________________ conv2d_88 (Conv2D) (None, 8, 8, 384) 442368 activation_86[0][0] __________________________________________________________________________________________________ conv2d_91 (Conv2D) (None, 8, 8, 384) 442368 activation_90[0][0] __________________________________________________________________________________________________ conv2d_92 (Conv2D) (None, 8, 8, 384) 442368 activation_90[0][0] __________________________________________________________________________________________________ average_pooling2d_8 (AveragePoo (None, 8, 8, 2048) 0 mixed9[0][0] __________________________________________________________________________________________________ conv2d_85 (Conv2D) (None, 8, 8, 320) 655360 mixed9[0][0] __________________________________________________________________________________________________ batch_normalization_87 (BatchNo (None, 8, 8, 384) 1152 conv2d_87[0][0] __________________________________________________________________________________________________ batch_normalization_88 (BatchNo (None, 8, 8, 384) 1152 conv2d_88[0][0] __________________________________________________________________________________________________ batch_normalization_91 (BatchNo (None, 8, 8, 384) 1152 conv2d_91[0][0] __________________________________________________________________________________________________ batch_normalization_92 (BatchNo (None, 8, 8, 384) 1152 conv2d_92[0][0] __________________________________________________________________________________________________ conv2d_93 (Conv2D) (None, 8, 8, 192) 393216 average_pooling2d_8[0][0] __________________________________________________________________________________________________ batch_normalization_85 (BatchNo (None, 8, 8, 320) 960 conv2d_85[0][0] __________________________________________________________________________________________________ activation_87 (Activation) (None, 8, 8, 384) 0 batch_normalization_87[0][0] __________________________________________________________________________________________________ activation_88 (Activation) (None, 8, 8, 384) 0 batch_normalization_88[0][0] __________________________________________________________________________________________________ activation_91 (Activation) (None, 8, 8, 384) 0 batch_normalization_91[0][0] __________________________________________________________________________________________________ activation_92 (Activation) (None, 8, 8, 384) 0 batch_normalization_92[0][0] __________________________________________________________________________________________________ batch_normalization_93 (BatchNo (None, 8, 8, 192) 576 conv2d_93[0][0] 
__________________________________________________________________________________________________ activation_85 (Activation) (None, 8, 8, 320) 0 batch_normalization_85[0][0] __________________________________________________________________________________________________ mixed9_1 (Concatenate) (None, 8, 8, 768) 0 activation_87[0][0] activation_88[0][0] __________________________________________________________________________________________________ concatenate_1 (Concatenate) (None, 8, 8, 768) 0 activation_91[0][0] activation_92[0][0] __________________________________________________________________________________________________ activation_93 (Activation) (None, 8, 8, 192) 0 batch_normalization_93[0][0] __________________________________________________________________________________________________ mixed10 (Concatenate) (None, 8, 8, 2048) 0 activation_85[0][0] mixed9_1[0][0] concatenate_1[0][0] activation_93[0][0] __________________________________________________________________________________________________ avg_pool (GlobalAveragePooling2 (None, 2048) 0 mixed10[0][0] __________________________________________________________________________________________________ predictions (Dense) (None, 1000) 2049000 avg_pool[0][0] ================================================================================================== Total params: 23,851,784 Trainable params: 23,817,352 Non-trainable params: 34,432 __________________________________________________________________________________________________
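Before dropping the classification head, the full InceptionV3 model can be used to generate ImageNet predictions. The following is a minimal, self-contained sketch rather than part of the original notebook (variable names such as full_model and inception_input are illustrative); note that, unlike VGG16, InceptionV3 expects 299x299 inputs and its own preprocess_input, so the sample images are reloaded at that resolution.

from tensorflow.keras.applications.inception_v3 import preprocess_input as inception_preprocess
from tensorflow.keras.applications.inception_v3 import decode_predictions

def inception_tensor(img_path, target_size=(299, 299)):
    # load a single image at the resolution InceptionV3 was trained on
    img = image.load_img(img_path, target_size=target_size)
    return np.expand_dims(image.img_to_array(img), axis=0)

# rebuild the sample batch with InceptionV3's own preprocessing
inception_input = inception_preprocess(
    np.vstack([inception_tensor(p) for p in Path('images/img_input').glob('*.jpg')]))

full_model = InceptionV3()                   # include_top=True by default
preds = full_model.predict(inception_input)  # one 1,000-class probability vector per image
for top3 in decode_predictions(preds, top=3):
    print(top3)                              # list of (class_id, class_name, probability) tuples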

Without the final classification layer

Passing include_top=False drops the avg_pool and predictions layers shown in the summary above, so the network outputs the last convolutional feature maps, i.e. the bottleneck features, instead of class probabilities.
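As a minimal sketch (not part of the original notebook), the bottleneck activations of such a headless model can be collapsed into one 2,048-dimensional vector per image: the optional pooling='avg' constructor argument appends a global average-pooling layer on top of the convolutional base. The names used here are illustrative; the notebook's own cell, which loads the headless model and prints its summary, follows right after.

from tensorflow.keras.applications.inception_v3 import preprocess_input as inception_preprocess

# build a 299x299 batch preprocessed for InceptionV3 (see the sketch above)
sample_batch = inception_preprocess(
    np.vstack([np.expand_dims(image.img_to_array(image.load_img(p, target_size=(299, 299))), axis=0)
               for p in Path('images/img_input').glob('*.jpg')]))

bottleneck_model = InceptionV3(include_top=False, pooling='avg')  # global average pooling on top
bottleneck_features = bottleneck_model.predict(sample_batch)
print(bottleneck_features.shape)  # expected: (number of images, 2048)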

inception = InceptionV3(include_top=False) inception.summary()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/inception_v3/inception_v3_weights_tf_dim_ordering_tf_kernels_notop.h5 87916544/87910968 [==============================] - 1s 0us/step Model: "inception_v3" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_6 (InputLayer) [(None, None, None, 0 __________________________________________________________________________________________________ conv2d_94 (Conv2D) (None, None, None, 3 864 input_6[0][0] __________________________________________________________________________________________________ batch_normalization_94 (BatchNo (None, None, None, 3 96 conv2d_94[0][0] __________________________________________________________________________________________________ activation_94 (Activation) (None, None, None, 3 0 batch_normalization_94[0][0] __________________________________________________________________________________________________ conv2d_95 (Conv2D) (None, None, None, 3 9216 activation_94[0][0] __________________________________________________________________________________________________ batch_normalization_95 (BatchNo (None, None, None, 3 96 conv2d_95[0][0] __________________________________________________________________________________________________ activation_95 (Activation) (None, None, None, 3 0 batch_normalization_95[0][0] __________________________________________________________________________________________________ conv2d_96 (Conv2D) (None, None, None, 6 18432 activation_95[0][0] __________________________________________________________________________________________________ batch_normalization_96 (BatchNo (None, None, None, 6 192 conv2d_96[0][0] __________________________________________________________________________________________________ activation_96 (Activation) (None, None, None, 6 0 batch_normalization_96[0][0] __________________________________________________________________________________________________ max_pooling2d_4 (MaxPooling2D) (None, None, None, 6 0 activation_96[0][0] __________________________________________________________________________________________________ conv2d_97 (Conv2D) (None, None, None, 8 5120 max_pooling2d_4[0][0] __________________________________________________________________________________________________ batch_normalization_97 (BatchNo (None, None, None, 8 240 conv2d_97[0][0] __________________________________________________________________________________________________ activation_97 (Activation) (None, None, None, 8 0 batch_normalization_97[0][0] __________________________________________________________________________________________________ conv2d_98 (Conv2D) (None, None, None, 1 138240 activation_97[0][0] __________________________________________________________________________________________________ batch_normalization_98 (BatchNo (None, None, None, 1 576 conv2d_98[0][0] __________________________________________________________________________________________________ activation_98 (Activation) (None, None, None, 1 0 batch_normalization_98[0][0] __________________________________________________________________________________________________ max_pooling2d_5 (MaxPooling2D) (None, None, None, 1 0 activation_98[0][0] __________________________________________________________________________________________________ conv2d_102 
(Conv2D) (None, None, None, 6 12288 max_pooling2d_5[0][0] __________________________________________________________________________________________________ batch_normalization_102 (BatchN (None, None, None, 6 192 conv2d_102[0][0] __________________________________________________________________________________________________ activation_102 (Activation) (None, None, None, 6 0 batch_normalization_102[0][0] __________________________________________________________________________________________________ conv2d_100 (Conv2D) (None, None, None, 4 9216 max_pooling2d_5[0][0] __________________________________________________________________________________________________ conv2d_103 (Conv2D) (None, None, None, 9 55296 activation_102[0][0] __________________________________________________________________________________________________ batch_normalization_100 (BatchN (None, None, None, 4 144 conv2d_100[0][0] __________________________________________________________________________________________________ batch_normalization_103 (BatchN (None, None, None, 9 288 conv2d_103[0][0] __________________________________________________________________________________________________ activation_100 (Activation) (None, None, None, 4 0 batch_normalization_100[0][0] __________________________________________________________________________________________________ activation_103 (Activation) (None, None, None, 9 0 batch_normalization_103[0][0] __________________________________________________________________________________________________ average_pooling2d_9 (AveragePoo (None, None, None, 1 0 max_pooling2d_5[0][0] __________________________________________________________________________________________________ conv2d_99 (Conv2D) (None, None, None, 6 12288 max_pooling2d_5[0][0] __________________________________________________________________________________________________ conv2d_101 (Conv2D) (None, None, None, 6 76800 activation_100[0][0] __________________________________________________________________________________________________ conv2d_104 (Conv2D) (None, None, None, 9 82944 activation_103[0][0] __________________________________________________________________________________________________ conv2d_105 (Conv2D) (None, None, None, 3 6144 average_pooling2d_9[0][0] __________________________________________________________________________________________________ batch_normalization_99 (BatchNo (None, None, None, 6 192 conv2d_99[0][0] __________________________________________________________________________________________________ batch_normalization_101 (BatchN (None, None, None, 6 192 conv2d_101[0][0] __________________________________________________________________________________________________ batch_normalization_104 (BatchN (None, None, None, 9 288 conv2d_104[0][0] __________________________________________________________________________________________________ batch_normalization_105 (BatchN (None, None, None, 3 96 conv2d_105[0][0] __________________________________________________________________________________________________ activation_99 (Activation) (None, None, None, 6 0 batch_normalization_99[0][0] __________________________________________________________________________________________________ activation_101 (Activation) (None, None, None, 6 0 batch_normalization_101[0][0] __________________________________________________________________________________________________ activation_104 (Activation) (None, None, None, 9 0 batch_normalization_104[0][0] 
__________________________________________________________________________________________________ activation_105 (Activation) (None, None, None, 3 0 batch_normalization_105[0][0] __________________________________________________________________________________________________ mixed0 (Concatenate) (None, None, None, 2 0 activation_99[0][0] activation_101[0][0] activation_104[0][0] activation_105[0][0] __________________________________________________________________________________________________ conv2d_109 (Conv2D) (None, None, None, 6 16384 mixed0[0][0] __________________________________________________________________________________________________ batch_normalization_109 (BatchN (None, None, None, 6 192 conv2d_109[0][0] __________________________________________________________________________________________________ activation_109 (Activation) (None, None, None, 6 0 batch_normalization_109[0][0] __________________________________________________________________________________________________ conv2d_107 (Conv2D) (None, None, None, 4 12288 mixed0[0][0] __________________________________________________________________________________________________ conv2d_110 (Conv2D) (None, None, None, 9 55296 activation_109[0][0] __________________________________________________________________________________________________ batch_normalization_107 (BatchN (None, None, None, 4 144 conv2d_107[0][0] __________________________________________________________________________________________________ batch_normalization_110 (BatchN (None, None, None, 9 288 conv2d_110[0][0] __________________________________________________________________________________________________ activation_107 (Activation) (None, None, None, 4 0 batch_normalization_107[0][0] __________________________________________________________________________________________________ activation_110 (Activation) (None, None, None, 9 0 batch_normalization_110[0][0] __________________________________________________________________________________________________ average_pooling2d_10 (AveragePo (None, None, None, 2 0 mixed0[0][0] __________________________________________________________________________________________________ conv2d_106 (Conv2D) (None, None, None, 6 16384 mixed0[0][0] __________________________________________________________________________________________________ conv2d_108 (Conv2D) (None, None, None, 6 76800 activation_107[0][0] __________________________________________________________________________________________________ conv2d_111 (Conv2D) (None, None, None, 9 82944 activation_110[0][0] __________________________________________________________________________________________________ conv2d_112 (Conv2D) (None, None, None, 6 16384 average_pooling2d_10[0][0] __________________________________________________________________________________________________ batch_normalization_106 (BatchN (None, None, None, 6 192 conv2d_106[0][0] __________________________________________________________________________________________________ batch_normalization_108 (BatchN (None, None, None, 6 192 conv2d_108[0][0] __________________________________________________________________________________________________ batch_normalization_111 (BatchN (None, None, None, 9 288 conv2d_111[0][0] __________________________________________________________________________________________________ batch_normalization_112 (BatchN (None, None, None, 6 192 conv2d_112[0][0] 
__________________________________________________________________________________________________ activation_106 (Activation) (None, None, None, 6 0 batch_normalization_106[0][0] __________________________________________________________________________________________________ activation_108 (Activation) (None, None, None, 6 0 batch_normalization_108[0][0] __________________________________________________________________________________________________ activation_111 (Activation) (None, None, None, 9 0 batch_normalization_111[0][0] __________________________________________________________________________________________________ activation_112 (Activation) (None, None, None, 6 0 batch_normalization_112[0][0] __________________________________________________________________________________________________ mixed1 (Concatenate) (None, None, None, 2 0 activation_106[0][0] activation_108[0][0] activation_111[0][0] activation_112[0][0] __________________________________________________________________________________________________ conv2d_116 (Conv2D) (None, None, None, 6 18432 mixed1[0][0] __________________________________________________________________________________________________ batch_normalization_116 (BatchN (None, None, None, 6 192 conv2d_116[0][0] __________________________________________________________________________________________________ activation_116 (Activation) (None, None, None, 6 0 batch_normalization_116[0][0] __________________________________________________________________________________________________ conv2d_114 (Conv2D) (None, None, None, 4 13824 mixed1[0][0] __________________________________________________________________________________________________ conv2d_117 (Conv2D) (None, None, None, 9 55296 activation_116[0][0] __________________________________________________________________________________________________ batch_normalization_114 (BatchN (None, None, None, 4 144 conv2d_114[0][0] __________________________________________________________________________________________________ batch_normalization_117 (BatchN (None, None, None, 9 288 conv2d_117[0][0] __________________________________________________________________________________________________ activation_114 (Activation) (None, None, None, 4 0 batch_normalization_114[0][0] __________________________________________________________________________________________________ activation_117 (Activation) (None, None, None, 9 0 batch_normalization_117[0][0] __________________________________________________________________________________________________ average_pooling2d_11 (AveragePo (None, None, None, 2 0 mixed1[0][0] __________________________________________________________________________________________________ conv2d_113 (Conv2D) (None, None, None, 6 18432 mixed1[0][0] __________________________________________________________________________________________________ conv2d_115 (Conv2D) (None, None, None, 6 76800 activation_114[0][0] __________________________________________________________________________________________________ conv2d_118 (Conv2D) (None, None, None, 9 82944 activation_117[0][0] __________________________________________________________________________________________________ conv2d_119 (Conv2D) (None, None, None, 6 18432 average_pooling2d_11[0][0] __________________________________________________________________________________________________ batch_normalization_113 (BatchN (None, None, None, 6 192 conv2d_113[0][0] 
__________________________________________________________________________________________________ batch_normalization_115 (BatchN (None, None, None, 6 192 conv2d_115[0][0] __________________________________________________________________________________________________ batch_normalization_118 (BatchN (None, None, None, 9 288 conv2d_118[0][0] __________________________________________________________________________________________________ batch_normalization_119 (BatchN (None, None, None, 6 192 conv2d_119[0][0] __________________________________________________________________________________________________ activation_113 (Activation) (None, None, None, 6 0 batch_normalization_113[0][0] __________________________________________________________________________________________________ activation_115 (Activation) (None, None, None, 6 0 batch_normalization_115[0][0] __________________________________________________________________________________________________ activation_118 (Activation) (None, None, None, 9 0 batch_normalization_118[0][0] __________________________________________________________________________________________________ activation_119 (Activation) (None, None, None, 6 0 batch_normalization_119[0][0] __________________________________________________________________________________________________ mixed2 (Concatenate) (None, None, None, 2 0 activation_113[0][0] activation_115[0][0] activation_118[0][0] activation_119[0][0] __________________________________________________________________________________________________ conv2d_121 (Conv2D) (None, None, None, 6 18432 mixed2[0][0] __________________________________________________________________________________________________ batch_normalization_121 (BatchN (None, None, None, 6 192 conv2d_121[0][0] __________________________________________________________________________________________________ activation_121 (Activation) (None, None, None, 6 0 batch_normalization_121[0][0] __________________________________________________________________________________________________ conv2d_122 (Conv2D) (None, None, None, 9 55296 activation_121[0][0] __________________________________________________________________________________________________ batch_normalization_122 (BatchN (None, None, None, 9 288 conv2d_122[0][0] __________________________________________________________________________________________________ activation_122 (Activation) (None, None, None, 9 0 batch_normalization_122[0][0] __________________________________________________________________________________________________ conv2d_120 (Conv2D) (None, None, None, 3 995328 mixed2[0][0] __________________________________________________________________________________________________ conv2d_123 (Conv2D) (None, None, None, 9 82944 activation_122[0][0] __________________________________________________________________________________________________ batch_normalization_120 (BatchN (None, None, None, 3 1152 conv2d_120[0][0] __________________________________________________________________________________________________ batch_normalization_123 (BatchN (None, None, None, 9 288 conv2d_123[0][0] __________________________________________________________________________________________________ activation_120 (Activation) (None, None, None, 3 0 batch_normalization_120[0][0] __________________________________________________________________________________________________ activation_123 (Activation) (None, None, None, 9 0 batch_normalization_123[0][0] 
__________________________________________________________________________________________________ max_pooling2d_6 (MaxPooling2D) (None, None, None, 2 0 mixed2[0][0] __________________________________________________________________________________________________ mixed3 (Concatenate) (None, None, None, 7 0 activation_120[0][0] activation_123[0][0] max_pooling2d_6[0][0] __________________________________________________________________________________________________ conv2d_128 (Conv2D) (None, None, None, 1 98304 mixed3[0][0] __________________________________________________________________________________________________ batch_normalization_128 (BatchN (None, None, None, 1 384 conv2d_128[0][0] __________________________________________________________________________________________________ activation_128 (Activation) (None, None, None, 1 0 batch_normalization_128[0][0] __________________________________________________________________________________________________ conv2d_129 (Conv2D) (None, None, None, 1 114688 activation_128[0][0] __________________________________________________________________________________________________ batch_normalization_129 (BatchN (None, None, None, 1 384 conv2d_129[0][0] __________________________________________________________________________________________________ activation_129 (Activation) (None, None, None, 1 0 batch_normalization_129[0][0] __________________________________________________________________________________________________ conv2d_125 (Conv2D) (None, None, None, 1 98304 mixed3[0][0] __________________________________________________________________________________________________ conv2d_130 (Conv2D) (None, None, None, 1 114688 activation_129[0][0] __________________________________________________________________________________________________ batch_normalization_125 (BatchN (None, None, None, 1 384 conv2d_125[0][0] __________________________________________________________________________________________________ batch_normalization_130 (BatchN (None, None, None, 1 384 conv2d_130[0][0] __________________________________________________________________________________________________ activation_125 (Activation) (None, None, None, 1 0 batch_normalization_125[0][0] __________________________________________________________________________________________________ activation_130 (Activation) (None, None, None, 1 0 batch_normalization_130[0][0] __________________________________________________________________________________________________ conv2d_126 (Conv2D) (None, None, None, 1 114688 activation_125[0][0] __________________________________________________________________________________________________ conv2d_131 (Conv2D) (None, None, None, 1 114688 activation_130[0][0] __________________________________________________________________________________________________ batch_normalization_126 (BatchN (None, None, None, 1 384 conv2d_126[0][0] __________________________________________________________________________________________________ batch_normalization_131 (BatchN (None, None, None, 1 384 conv2d_131[0][0] __________________________________________________________________________________________________ activation_126 (Activation) (None, None, None, 1 0 batch_normalization_126[0][0] __________________________________________________________________________________________________ activation_131 (Activation) (None, None, None, 1 0 batch_normalization_131[0][0] 
__________________________________________________________________________________________________ average_pooling2d_12 (AveragePo (None, None, None, 7 0 mixed3[0][0] __________________________________________________________________________________________________ conv2d_124 (Conv2D) (None, None, None, 1 147456 mixed3[0][0] __________________________________________________________________________________________________ conv2d_127 (Conv2D) (None, None, None, 1 172032 activation_126[0][0] __________________________________________________________________________________________________ conv2d_132 (Conv2D) (None, None, None, 1 172032 activation_131[0][0] __________________________________________________________________________________________________ conv2d_133 (Conv2D) (None, None, None, 1 147456 average_pooling2d_12[0][0] __________________________________________________________________________________________________ batch_normalization_124 (BatchN (None, None, None, 1 576 conv2d_124[0][0] __________________________________________________________________________________________________ batch_normalization_127 (BatchN (None, None, None, 1 576 conv2d_127[0][0] __________________________________________________________________________________________________ batch_normalization_132 (BatchN (None, None, None, 1 576 conv2d_132[0][0] __________________________________________________________________________________________________ batch_normalization_133 (BatchN (None, None, None, 1 576 conv2d_133[0][0] __________________________________________________________________________________________________ activation_124 (Activation) (None, None, None, 1 0 batch_normalization_124[0][0] __________________________________________________________________________________________________ activation_127 (Activation) (None, None, None, 1 0 batch_normalization_127[0][0] __________________________________________________________________________________________________ activation_132 (Activation) (None, None, None, 1 0 batch_normalization_132[0][0] __________________________________________________________________________________________________ activation_133 (Activation) (None, None, None, 1 0 batch_normalization_133[0][0] __________________________________________________________________________________________________ mixed4 (Concatenate) (None, None, None, 7 0 activation_124[0][0] activation_127[0][0] activation_132[0][0] activation_133[0][0] __________________________________________________________________________________________________ conv2d_138 (Conv2D) (None, None, None, 1 122880 mixed4[0][0] __________________________________________________________________________________________________ batch_normalization_138 (BatchN (None, None, None, 1 480 conv2d_138[0][0] __________________________________________________________________________________________________ activation_138 (Activation) (None, None, None, 1 0 batch_normalization_138[0][0] __________________________________________________________________________________________________ conv2d_139 (Conv2D) (None, None, None, 1 179200 activation_138[0][0] __________________________________________________________________________________________________ batch_normalization_139 (BatchN (None, None, None, 1 480 conv2d_139[0][0] __________________________________________________________________________________________________ activation_139 (Activation) (None, None, None, 1 0 batch_normalization_139[0][0] 
__________________________________________________________________________________________________ conv2d_135 (Conv2D) (None, None, None, 1 122880 mixed4[0][0] __________________________________________________________________________________________________ conv2d_140 (Conv2D) (None, None, None, 1 179200 activation_139[0][0] __________________________________________________________________________________________________ batch_normalization_135 (BatchN (None, None, None, 1 480 conv2d_135[0][0] __________________________________________________________________________________________________ batch_normalization_140 (BatchN (None, None, None, 1 480 conv2d_140[0][0] __________________________________________________________________________________________________ activation_135 (Activation) (None, None, None, 1 0 batch_normalization_135[0][0] __________________________________________________________________________________________________ activation_140 (Activation) (None, None, None, 1 0 batch_normalization_140[0][0] __________________________________________________________________________________________________ conv2d_136 (Conv2D) (None, None, None, 1 179200 activation_135[0][0] __________________________________________________________________________________________________ conv2d_141 (Conv2D) (None, None, None, 1 179200 activation_140[0][0] __________________________________________________________________________________________________ batch_normalization_136 (BatchN (None, None, None, 1 480 conv2d_136[0][0] __________________________________________________________________________________________________ batch_normalization_141 (BatchN (None, None, None, 1 480 conv2d_141[0][0] __________________________________________________________________________________________________ activation_136 (Activation) (None, None, None, 1 0 batch_normalization_136[0][0] __________________________________________________________________________________________________ activation_141 (Activation) (None, None, None, 1 0 batch_normalization_141[0][0] __________________________________________________________________________________________________ average_pooling2d_13 (AveragePo (None, None, None, 7 0 mixed4[0][0] __________________________________________________________________________________________________ conv2d_134 (Conv2D) (None, None, None, 1 147456 mixed4[0][0] __________________________________________________________________________________________________ conv2d_137 (Conv2D) (None, None, None, 1 215040 activation_136[0][0] __________________________________________________________________________________________________ conv2d_142 (Conv2D) (None, None, None, 1 215040 activation_141[0][0] __________________________________________________________________________________________________ conv2d_143 (Conv2D) (None, None, None, 1 147456 average_pooling2d_13[0][0] __________________________________________________________________________________________________ batch_normalization_134 (BatchN (None, None, None, 1 576 conv2d_134[0][0] __________________________________________________________________________________________________ batch_normalization_137 (BatchN (None, None, None, 1 576 conv2d_137[0][0] __________________________________________________________________________________________________ batch_normalization_142 (BatchN (None, None, None, 1 576 conv2d_142[0][0] __________________________________________________________________________________________________ batch_normalization_143 
(BatchN (None, None, None, 1 576 conv2d_143[0][0] __________________________________________________________________________________________________ activation_134 (Activation) (None, None, None, 1 0 batch_normalization_134[0][0] __________________________________________________________________________________________________ activation_137 (Activation) (None, None, None, 1 0 batch_normalization_137[0][0] __________________________________________________________________________________________________ activation_142 (Activation) (None, None, None, 1 0 batch_normalization_142[0][0] __________________________________________________________________________________________________ activation_143 (Activation) (None, None, None, 1 0 batch_normalization_143[0][0] __________________________________________________________________________________________________ mixed5 (Concatenate) (None, None, None, 7 0 activation_134[0][0] activation_137[0][0] activation_142[0][0] activation_143[0][0] __________________________________________________________________________________________________ conv2d_148 (Conv2D) (None, None, None, 1 122880 mixed5[0][0] __________________________________________________________________________________________________ batch_normalization_148 (BatchN (None, None, None, 1 480 conv2d_148[0][0] __________________________________________________________________________________________________ activation_148 (Activation) (None, None, None, 1 0 batch_normalization_148[0][0] __________________________________________________________________________________________________ conv2d_149 (Conv2D) (None, None, None, 1 179200 activation_148[0][0] __________________________________________________________________________________________________ batch_normalization_149 (BatchN (None, None, None, 1 480 conv2d_149[0][0] __________________________________________________________________________________________________ activation_149 (Activation) (None, None, None, 1 0 batch_normalization_149[0][0] __________________________________________________________________________________________________ conv2d_145 (Conv2D) (None, None, None, 1 122880 mixed5[0][0] __________________________________________________________________________________________________ conv2d_150 (Conv2D) (None, None, None, 1 179200 activation_149[0][0] __________________________________________________________________________________________________ batch_normalization_145 (BatchN (None, None, None, 1 480 conv2d_145[0][0] __________________________________________________________________________________________________ batch_normalization_150 (BatchN (None, None, None, 1 480 conv2d_150[0][0] __________________________________________________________________________________________________ activation_145 (Activation) (None, None, None, 1 0 batch_normalization_145[0][0] __________________________________________________________________________________________________ activation_150 (Activation) (None, None, None, 1 0 batch_normalization_150[0][0] __________________________________________________________________________________________________ conv2d_146 (Conv2D) (None, None, None, 1 179200 activation_145[0][0] __________________________________________________________________________________________________ conv2d_151 (Conv2D) (None, None, None, 1 179200 activation_150[0][0] __________________________________________________________________________________________________ batch_normalization_146 (BatchN (None, None, None, 1 
480 conv2d_146[0][0] __________________________________________________________________________________________________ batch_normalization_151 (BatchN (None, None, None, 1 480 conv2d_151[0][0] __________________________________________________________________________________________________ activation_146 (Activation) (None, None, None, 1 0 batch_normalization_146[0][0] __________________________________________________________________________________________________ activation_151 (Activation) (None, None, None, 1 0 batch_normalization_151[0][0] __________________________________________________________________________________________________ average_pooling2d_14 (AveragePo (None, None, None, 7 0 mixed5[0][0] __________________________________________________________________________________________________ conv2d_144 (Conv2D) (None, None, None, 1 147456 mixed5[0][0] __________________________________________________________________________________________________ conv2d_147 (Conv2D) (None, None, None, 1 215040 activation_146[0][0] __________________________________________________________________________________________________ conv2d_152 (Conv2D) (None, None, None, 1 215040 activation_151[0][0] __________________________________________________________________________________________________ conv2d_153 (Conv2D) (None, None, None, 1 147456 average_pooling2d_14[0][0] __________________________________________________________________________________________________ batch_normalization_144 (BatchN (None, None, None, 1 576 conv2d_144[0][0] __________________________________________________________________________________________________ batch_normalization_147 (BatchN (None, None, None, 1 576 conv2d_147[0][0] __________________________________________________________________________________________________ batch_normalization_152 (BatchN (None, None, None, 1 576 conv2d_152[0][0] __________________________________________________________________________________________________ batch_normalization_153 (BatchN (None, None, None, 1 576 conv2d_153[0][0] __________________________________________________________________________________________________ activation_144 (Activation) (None, None, None, 1 0 batch_normalization_144[0][0] __________________________________________________________________________________________________ activation_147 (Activation) (None, None, None, 1 0 batch_normalization_147[0][0] __________________________________________________________________________________________________ activation_152 (Activation) (None, None, None, 1 0 batch_normalization_152[0][0] __________________________________________________________________________________________________ activation_153 (Activation) (None, None, None, 1 0 batch_normalization_153[0][0] __________________________________________________________________________________________________ mixed6 (Concatenate) (None, None, None, 7 0 activation_144[0][0] activation_147[0][0] activation_152[0][0] activation_153[0][0] __________________________________________________________________________________________________ conv2d_158 (Conv2D) (None, None, None, 1 147456 mixed6[0][0] __________________________________________________________________________________________________ batch_normalization_158 (BatchN (None, None, None, 1 576 conv2d_158[0][0] __________________________________________________________________________________________________ activation_158 (Activation) (None, None, None, 1 0 batch_normalization_158[0][0] 
__________________________________________________________________________________________________ conv2d_159 (Conv2D) (None, None, None, 1 258048 activation_158[0][0] __________________________________________________________________________________________________ batch_normalization_159 (BatchN (None, None, None, 1 576 conv2d_159[0][0] __________________________________________________________________________________________________ activation_159 (Activation) (None, None, None, 1 0 batch_normalization_159[0][0] __________________________________________________________________________________________________ conv2d_155 (Conv2D) (None, None, None, 1 147456 mixed6[0][0] __________________________________________________________________________________________________ conv2d_160 (Conv2D) (None, None, None, 1 258048 activation_159[0][0] __________________________________________________________________________________________________ batch_normalization_155 (BatchN (None, None, None, 1 576 conv2d_155[0][0] __________________________________________________________________________________________________ batch_normalization_160 (BatchN (None, None, None, 1 576 conv2d_160[0][0] __________________________________________________________________________________________________ activation_155 (Activation) (None, None, None, 1 0 batch_normalization_155[0][0] __________________________________________________________________________________________________ activation_160 (Activation) (None, None, None, 1 0 batch_normalization_160[0][0] __________________________________________________________________________________________________ conv2d_156 (Conv2D) (None, None, None, 1 258048 activation_155[0][0] __________________________________________________________________________________________________ conv2d_161 (Conv2D) (None, None, None, 1 258048 activation_160[0][0] __________________________________________________________________________________________________ batch_normalization_156 (BatchN (None, None, None, 1 576 conv2d_156[0][0] __________________________________________________________________________________________________ batch_normalization_161 (BatchN (None, None, None, 1 576 conv2d_161[0][0] __________________________________________________________________________________________________ activation_156 (Activation) (None, None, None, 1 0 batch_normalization_156[0][0] __________________________________________________________________________________________________ activation_161 (Activation) (None, None, None, 1 0 batch_normalization_161[0][0] __________________________________________________________________________________________________ average_pooling2d_15 (AveragePo (None, None, None, 7 0 mixed6[0][0] __________________________________________________________________________________________________ conv2d_154 (Conv2D) (None, None, None, 1 147456 mixed6[0][0] __________________________________________________________________________________________________ conv2d_157 (Conv2D) (None, None, None, 1 258048 activation_156[0][0] __________________________________________________________________________________________________ conv2d_162 (Conv2D) (None, None, None, 1 258048 activation_161[0][0] __________________________________________________________________________________________________ conv2d_163 (Conv2D) (None, None, None, 1 147456 average_pooling2d_15[0][0] __________________________________________________________________________________________________ 
[summary output abridged: the intermediate rows of the mixed7–mixed10 Inception blocks, which repeat the Conv2D → BatchNormalization → Activation → Concatenate pattern, are omitted]
__________________________________________________________________________________________________
mixed10 (Concatenate)           (None, None, None, 2 0           activation_179[0][0]
                                                                 mixed9_1[0][0]
                                                                 concatenate_3[0][0]
                                                                 activation_187[0][0]
==================================================================================================
Total params: 21,802,784
Trainable params: 21,768,352
Non-trainable params: 34,432
__________________________________________________________________________________________________
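Note that this summary ends in a convolutional block (mixed10) rather than a softmax classifier; the layer names and parameter totals match Keras's InceptionV3 without its top layers, so the network's output is already the bottleneck feature map we are after. Below is a minimal sketch (not one of the original cells) of how those features could be extracted for the sample images; it reuses the paths_to_tensor helper defined earlier and rebuilds the input with Inception's own preprocess_input, which scales pixels to [-1, 1] rather than applying the VGG-style preprocessing used for img_input.

# Hedged sketch: extract bottleneck features from InceptionV3 without its classification head
from pathlib import Path
from tensorflow.keras.applications.inception_v3 import preprocess_input as inception_preprocess

# InceptionV3 expects its own preprocessing (pixels scaled to [-1, 1]), so we
# rebuild the input tensor from the raw images instead of reusing img_input
inception_input = inception_preprocess(
    paths_to_tensor(Path('images/img_input').glob('*.jpg')))

# include_top=False drops the fully connected layers; the model's output is then
# the final convolutional block (mixed10), i.e. the bottleneck features
inception_no_top = InceptionV3(weights='imagenet', include_top=False)
bottleneck_features = inception_no_top.predict(inception_input)
bottleneck_features.shape  # expected: (7, 5, 5, 2048) for seven 224x224 images

These arrays can then be fed to a small, quickly trainable classifier instead of retraining the full network, which is the point of extracting bottleneck features in the first place.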