GitHub Repository: tensorflow/docs-l10n
Path: blob/master/site/en-snapshot/lite/guide/model_analyzer.ipynb
#@title Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

TensorFlow Lite Model Analyzer

The TensorFlow Lite Model Analyzer API helps you analyze models in TensorFlow Lite format by listing a model's structure.

Model Analyzer API

The following API is available for the TensorFlow Lite Model Analyzer.

tf.lite.experimental.Analyzer.analyze(model_path=None, model_content=None, gpu_compatibility=False)

You can find the API details at https://www.tensorflow.org/api_docs/python/tf/lite/experimental/Analyzer or by running help(tf.lite.experimental.Analyzer.analyze) from a Python terminal.
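As the signature above shows, you can pass either an in-memory model via model_content or a file on disk via model_path. The following is a minimal sketch of the model_path variant, assuming TensorFlow is installed; the file name model.tflite is a placeholder chosen for illustration.

```python
import tensorflow as tf

# Build and convert a tiny Keras model (placeholder architecture).
model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(4, input_shape=(4,)),
])
fb_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Write the converted FlatBuffer to disk, then analyze it by path.
with open('model.tflite', 'wb') as f:
    f.write(fb_model)
tf.lite.experimental.Analyzer.analyze(model_path='model.tflite')
```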

Basic usage with simple Keras model

The following code shows basic usage of Model Analyzer. It lists the contents of a Keras model after conversion to TensorFlow Lite format, formatted as a FlatBuffer object.

import tensorflow as tf

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(128, 128)),
    tf.keras.layers.Dense(256, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10)
])
fb_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
tf.lite.experimental.Analyzer.analyze(model_content=fb_model)

Basic usage with MobileNetV3Large Keras model

This API works with large models such as MobileNetV3Large. Since the output is large, you might want to browse it with your favorite text editor.

model = tf.keras.applications.MobileNetV3Large()
fb_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
tf.lite.experimental.Analyzer.analyze(model_content=fb_model)
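Since analyze() prints its report to standard output, one way to browse a large report in a text editor is to capture it into a file first. This is a sketch using Python's standard contextlib.redirect_stdout, assuming TensorFlow is installed; a small model is used here to keep it quick, and the file name analysis.txt is a placeholder.

```python
import contextlib
import io

import tensorflow as tf

# Build and convert a small placeholder model.
model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(4, input_shape=(4,)),
])
fb_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Capture the analyzer's stdout report into a string buffer.
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    tf.lite.experimental.Analyzer.analyze(model_content=fb_model)

# Save the report so it can be opened in a text editor.
with open('analysis.txt', 'w') as f:
    f.write(buf.getvalue())
```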

Check GPU delegate compatibility

The Model Analyzer API provides a way to check the GPU delegate compatibility of the given model by passing the gpu_compatibility=True option.

Case 1: When the model is incompatible

The following code shows how to use the gpu_compatibility=True option for a simple tf.function that uses tf.slice with a 2D tensor and tf.cosh, which are not compatible with the GPU delegate.

You will see a GPU COMPATIBILITY WARNING for every node that has compatibility issues.

import tensorflow as tf

@tf.function(input_signature=[
    tf.TensorSpec(shape=[4, 4], dtype=tf.float32)
])
def func(x):
    return tf.cosh(x) + tf.slice(x, [1, 1], [1, 1])

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [func.get_concrete_function()], func)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
fb_model = converter.convert()
tf.lite.experimental.Analyzer.analyze(model_content=fb_model,
                                      gpu_compatibility=True)
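If you want to act on these warnings programmatically (for example, to flag a model during automated checks), one approach is to capture the printed report and count occurrences of the warning string, since analyze() writes to stdout rather than returning a result. This is a sketch under that assumption, reusing the incompatible model from above.

```python
import contextlib
import io

import tensorflow as tf

# Same incompatible model as above: tf.cosh and 2D tf.slice are not
# supported by the GPU delegate.
@tf.function(input_signature=[
    tf.TensorSpec(shape=[4, 4], dtype=tf.float32)
])
def func(x):
    return tf.cosh(x) + tf.slice(x, [1, 1], [1, 1])

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [func.get_concrete_function()], func)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
fb_model = converter.convert()

# Capture the report and count the per-node warning lines.
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    tf.lite.experimental.Analyzer.analyze(
        model_content=fb_model, gpu_compatibility=True)
num_warnings = buf.getvalue().count('GPU COMPATIBILITY WARNING')
print(f'{num_warnings} node(s) with GPU compatibility issues')
```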

Case 2: When the model is compatible

In this example, the given model is compatible with the GPU delegate.

Note: Even though the tool doesn't find any compatibility issues, that doesn't guarantee your model will work well with the GPU delegate on every device. Runtime incompatibilities can still occur, such as a missing CL_DEVICE_IMAGE_SUPPORT feature in the target OpenCL backend.

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(128, 128)),
    tf.keras.layers.Dense(256, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10)
])
fb_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
tf.lite.experimental.Analyzer.analyze(model_content=fb_model,
                                      gpu_compatibility=True)