Copyright 2021 The TensorFlow Authors.
TFLite Authoring Tool
The TensorFlow Lite Authoring API provides a way to keep your tf.function models compatible with TensorFlow Lite.
Setup
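A minimal setup sketch: the rest of this guide only assumes TensorFlow itself is installed and importable.

```python
import tensorflow as tf

# The Authoring API ships with TensorFlow under tf.lite.experimental.
print(tf.__version__)
```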
TensorFlow to TensorFlow Lite compatibility issue
If you want to run your TF model on devices, you need to convert it to a TFLite model so it can be used by the TFLite interpreter. During the conversion, you might encounter a compatibility error caused by TensorFlow ops that are not supported by the TFLite builtin op set.
Catching this kind of issue only at conversion time is frustrating. How can you detect it earlier, at model authoring time?
Note that the following code will fail on the converter.convert() call.
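As a sketch of this failure mode, the example below uses tf.cosh, an op with no TFLite builtin equivalent, so the conversion raises an error (names like `f` and `conversion_error` are illustrative):

```python
import tensorflow as tf

@tf.function(input_signature=[
    tf.TensorSpec(shape=[None], dtype=tf.float32)
])
def f(x):
  return tf.cosh(x)  # tf.Cosh is not in the TFLite builtin op set

# Converting the concrete function fails because Cosh is unsupported
# and Select TF ops have not been enabled.
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [f.get_concrete_function()], f)

conversion_error = None
try:
  fb_model = converter.convert()
except Exception as e:
  conversion_error = e
  print(f"Got an exception: {e}")
```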
Simple Target Aware Authoring usage
We introduced the Authoring API to detect TensorFlow Lite compatibility issues at model authoring time.
You just need to add the @tf.lite.experimental.authoring.compatible decorator to wrap your tf.function model so it is checked for TFLite compatibility.
After this, the compatibility will be checked automatically when you evaluate your model.
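A minimal sketch of the decorator in use, again with tf.cosh as the incompatible op (the function name `f` is illustrative):

```python
import tensorflow as tf

# Wrap the tf.function with the Authoring API decorator; compatibility
# is checked automatically when the function is evaluated.
@tf.lite.experimental.authoring.compatible
@tf.function(input_signature=[
    tf.TensorSpec(shape=[None], dtype=tf.float32)
])
def f(x):
  return tf.cosh(x)  # expected to trigger a COMPATIBILITY WARNING

result = f(tf.constant([0.0]))
```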
If any TensorFlow Lite compatibility issue is found, the tool shows a COMPATIBILITY WARNING or COMPATIBILITY ERROR with the exact location of the problematic op. In this example, it shows the location of the tf.Cosh op in your tf.function model.
You can also check the compatibility log with the <function_name>.get_compatibility_log() method.
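A sketch of reading the log after evaluating a decorated model (assuming the same tf.cosh example; the name `f` is illustrative):

```python
import tensorflow as tf

@tf.lite.experimental.authoring.compatible
@tf.function(input_signature=[
    tf.TensorSpec(shape=[None], dtype=tf.float32)
])
def f(x):
  return tf.cosh(x)

result = f(tf.constant([0.0]))

# get_compatibility_log() returns the compatibility messages recorded
# during the evaluation above.
log_messages = f.get_compatibility_log()
for message in log_messages:
  print(message)
```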
Raise an exception for an incompatibility
You can pass options to the @tf.lite.experimental.authoring.compatible decorator. The raise_exception option raises an exception when you evaluate the decorated model and an incompatibility is found.
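A sketch of the raise_exception option with the same incompatible tf.cosh model (the names `f` and `caught` are illustrative):

```python
import tensorflow as tf

@tf.lite.experimental.authoring.compatible(raise_exception=True)
@tf.function(input_signature=[
    tf.TensorSpec(shape=[None], dtype=tf.float32)
])
def f(x):
  return tf.cosh(x)

caught = None
try:
  result = f(tf.constant([0.0]))
except Exception as e:  # raised because tf.Cosh is incompatible
  caught = e
  print(f"Got an exception: {e}")
```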
Specifying "Select TF ops" usage
If you already plan to use Select TF ops, you can tell the Authoring API by setting converter_target_spec. It's the same tf.lite.TargetSpec object you would pass to the tf.lite.TFLiteConverter API.
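A sketch of passing a TargetSpec that allows Select TF ops, so tf.cosh no longer counts as an incompatibility (the name `f` is illustrative):

```python
import tensorflow as tf

# Allow Select TF ops (Flex ops) in addition to the builtin op set.
target_spec = tf.lite.TargetSpec()
target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]

@tf.lite.experimental.authoring.compatible(converter_target_spec=target_spec)
@tf.function(input_signature=[
    tf.TensorSpec(shape=[None], dtype=tf.float32)
])
def f(x):
  return tf.cosh(x)  # acceptable now, since Select TF ops are enabled

result = f(tf.constant([0.0]))
```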
Checking GPU compatibility
If you want to ensure your model is compatible with the GPU delegate of TensorFlow Lite, you can set experimental_supported_backends of tf.lite.TargetSpec.
The following example shows how to check GPU delegate compatibility of your model. Note that this model has compatibility issues because it uses a 2D tensor with the tf.slice operator and the unsupported tf.cosh operator. You'll see two COMPATIBILITY WARNING messages with the location information.
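A sketch of the GPU compatibility check (the function name `func` is illustrative; both ops below are expected to be flagged for the GPU backend):

```python
import tensorflow as tf

# Restrict the compatibility check to the GPU delegate backend.
target_spec = tf.lite.TargetSpec()
target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
target_spec.experimental_supported_backends = ["GPU"]

@tf.lite.experimental.authoring.compatible(converter_target_spec=target_spec)
@tf.function(input_signature=[
    tf.TensorSpec(shape=[4, 4], dtype=tf.float32)
])
def func(x):
  y = tf.cosh(x)                          # not supported by the GPU delegate
  return y + tf.slice(x, [1, 1], [1, 1])  # 2D slicing is also flagged

result = func(tf.ones((4, 4), dtype=tf.float32))
```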
Read more
For more information, please refer to: