Converting a TensorFlow Lite Model#
You can download a TensorFlow Lite model from
Kaggle
or Hugging Face.
To convert the model, run model conversion with the path to the .tflite
model file.

In Python:

import openvino as ov
ov.convert_model('your_model_file.tflite')

In the CLI:

ovc your_model_file.tflite
Note
A TensorFlow Lite model file can be loaded by the openvino.Core.read_model
or openvino.Core.compile_model
methods of the OpenVINO Runtime API without preparing
an OpenVINO IR first. Refer to the
inference example
for more details. Using openvino.convert_model
is still recommended if model
load latency matters for the inference application.
Supported TensorFlow Lite Layers#
For the list of supported standard layers, refer to the Supported Operations page.
Supported TensorFlow Lite Models#
More than eighty percent of public TensorFlow Lite models from the open sources Kaggle and MediaPipe are supported. Unsupported models usually contain custom TensorFlow Lite operations.