Converting a TensorFlow Lite Model¶
To convert a TensorFlow Lite model, run model conversion with the path to the
.tflite model file:

import openvino as ov
ov_model = ov.convert_model('your_model_file.tflite')
A TensorFlow Lite model file can also be loaded directly by the
openvino.Core.read_model or openvino.Core.compile_model methods of the OpenVINO Runtime API, without preparing OpenVINO IR first. Refer to the inference example for more details. Using
openvino.convert_model is still recommended if model load latency matters for the inference application.
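The direct-loading path described above can be sketched as a small helper. This is a sketch only: the helper name `compile_tflite`, the `device` default, and the placeholder model path are illustrative assumptions, not part of the OpenVINO API; it assumes the `openvino` package is installed.

```python
def compile_tflite(model_path: str, device: str = "CPU"):
    """Compile a .tflite model directly with the OpenVINO runtime.

    No OpenVINO IR is prepared first, so the .tflite file is converted
    on every load; prefer openvino.convert_model (optionally followed by
    openvino.save_model) when model load latency matters.
    """
    # Imported lazily so this module can be imported without openvino installed.
    import openvino as ov

    core = ov.Core()
    # Core.compile_model accepts the .tflite path directly,
    # skipping the explicit conversion step.
    return core.compile_model(model_path, device)
```

Calling `compile_tflite("your_model_file.tflite")` returns a compiled model ready for inference requests on the chosen device.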
Supported TensorFlow Lite Layers¶
For the list of supported standard layers, refer to the Supported Operations page.