Supported Model Formats

OpenVINO IR (Intermediate Representation) - the proprietary format of OpenVINO™; models in this format benefit from the full range of OpenVINO features.

ONNX, PaddlePaddle - formats supported directly, which means they can be used with OpenVINO Runtime without any prior conversion. For a guide on how to run inference on ONNX and PaddlePaddle models, see how to Integrate OpenVINO™ with Your Application.

TensorFlow, PyTorch, MXNet, Caffe, Kaldi - formats supported indirectly, which means they must first be converted to one of the formats listed above. Conversion from these formats to OpenVINO IR is performed with Model Optimizer. In some cases, other converters need to be used as intermediaries; for example, PyTorch models are typically exported to ONNX first.
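As a sketch of the indirect path, Model Optimizer's `mo` command converts a model to OpenVINO IR. The paths below are placeholders for your own model and output directory:

```shell
# Convert a TensorFlow SavedModel to OpenVINO IR (.xml + .bin).
mo --saved_model_dir ./my_model --output_dir ./ir_model

# An ONNX file (e.g. one exported from PyTorch) can be converted the same way.
mo --input_model ./model.onnx --output_dir ./ir_model
```

The resulting `.xml` (topology) and `.bin` (weights) pair is then loaded with the same Runtime API used for directly supported formats.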

Refer to the following articles for details on converting models from these formats: