OpenVINO™ Family

OpenVINO™ is a family of tools and utilities providing a comprehensive workflow for developing deep learning solutions. Learn more about each of them to unlock the full potential of OpenVINO™.

DL Workbench

An alternative, web-based version of OpenVINO designed to make the production of pretrained deep learning models significantly easier.

Docker Hub PyPI

DL Streamer

A streaming media analytics framework, based on the GStreamer* multimedia framework, for creating complex media analytics pipelines.

OpenVINO™ Toolkit installer Docker Hub GitHub
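As a rough illustration of what such a pipeline looks like, the sketch below assembles a gst-launch-style pipeline description in Python. The inference element names (`gvadetect`, `gvawatermark`) come from DL Streamer; the file and model paths are placeholders, not real assets:

```python
# Sketch of a DL Streamer media analytics pipeline description.
# File and model paths are placeholders; element names are from DL Streamer.
elements = [
    "filesrc location=input.mp4",          # read a video file
    "decodebin",                           # decode to raw frames
    "gvadetect model=face-detection.xml",  # DL Streamer inference element
    "gvawatermark",                        # draw detections on frames
    "autovideosink",                       # display the result
]
pipeline = " ! ".join(elements)  # GStreamer links elements with '!'
print(pipeline)
```

A string like this could be passed to `gst-launch-1.0`; real applications typically build the same chain programmatically through the GStreamer API.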

OpenVINO™ Model Server (OVMS)

A scalable, high-performance solution for serving deep learning models optimized for Intel architectures. The server uses Inference Engine libraries as a backend and exposes gRPC and HTTP/REST interfaces for inference that are fully compatible with TensorFlow Serving.

Docker Hub GitHub Red Hat Ecosystem Catalog
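Because the REST interface follows the TensorFlow Serving predict protocol, a client request is plain JSON. The sketch below builds such a request body with only the standard library; the model name, input name, port, and endpoint URL are hypothetical placeholders:

```python
import json

# Hypothetical model name and endpoint; the row-format "instances" layout
# follows the TensorFlow Serving REST predict protocol that OVMS mirrors.
model_name = "my_model"  # placeholder
url = f"http://localhost:9000/v1/models/{model_name}:predict"

body = json.dumps({
    "instances": [
        {"input": [0.0, 1.0, 2.0]},  # one inference request row
    ]
})
# An HTTP client (e.g. urllib or requests) would POST `body` to `url`
# and receive a JSON response with a "predictions" field.
print(url)
print(body)
```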

OpenVINO™ integration with TensorFlow (OVTF)

A solution empowering TensorFlow developers with OpenVINO's optimization capabilities. With just two lines of code in your application, you can offload inference to OpenVINO while keeping the TensorFlow API.

PyPI GitHub
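As a sketch, the two advertised lines are an import and a backend selection (API names as published in the project's README). The version below guards the import so the script still runs where the package is not installed, in which case inference simply stays on plain TensorFlow:

```python
# Hedged sketch of the OVTF "two lines of code" integration:
#
#     import openvino_tensorflow
#     openvino_tensorflow.set_backend("CPU")
#
# Guarded so the surrounding TensorFlow script runs either way.
requested_backend = "CPU"
try:
    import openvino_tensorflow
    openvino_tensorflow.set_backend(requested_backend)
    offloaded = True   # supported ops now run through OpenVINO
except ImportError:
    offloaded = False  # package absent: plain TensorFlow path, code unchanged
```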

Neural Network Compression Framework (NNCF)

A suite of advanced algorithms for neural network inference optimization with minimal accuracy drop. NNCF applies quantization, filter pruning, binarization, and sparsity algorithms to PyTorch and TensorFlow models during training.

PyPI GitHub
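As a hedged sketch, NNCF is driven by a JSON-style configuration that names the compression algorithm to apply. The fragment below shows the general shape; the field names follow NNCF's documented config format, and the input shape is a placeholder for an ImageNet-style image:

```python
import json

# Sketch of an NNCF configuration enabling quantization-aware training.
# The sample size is a placeholder: batch 1, 3-channel 224x224 image.
nncf_config = {
    "input_info": {"sample_size": [1, 3, 224, 224]},
    "compression": {"algorithm": "quantization"},
}
print(json.dumps(nncf_config, indent=2))
# In training code, a config like this is handed to NNCF (e.g. its
# create_compressed_model helper for PyTorch) along with the model.
```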

OpenVINO™ Training Extensions (OTE)

A convenient environment to train Deep Learning models and convert them using the OpenVINO™ toolkit for optimized inference.


Computer Vision Annotation Tool (CVAT)

An online, interactive video and image annotation tool for computer vision purposes.

Docker Hub GitHub web application

Dataset Management Framework (Datumaro)

A framework and CLI tool to build, transform, and analyze datasets.

PyPI GitHub

OpenVINO™ Security Add-on

A solution for Model Developers and Independent Software Vendors to use secure packaging and secure model execution.