Documentation

This section provides reference documents that guide you through developing your own deep learning applications with the OpenVINO™ toolkit. These documents will be most helpful if you have first gone through the Get Started guide.

Converting and Preparing Models

With the Model Downloader and Model Optimizer guides, you will learn to download pre-trained models and convert them for use with the OpenVINO™ toolkit. You can provide your own model or choose a public or Intel model from the broad selection available in the Open Model Zoo.
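As a quick sketch of that flow, the snippet below drives both tools from Python. It assumes a release that installs the omz_downloader and mo entry points on the PATH (older releases ship them as the downloader.py and mo.py scripts instead), and squeezenet1.1 is just one example of a public model; adjust names and paths to your install.

```python
# A minimal sketch of the download-and-convert flow, driven from Python.
import subprocess

# Fetch a public pre-trained model from the Open Model Zoo.
subprocess.run(
    ["omz_downloader", "--name", "squeezenet1.1", "--output_dir", "models"],
    check=True,
)

# Convert the downloaded model into OpenVINO IR format (.xml + .bin).
# The file layout under models/ mirrors what the downloader produces;
# adjust the path for other models or releases.
subprocess.run(
    [
        "mo",
        "--input_model", "models/public/squeezenet1.1/squeezenet1.1.caffemodel",
        "--output_dir", "models/ir",
    ],
    check=True,
)
```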

Deploying Inference

The Inference Engine Developer Guide explains the process of creating your own application that runs inference with the OpenVINO™ toolkit. The API Reference defines the Inference Engine API for Python, C++, and C, as well as the nGraph API for Python and C++. The Inference Engine API is what you’ll use to create an OpenVINO™ application, while the nGraph API gives you access to extended operation sets and other advanced features. After writing your application, you can use the Deployment Manager to deploy it to target devices.
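To make this concrete, here is a minimal sketch of synchronous inference with the Inference Engine Python API (the openvino.inference_engine module from pre-2022 releases); the IR file paths are placeholders, and the input is random data just to exercise the model.

```python
# A minimal sketch of inference with the Inference Engine Python API.
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()

# Read a model in OpenVINO IR format and compile it for a target device.
net = ie.read_network(model="models/ir/squeezenet1.1.xml",
                      weights="models/ir/squeezenet1.1.bin")
exec_net = ie.load_network(network=net, device_name="CPU")

# Look up the input and output names declared by the model.
input_name = next(iter(net.input_info))
output_name = next(iter(net.outputs))

# Run synchronous inference on random data shaped like the model input.
shape = net.input_info[input_name].input_data.shape
dummy = np.random.rand(*shape).astype(np.float32)
result = exec_net.infer(inputs={input_name: dummy})
print(result[output_name].shape)
```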

Tuning for Performance

The toolkit provides a Performance Optimization Guide and utilities for squeezing the best performance out of your application, including Accuracy Checker, Post-Training Optimization Tool, and other tools for measuring accuracy, benchmarking performance, and tuning your application.
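As a taste of what these utilities automate, the sketch below hand-rolls a crude latency measurement with the Inference Engine Python API. The bundled Benchmark Tool (benchmark_app) does this far more rigorously, with proper warm-up, asynchronous requests, and statistics, so treat this only as an illustration; the model paths repeat the placeholder IR files from the earlier example.

```python
# A rough latency measurement; a sketch only. The toolkit's Benchmark
# Tool (benchmark_app) measures performance far more rigorously.
import time
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="models/ir/squeezenet1.1.xml",
                      weights="models/ir/squeezenet1.1.bin")
exec_net = ie.load_network(network=net, device_name="CPU")

input_name = next(iter(net.input_info))
shape = net.input_info[input_name].input_data.shape
dummy = np.random.rand(*shape).astype(np.float32)

# Warm-up runs so one-time initialization costs don't skew the numbers.
for _ in range(10):
    exec_net.infer(inputs={input_name: dummy})

# Timed runs: report average latency over a fixed number of inferences.
n_runs = 100
start = time.perf_counter()
for _ in range(n_runs):
    exec_net.infer(inputs={input_name: dummy})
elapsed = time.perf_counter() - start
print(f"Average latency: {1000 * elapsed / n_runs:.2f} ms")
```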

Graphical Web Interface for OpenVINO™ Toolkit

You can choose to use the OpenVINO™ Deep Learning Workbench, a web-based tool that guides you through the process of converting, measuring, optimizing, and deploying models. This tool also serves as a low-effort introduction to the toolkit and provides a variety of useful interactive charts for understanding performance.

Media Processing

The OpenVINO™ toolkit comes with several sets of libraries and tools that add capability and flexibility. These include DL Streamer, a utility that eases the creation of media-analytics pipelines via the command line or an API, and optimized versions of OpenCV and OpenCL.
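For a flavor of the command-line route, the sketch below launches a hypothetical DL Streamer pipeline from Python by passing a GStreamer pipeline description to the gst-launch-1.0 tool. The gvadetect and gvawatermark elements come from DL Streamer's GStreamer plugins, while the video file and model path are placeholders.

```python
# A sketch of launching a DL Streamer pipeline; paths are placeholders.
import subprocess

# The pipeline: read a video file, decode it, run detection with
# gvadetect, overlay results with gvawatermark, and display the stream.
pipeline = (
    "filesrc location=input.mp4 ! decodebin ! "
    "gvadetect model=models/ir/face-detection.xml device=CPU ! "
    "gvawatermark ! videoconvert ! autovideosink"
)

# gst-launch-1.0 expects each pipeline token as a separate argument.
subprocess.run(["gst-launch-1.0", *pipeline.split()], check=True)
```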