Get Started with OpenVINO™ Toolkit on Raspbian* OS¶
The OpenVINO™ toolkit optimizes and runs Deep Learning Neural Network models on Intel® hardware. This guide helps you get started with the OpenVINO™ toolkit you installed on Raspbian* OS.
In this guide, you will:
Learn the OpenVINO™ inference workflow.
Build and run sample code using detailed instructions.
OpenVINO™ Toolkit Components¶
On Raspbian* OS, the OpenVINO™ toolkit consists of the following components:
Inference Engine: The software libraries that run inference against the Intermediate Representation (optimized model) to produce inference results.
MYRIAD Plugin: The plugin that enables inference of neural networks on the Intel® Neural Compute Stick 2.
The OpenVINO™ package for Raspberry* does not include the Model Optimizer. To convert models to Intermediate Representation (IR), you need to install it separately to your host machine.
The package does not include the Open Model Zoo demo applications. You can download them separately from the Open Model Zoo repository.
In addition, code samples are provided to help you get up and running with the toolkit.
Intel® Distribution of OpenVINO™ Toolkit Directory Structure¶
This guide assumes you completed all Intel® Distribution of OpenVINO™ toolkit installation and configuration steps. If you have not yet installed and configured the toolkit, see Install Intel® Distribution of OpenVINO™ toolkit for Raspbian*.
The OpenVINO™ toolkit for Raspbian* OS is distributed without an installer. This document refers to the directory to which you unpacked the toolkit package as the installation directory; the examples below assume it is `/opt/intel/openvino_2021`.
The primary tools for deploying your models and applications are installed to the `deployment_tools` directory (the subdirectory names below follow the standard OpenVINO™ 2021 package layout):

- `deployment_tools/inference_engine`: Inference Engine directory. Contains Inference Engine API binaries and source files, samples and extensions source files, and resources like hardware drivers.
  - `external`: Third-party dependencies and drivers.
  - `include`: Inference Engine header files. For API documentation, see the Inference Engine API Reference.
  - `lib`: Inference Engine libraries.
  - `samples`: Inference Engine samples. Contains source code for C++ and Python* samples and build scripts. See the Inference Engine Samples Overview.
  - `share`: CMake configuration files for linking with the Inference Engine.
OpenVINO™ Workflow Overview¶
The OpenVINO™ workflow on Raspbian* OS is as follows:
Get a pre-trained model for your inference task. If you want to use your own model, it must be converted to the Intermediate Representation (IR) format: a `.xml` topology file and a `.bin` weights file, which the Inference Engine takes as input. On Raspberry Pi, the OpenVINO™ toolkit includes only the Inference Engine module; the Model Optimizer is not supported on this platform. To get optimized models, use one of the following options:
Download a model that is already converted to the IR format. For more information on pre-trained models, see the Pre-Trained Models Documentation.
Convert a model using the Model Optimizer from a full installation of the Intel® Distribution of OpenVINO™ toolkit on one of the supported platforms. See the installation instructions for those platforms in the OpenVINO™ documentation.
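For the second option, a conversion run on the host machine can be sketched as follows. This is a hedged example, not a copy of any specific guide: the `mo.py` location assumes a full OpenVINO™ 2021 installation under `/opt/intel/openvino_2021`, the input model path is illustrative, and `--data_type FP16` is used because the MYRIAD plugin runs FP16 models.

```sh
# Run on the host machine with a full OpenVINO toolkit installation,
# not on the Raspberry Pi. The input model path is illustrative.
cd /opt/intel/openvino_2021/deployment_tools/model_optimizer
python3 mo.py --input_model ~/models/my_model.onnx \
              --data_type FP16 \
              --output_dir ~/ir
```

The resulting `.xml` and `.bin` files in the output directory can then be copied to the Raspberry Pi for inference.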
Use the Inference Engine API in the application to run inference against the Intermediate Representation (optimized model) and output inference results. The application can be an OpenVINO™ sample or your own application.
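With the Python* API shipped in the 2021 releases (`openvino.inference_engine`), this inference step can be sketched as below. The helper and function names are illustrative, and actually running `run_inference` requires the OpenVINO™ runtime and a supported device such as the Intel® Neural Compute Stick 2.

```python
from pathlib import Path

def ir_paths(model_xml):
    # An IR model is a pair of files, the .xml topology and the .bin
    # weights, expected side by side. (Helper name is illustrative.)
    xml = Path(model_xml)
    return xml, xml.with_suffix(".bin")

def run_inference(model_xml, input_blob, device="MYRIAD"):
    # Requires the OpenVINO 2021 runtime; IECore is the 2021.x entry point.
    from openvino.inference_engine import IECore
    ie = IECore()
    xml, weights = ir_paths(model_xml)
    net = ie.read_network(model=str(xml), weights=str(weights))
    exec_net = ie.load_network(network=net, device_name=device)
    input_name = next(iter(net.input_info))  # first input layer name
    return exec_net.infer({input_name: input_blob})
```

The C++ samples used later in this guide follow the same read-load-infer pattern.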
Build and Run Code Samples¶
Follow the steps below to run the pre-trained Face Detection network using the Inference Engine samples from the OpenVINO™ toolkit.
Create a samples build directory. This example uses a directory named `build`:

```sh
mkdir build && cd build
```
Build the Object Detection Sample with the following commands:

```sh
cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_CXX_FLAGS="-march=armv7-a" /opt/intel/openvino_2021/deployment_tools/inference_engine/samples/cpp
make -j2 object_detection_sample_ssd
```
Download the pre-trained Face Detection model with the Model Downloader tool:

```sh
git clone --depth 1 https://github.com/openvinotoolkit/open_model_zoo
cd open_model_zoo/tools/downloader
python3 -m pip install -r requirements.in
python3 downloader.py --name face-detection-adas-0001
```
Run the sample, specifying the model and the path to the input image:

```sh
./armv7l/Release/object_detection_sample_ssd -m face-detection-adas-0001.xml -d MYRIAD -i <path_to_image>
```
The application outputs an image (`out_0.bmp`) with detected faces enclosed in rectangles.
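Internally, an SSD-style network such as face-detection-adas-0001 produces a list of candidate boxes of the form `[image_id, label, confidence, x_min, y_min, x_max, y_max]`, with coordinates normalized to the 0..1 range, and the sample keeps only boxes above a confidence threshold before drawing rectangles. A minimal illustrative sketch of that filtering step (not the sample's actual code):

```python
def filter_detections(detections, threshold=0.5):
    # Each detection is [image_id, label, confidence, x_min, y_min,
    # x_max, y_max]; keep only boxes above the confidence threshold.
    return [d for d in detections if d[2] > threshold]
```

Scaling the surviving normalized coordinates by the image width and height gives the pixel rectangles seen in `out_0.bmp`.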
Basic Guidelines for Using Code Samples¶
Following are some basic guidelines for executing the OpenVINO™ workflow using the code samples:
Before using the OpenVINO™ samples, always set up the environment by sourcing the `setupvars.sh` script from the toolkit installation directory.
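A sketch of that setup step, assuming the package was unpacked to `/opt/intel/openvino_2021` as in the build commands above:

```sh
# Set up OpenVINO environment variables for the current shell session.
# Add this line to ~/.bashrc to run it automatically on every login.
source /opt/intel/openvino_2021/bin/setupvars.sh
```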
Have the directory path for the following:
Use these resources to learn more about the OpenVINO™ toolkit: