Install OpenVINO™ Runtime on Linux from an Archive File¶
With the OpenVINO™ 2022.2 release, you can download and use archive files to install OpenVINO Runtime. The archive files contain pre-built binaries and library files needed for OpenVINO Runtime, as well as code samples.
Installing OpenVINO Runtime from archive files is recommended for C++ developers. If you are working with Python, the PyPI package has everything needed for Python development and deployment on CPU and GPUs. See the Install OpenVINO from PyPI page for instructions on how to install OpenVINO Runtime for Python using PyPI.
Since the OpenVINO™ 2022.1 release, the following development tools are distributed only via PyPI: Model Optimizer, Post-Training Optimization Tool, Model Downloader and other Open Model Zoo tools, Accuracy Checker, and Annotation Converter.
See the Release Notes for more information on updates in the latest release.
System Requirements¶
Operating systems:
Ubuntu 18.04 long-term support (LTS) x86, 64-bit
Ubuntu 20.04 long-term support (LTS) x86, 64-bit
Red Hat Enterprise Linux 8 x86, 64-bit
Since the OpenVINO™ 2022.1 release, CentOS 7.6, 64-bit is no longer supported.
Optimized for these processors:
6th to 12th generation Intel® Core™ processors and Intel® Xeon® processors
3rd generation Intel® Xeon® Scalable processor (formerly code named Cooper Lake)
Intel® Xeon® Scalable processor (formerly Skylake and Cascade Lake)
Intel Atom® processor with support for Intel® Streaming SIMD Extensions 4.1 (Intel® SSE4.1)
Intel Pentium® processor N4200/5, N3350/5, or N3450/5 with Intel® HD Graphics
Intel® Iris® Xe MAX Graphics
Intel® Neural Compute Stick 2
Intel® Vision Accelerator Design with Intel® Movidius™ VPUs
Processor graphics are not included in all processors. See Product Specifications for information about your processor.
Installing OpenVINO Runtime¶
Step 1: Download and Install the OpenVINO Core Components¶
Open a command prompt terminal window. You can use the keyboard shortcut: Ctrl+Alt+T
Create the /opt/intel folder for OpenVINO by using the following command. If the folder already exists, skip this step.
sudo mkdir /opt/intel
The /opt/intel path is the recommended folder path for administrators or root users. If you prefer to install OpenVINO in regular userspace, the recommended path is /home/<USER>/intel. You may use a different path if desired.
Browse to the current user's Downloads folder:

cd <user_home>/Downloads
Download the OpenVINO Runtime archive file for your system, extract the files, rename the extracted folder and move it to the desired path:
Ubuntu 18.04:

curl -L https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.2/linux/l_openvino_toolkit_ubuntu18_2022.2.0.7713.af16ea1d79a_x86_64.tgz --output openvino_2022.2.0.7713.tgz
tar -xf openvino_2022.2.0.7713.tgz
sudo mv l_openvino_toolkit_ubuntu18_2022.2.0.7713.af16ea1d79a_x86_64 /opt/intel/openvino_2022.2.0.7713

Ubuntu 20.04:

curl -L https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.2/linux/l_openvino_toolkit_ubuntu20_2022.2.0.7713.af16ea1d79a_x86_64.tgz --output openvino_2022.2.0.7713.tgz
tar -xf openvino_2022.2.0.7713.tgz
sudo mv l_openvino_toolkit_ubuntu20_2022.2.0.7713.af16ea1d79a_x86_64 /opt/intel/openvino_2022.2.0.7713

Red Hat Enterprise Linux 8:

curl -L https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.2/linux/l_openvino_toolkit_rhel8_2022.2.0.7713.af16ea1d79a_x86_64.tgz --output openvino_2022.2.0.7713.tgz
tar -xf openvino_2022.2.0.7713.tgz
sudo mv l_openvino_toolkit_rhel8_2022.2.0.7713.af16ea1d79a_x86_64 /opt/intel/openvino_2022.2.0.7713
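The three per-OS commands differ only in the OS tag embedded in the file name. If you script the download, the names can be parameterized; the sketch below only builds the strings (the curl/tar/mv steps themselves are as shown above), assuming the same naming scheme for all three 2022.2 archives:

```shell
# OS_TAG is one of: ubuntu18, ubuntu20, rhel8.
OS_TAG=ubuntu20
VER=2022.2.0.7713
PKG="l_openvino_toolkit_${OS_TAG}_${VER}.af16ea1d79a_x86_64"
URL="https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.2/linux/${PKG}.tgz"
echo "$URL"
```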
For simplicity, it is useful to create a symbolic link. Change into the installation folder and run:

cd /opt/intel
sudo ln -s openvino_2022.2.0.7713 openvino_2022
If you have already installed a previous release of OpenVINO 2022, a symbolic link to the openvino_2022 folder may already exist. Unlink the previous link with sudo unlink openvino_2022, and then re-run the command above.
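The unlink-and-relink step can also be done in one command with ln -sfn, which replaces an existing link in place. A minimal sketch in a scratch directory (the paths here are stand-ins, not your real install; substitute /opt/intel and your versioned folders in practice):

```shell
# Scratch demonstration of repointing the openvino_2022 symlink.
rm -rf /tmp/ov_demo
mkdir -p /tmp/ov_demo/openvino_2022.1.0.643 /tmp/ov_demo/openvino_2022.2.0.7713
cd /tmp/ov_demo
ln -s openvino_2022.1.0.643 openvino_2022     # pretend a link from an older install exists
ln -sfn openvino_2022.2.0.7713 openvino_2022  # -f replaces it; -n acts on the link itself, not its target
readlink openvino_2022                        # prints: openvino_2022.2.0.7713
```

Without -n, ln would descend into the directory the old link points to and create the new link inside it, which is rarely what you want.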
Congratulations, you have finished the installation! The /opt/intel/openvino_2022 folder now contains the core components for OpenVINO. If you used a different installation path, for example /home/<USER>/intel/, OpenVINO is installed in /home/<USER>/intel/openvino_2022. The path to the openvino_2022 directory is also referred to as <INSTALL_DIR> throughout the OpenVINO documentation.
Step 2: Configure the Environment¶
You must update several environment variables before you can compile and run OpenVINO applications. Open a terminal window and run the setupvars.sh script as shown below to temporarily set your environment variables. If your <INSTALL_DIR> is not /opt/intel/openvino_2022, use the correct path instead.

source /opt/intel/openvino_2022/setupvars.sh
If you have more than one OpenVINO version on your machine, you can easily switch versions by sourcing the setupvars.sh of the version you want.
The above command must be re-run every time you start a new terminal session. To set up Linux to run the command automatically every time a new terminal is opened, open ~/.bashrc in your favorite editor and add source /opt/intel/openvino_2022/setupvars.sh after the last line. The next time you open a terminal, you will see [setupvars.sh] OpenVINO™ environment initialized. Changing .bashrc is not recommended when you have multiple OpenVINO versions on your machine and want to switch among them.
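If you do edit ~/.bashrc, a guarded form keeps shell startup from failing if the script is later moved or uninstalled. A sketch, assuming the default install path:

```shell
# Add to ~/.bashrc: source setupvars.sh only if it actually exists.
OV_SETUP="/opt/intel/openvino_2022/setupvars.sh"
if [ -f "$OV_SETUP" ]; then
    source "$OV_SETUP"
fi
```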
The environment variables are set. Continue to the next section if you want to download any additional components.
Step 3 (Optional): Install Additional Components¶
OpenVINO Development Tools is a set of utilities for working with OpenVINO and OpenVINO models. It provides tools like Model Optimizer, Benchmark Tool, Post-Training Optimization Tool, and Open Model Zoo Downloader. If you install OpenVINO Runtime using archive files, OpenVINO Development Tools must be installed separately.
See the Install OpenVINO Development Tools page for step-by-step installation instructions.
OpenCV is necessary to run demos from Open Model Zoo (OMZ). Some OpenVINO samples can also extend their capabilities when compiled with OpenCV as a dependency. To install OpenCV for OpenVINO, see the instructions on GitHub.
Step 4 (Optional): Configure Inference on Non-CPU Devices¶
OpenVINO Runtime has a plugin architecture that enables you to run inference on multiple devices without rewriting your code. Supported devices include integrated GPUs, discrete GPUs, the Intel® Neural Compute Stick 2 (NCS2), Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, and Intel® GNA. See the instructions below to set up OpenVINO on these devices.
To enable the toolkit components to use processor graphics (GPU) on your system, follow the steps in GPU Setup Guide.
To perform inference on Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X VPU, follow the steps in the NCS2 Setup Guide.
To install and configure your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, see the VPU Configuration Guide. After configuration is done, you are ready to run the verification scripts with the HDDL Plugin for your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs.
Note that HDDL and NCS2 cannot run simultaneously on the same machine, so choose only one of them to work with.
To enable the toolkit components to use Intel® Gaussian & Neural Accelerator (GNA) on your system, follow the steps in GNA Setup Guide.
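After device setup, one way to confirm which devices OpenVINO's plugins can actually see is to query available_devices from the Python API. This sketch assumes the Python bindings are available (via setupvars.sh or the openvino PyPI package) and prints a hint otherwise:

```shell
# List the inference devices visible to OpenVINO on this machine.
python3 - <<'EOF'
try:
    from openvino.runtime import Core
    print(Core().available_devices)   # e.g. ['CPU', 'GPU']
except ImportError:
    print("OpenVINO Python bindings not found; source setupvars.sh first")
EOF
```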
Now that you’ve installed OpenVINO Runtime, you’re ready to run your own machine learning applications! Learn more about how to integrate a model in OpenVINO applications by trying out the following tutorials.
Try the Python Quick Start Example to estimate depth in a scene using an OpenVINO monodepth model in a Jupyter Notebook inside your web browser.
Visit the Tutorials page for more Jupyter Notebooks to get you started with OpenVINO.
Try the C++ Quick Start Example for step-by-step instructions on building and running a basic image classification C++ application.
Visit the Samples page for other C++ example applications to get you started with OpenVINO.
Uninstalling the Intel® Distribution of OpenVINO™ Toolkit¶
To uninstall the toolkit, follow the steps on the Uninstalling page.
Additional Resources¶
Converting models for use with OpenVINO™: Model Optimizer User Guide
Writing your own OpenVINO™ applications: OpenVINO™ Runtime User Guide
Sample applications: OpenVINO™ Toolkit Samples Overview
Pre-trained deep learning models: Overview of OpenVINO™ Toolkit Pre-Trained Models
IoT libraries and code samples in the GitHub repository: Intel® IoT Developer Kit