Install Intel® Distribution of OpenVINO™ Toolkit From a Docker Image

This guide explains how to install OpenVINO™ Runtime by using a pre-built Docker image or by building an image manually.

You can easily get started with the pre-built, published Docker images, which are available at:

Note

The Ubuntu20 and Ubuntu22 Docker images (runtime and development) now include the tokenizers and GenAI CPP modules. The development versions of these images also have the Python modules for these components pre-installed.
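
For illustration, a minimal way to pull one of the published images and open a shell in it is shown below. The image name and tag are examples only; check the registry for the tags that match your OpenVINO version:

```bash
# Pull a published OpenVINO runtime image (example name/tag; see the registry for available tags)
docker pull openvino/ubuntu22_runtime:latest

# Start an interactive shell in the container
docker run -it --rm openvino/ubuntu22_runtime:latest /bin/bash
```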

You can use the Dockerfiles available on GitHub or generate a Dockerfile with your own settings via the DockerHub CI framework, which can generate a Dockerfile and then build, test, and deploy an image with the Intel® Distribution of OpenVINO™ toolkit. You can also reuse the available Dockerfiles, add your own layers, and customize the OpenVINO™ image to your needs. The Docker CI repository includes guides on how to get started with Docker images and how to use OpenVINO™ Toolkit containers with GPU accelerators.
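
As a sketch of such customization, the example below builds a new image on top of a published OpenVINO runtime image by passing a small Dockerfile to docker build via stdin. The base image, tag, extra package, and the "openvino" non-root user name are assumptions; adjust them to the image you actually use:

```bash
# Sketch: add a custom layer on top of a published OpenVINO image.
# The base image/tag, the extra package, and the "openvino" user name are examples only.
docker build -t my-openvino-custom - <<'EOF'
FROM openvino/ubuntu22_runtime:latest
USER root
# Install an additional system package as an example customization
RUN apt-get update && \
    apt-get install -y --no-install-recommends curl && \
    rm -rf /var/lib/apt/lists/*
# Switch back to the image's default non-root user (name assumed here)
USER openvino
EOF
```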

To start using Dockerfiles, install Docker Engine or a compatible container engine on your system:

OpenVINO can be installed under Windows Subsystem for Linux (WSL2).

Also, verify that you have permission to run containers (via sudo or membership in the docker group).
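
A quick sanity check, assuming Docker is the engine in use, is to confirm that the client is installed and that your user can start containers (hello-world is Docker's standard test image):

```bash
# Confirm the Docker client is installed and reachable
docker --version

# Run Docker's standard test container; it prints a confirmation message and exits
docker run --rm hello-world
```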

Note

OpenVINO’s Docker and Bare Metal distributions are identical, so the documentation applies to both.

Note that, starting with OpenVINO 2024.4, Ubuntu Docker images will no longer be provided and will be replaced by Debian-based ones.

Note

An OpenVINO development environment in a Docker container is also available in the notebook repository. It can be deployed in Red Hat OpenShift Data Science (RHODS).

More information about Docker CI for the Intel® Distribution of OpenVINO™ toolkit can be found here.