You must configure the Model Optimizer for the framework that was used to train the model. This section tells you how to configure the Model Optimizer either through scripts or by using a manual process.
You can either configure all supported frameworks at once or configure an individual framework. The scripts delivered with the tool install all required dependencies and provide the fastest and easiest way to configure the Model Optimizer.
To configure all supported frameworks, go to the `<INSTALL_DIR>/deployment_tools/model_optimizer/install_prerequisites` directory and run:
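The all-in-one script is `install_prerequisites.sh` on Linux and macOS and `install_prerequisites.bat` on Windows. A Linux sketch, where `<INSTALL_DIR>` is your OpenVINO installation directory:

```sh
cd <INSTALL_DIR>/deployment_tools/model_optimizer/install_prerequisites
./install_prerequisites.sh
```

On Windows, run `install_prerequisites.bat` from the same folder.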
NOTE: This command installs prerequisites globally. If you want to keep Model Optimizer in a separate sandbox, run the following commands instead:
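A minimal sandboxed sequence on Linux, assuming the `virtualenv` package is installed (the environment name `.env3` and the interpreter path are illustrative):

```sh
# Create an isolated environment that can still see system packages
virtualenv -p /usr/bin/python3.6 .env3 --system-site-packages
source .env3/bin/activate
cd <INSTALL_DIR>/deployment_tools/model_optimizer/install_prerequisites
./install_prerequisites.sh
```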
To configure a specific framework, go to the `<INSTALL_DIR>/deployment_tools/model_optimizer/install_prerequisites` directory and run:
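Each framework has its own script in that folder. For example, for Caffe on Linux:

```sh
cd <INSTALL_DIR>/deployment_tools/model_optimizer/install_prerequisites
./install_prerequisites_caffe.sh
# Other frameworks: install_prerequisites_tf.sh, install_prerequisites_tf2.sh,
# install_prerequisites_mxnet.sh, install_prerequisites_kaldi.sh,
# install_prerequisites_onnx.sh (use the .bat counterparts on Windows)
```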
IMPORTANT (Caffe* only): You do not need to install Caffe itself to create an Intermediate Representation for a Caffe model, unless you use Caffe for custom layer shape inference and do not write Model Optimizer extensions. To learn more about implementing Model Optimizer custom operations and the limitations of using Caffe for shape inference, see Custom Layers in Model Optimizer.
If you prefer, you can manually configure the Model Optimizer for one framework at a time.
* Activate the virtual environment:
* To install dependencies only for Caffe:
* To install dependencies only for TensorFlow 1.x:
* To install dependencies only for TensorFlow 2.x:
* To install dependencies only for MXNet:
* To install dependencies only for Kaldi:
* To install dependencies only for ONNX:
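The manual steps above can be sketched for Linux as follows; each framework's dependencies live in a `requirements_*.txt` file in the `model_optimizer` directory (the virtual environment name `venv` is illustrative):

```sh
cd <INSTALL_DIR>/deployment_tools/model_optimizer
# Optional: create and activate a sandbox first
virtualenv -p python3 venv --system-site-packages && source venv/bin/activate
# Install only the file that matches your framework:
pip3 install -r requirements_caffe.txt  # Caffe
pip3 install -r requirements_tf.txt     # TensorFlow 1.x
pip3 install -r requirements_tf2.txt    # TensorFlow 2.x
pip3 install -r requirements_mxnet.txt  # MXNet
pip3 install -r requirements_kaldi.txt  # Kaldi
pip3 install -r requirements_onnx.txt   # ONNX
```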
These procedures require Python 3.4 or newer.
Model Optimizer uses the protobuf library to load trained Caffe models. By default, the library uses its pure Python* implementation, which is slow. These steps show how to switch to the faster C++ implementation of the protobuf library on Windows OS or Linux OS.
To use the C++ implementation of the protobuf library on Linux, it is enough to set an environment variable:
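The switch is protobuf's standard `PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION` variable:

```sh
# Tell the protobuf Python bindings to use the C++ implementation
export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp
```

Add the line to your shell profile to make the setting persistent across sessions.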
On Windows, pre-built protobuf packages for Python versions 3.4, 3.5, 3.6, and 3.7 are provided with the installation package and can be found in the `<INSTALL_DIR>\deployment_tools\model_optimizer\install_prerequisites` folder. They are not installed by the `install_prerequisites.bat` script because of possible issues with `pip`, so you can install them at your own discretion. Make sure that you install the protobuf version that matches the Python version you use:
* `protobuf-3.6.1-py3.4-win-amd64.egg` for Python 3.4
* `protobuf-3.6.1-py3.5-win-amd64.egg` for Python 3.5
* `protobuf-3.6.1-py3.6-win-amd64.egg` for Python 3.6
* `protobuf-3.6.1-py3.7-win-amd64.egg` for Python 3.7

To install the protobuf package:
1. Go to the `install_prerequisites` folder of the OpenVINO toolkit installation directory.
2. Run the following command to install the protobuf for Python 3.6. If you want to install the protobuf for Python 3.4, 3.5, or 3.7, replace `protobuf-3.6.1-py3.6-win-amd64.egg` with the corresponding file name from the list above.
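Because the package ships as an `.egg`, one way to install it is with `easy_install` (a sketch; run it from the `install_prerequisites` folder, and `easy_install` must be available in your Python environment):

```sh
python -m easy_install protobuf-3.6.1-py3.6-win-amd64.egg
```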
If the Python version you use is lower than 3.4, you need to update it or build the library manually.
NOTE: These steps are optional. If you use Python version 3.4, 3.5, 3.6, or 3.7, you can install the protobuf library using the pre-built packages.
To compile the protobuf library from sources on Windows OS, do the following:

1. Download the protobuf sources and generate a Visual Studio solution that includes the `libprotobuf` and `libprotobuf-lite` projects.
2. Build the `libprotoc`, `protoc`, `libprotobuf`, and `libprotobuf-lite` projects in the Release configuration.
3. Add the directory that contains the built `protoc` binary to the `PATH` environment variable.
4. Go to the `python` directory of the protobuf sources.
5. Change the following `setup.py` options:
   * Replace `libraries = ['protobuf']` with `libraries = ['libprotobuf', 'libprotobuf-lite']`
   * Replace `extra_objects = ['../src/.libs/libprotobuf.a', '../src/.libs/libprotobuf-lite.a']` with `extra_objects = ['../cmake/build/solution/Release/libprotobuf.lib', '../cmake/build/solution/Release/libprotobuf-lite.lib']`
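After editing `setup.py`, you can build and install the package against the C++ runtime. A sketch, assuming the protobuf sources' `python` directory is the current directory (`--cpp_implementation` is the flag protobuf's own `setup.py` provides for this):

```sh
python setup.py build --cpp_implementation
python setup.py install --cpp_implementation
```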