class InferenceEngine::Core

Overview

This class represents the Inference Engine Core entity.

#include <ie_core.hpp>

class Core
{
public:
    // construction

    Core();

    // methods

    std::map<std::string, Version> GetVersions(const std::string& deviceName) const;
    CNNNetwork ReadNetwork(const std::wstring& modelPath, const std::wstring& binPath) const;
    CNNNetwork ReadNetwork(const std::string& modelPath, const std::string& binPath) const;
    CNNNetwork ReadNetwork(const std::string& model, const Blob::CPtr& weights) const;

    ExecutableNetwork LoadNetwork(
        const CNNNetwork& network,
        const std::map<std::string, std::string>& config
        );

    ExecutableNetwork LoadNetwork(
        const CNNNetwork& network,
        const std::string& deviceName,
        const std::map<std::string, std::string>& config
        );

    ExecutableNetwork LoadNetwork(
        const std::string& modelPath,
        const std::map<std::string, std::string>& config
        );

    ExecutableNetwork LoadNetwork(
        const std::string& modelPath,
        const std::string& deviceName,
        const std::map<std::string, std::string>& config
        );

    void AddExtension(const IExtensionPtr& extension);

    ExecutableNetwork LoadNetwork(
        const CNNNetwork& network,
        RemoteContext::Ptr context,
        const std::map<std::string, std::string>& config
        );

    void AddExtension(IExtensionPtr extension, const std::string& deviceName);

    ExecutableNetwork ImportNetwork(
        const std::string& modelFileName,
        const std::string& deviceName,
        const std::map<std::string, std::string>& config
        );

    ExecutableNetwork ImportNetwork(
        std::istream& networkModel,
        const std::string& deviceName,
        const std::map<std::string, std::string>& config
        );

    ExecutableNetwork ImportNetwork(std::istream& networkModel);

    ExecutableNetwork ImportNetwork(
        std::istream& networkModel,
        const RemoteContext::Ptr& context,
        const std::map<std::string, std::string>& config
        );

    QueryNetworkResult QueryNetwork(
        const CNNNetwork& network,
        const std::string& deviceName,
        const std::map<std::string, std::string>& config
        ) const;

    void SetConfig(const std::map<std::string, std::string>& config, const std::string& deviceName);
    Parameter GetConfig(const std::string& deviceName, const std::string& name) const;
    Parameter GetMetric(const std::string& deviceName, const std::string& name, const ParamMap& options) const;
    std::vector<std::string> GetAvailableDevices() const;
    void RegisterPlugin(const std::string& plugin, const std::string& deviceName);
    void UnregisterPlugin(const std::string& deviceName);
    void RegisterPlugins(const std::string& xmlConfigFile);
    RemoteContext::Ptr CreateContext(const std::string& deviceName, const ParamMap& params);
    RemoteContext::Ptr GetDefaultContext(const std::string& deviceName);
};

Detailed Documentation

This class represents the Inference Engine Core entity.

It can throw exceptions safely for the application, where they can be properly handled.

Construction

Core()

Constructs an OpenVINO Core instance with devices and their plugins description.

There are two ways to configure device plugins:

  1. (default) Use XML configuration file in case of dynamic libraries build;

  2. Use strictly defined configuration in case of static libraries build.

Parameters:

xmlConfigFile

Path to the .xml file with plugins to load from. If the XML configuration file is not specified, default OpenVINO Runtime plugins are loaded from:

  1. (dynamic build) default plugins.xml file located in the same folder as OpenVINO runtime shared library;

  2. (static build) statically defined configuration. In this case path to the .xml file is ignored.

Methods

std::map<std::string, Version> GetVersions(const std::string& deviceName) const

Returns plugins version information.

Parameters:

deviceName

Device name to identify plugin

Returns:

A map of device names to plugin version information

CNNNetwork ReadNetwork(const std::wstring& modelPath, const std::wstring& binPath) const

Reads models from IR and ONNX formats.

Parameters:

modelPath

path to model

binPath

path to data file.

For IR format (*.bin):

  • if the path is empty, will try to read a .bin file with the same name as the .xml file;

  • if a .bin file with the same name is not found, will load the IR without weights.

For ONNX format (*.onnx):

  • the binPath parameter is not used.

Returns:

CNNNetwork

CNNNetwork ReadNetwork(const std::string& modelPath, const std::string& binPath) const

Reads models from IR and ONNX formats.

Parameters:

modelPath

path to model

binPath

path to data file.

For IR format (*.bin):

  • if the path is empty, will try to read a .bin file with the same name as the .xml file;

  • if a .bin file with the same name is not found, will load the IR without weights.

For ONNX format (*.onnx):

  • the binPath parameter is not used.

Returns:

CNNNetwork

CNNNetwork ReadNetwork(const std::string& model, const Blob::CPtr& weights) const

Reads models from IR and ONNX formats.

The created InferenceEngine::CNNNetwork object shares the weights with the weights object. So, do not create weights on temporary data which can be freed later, since the network's constant data would then point to invalid memory.

Parameters:

model

string with model in IR or ONNX format

weights

shared pointer to a constant blob with weights.

Reading ONNX models does not support loading weights from data blobs. If you are using an ONNX model with external data files, please use the InferenceEngine::Core::ReadNetwork(const std::string& modelPath, const std::string& binPath) const overload, which takes a filesystem path to the model. For the ONNX case, the second parameter should contain an empty blob.

Returns:

CNNNetwork

ExecutableNetwork LoadNetwork(
    const CNNNetwork& network,
    const std::map<std::string, std::string>& config
    )

Creates an executable network from a network object and uses AUTO plugin as the default device to load executable network.

Users can create as many networks as they need and use them simultaneously (up to the limitation of the hardware resources)

Parameters:

network

CNNNetwork object acquired from Core::ReadNetwork

config

Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation

Returns:

An executable network reference

ExecutableNetwork LoadNetwork(
    const CNNNetwork& network,
    const std::string& deviceName,
    const std::map<std::string, std::string>& config
    )

Creates an executable network from a network object.

Users can create as many networks as they need and use them simultaneously (up to the limitation of the hardware resources)

Parameters:

network

CNNNetwork object acquired from Core::ReadNetwork

deviceName

Name of device to load network to

config

Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation

Returns:

An executable network reference

ExecutableNetwork LoadNetwork(
    const std::string& modelPath,
    const std::map<std::string, std::string>& config
    )

Reads model and creates an executable network from IR or ONNX file and uses AUTO plugin as the default device to load executable network.

This can be more efficient than the ReadNetwork + LoadNetwork(CNNNetwork) flow, especially when caching is enabled and a cached model is available.

Parameters:

modelPath

path to model

config

Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation

Returns:

An executable network reference

ExecutableNetwork LoadNetwork(
    const std::string& modelPath,
    const std::string& deviceName,
    const std::map<std::string, std::string>& config
    )

Reads model and creates an executable network from IR or ONNX file.

This can be more efficient than the ReadNetwork + LoadNetwork(CNNNetwork) flow, especially when caching is enabled and a cached model is available.

Parameters:

modelPath

path to model

deviceName

Name of device to load network to

config

Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation

Returns:

An executable network reference

void AddExtension(const IExtensionPtr& extension)

Registers extension.

Parameters:

extension

Pointer to already loaded extension

ExecutableNetwork LoadNetwork(
    const CNNNetwork& network,
    RemoteContext::Ptr context,
    const std::map<std::string, std::string>& config
    )

Creates an executable network from a network object within a specified remote context.

Parameters:

network

CNNNetwork object acquired from Core::ReadNetwork

context

Pointer to RemoteContext object

config

Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation

Returns:

An executable network object

void AddExtension(IExtensionPtr extension, const std::string& deviceName)

Registers extension for the specified plugin.

Parameters:

extension

Pointer to already loaded extension

deviceName

Device name to identify plugin to add an executable extension

ExecutableNetwork ImportNetwork(
    const std::string& modelFileName,
    const std::string& deviceName,
    const std::map<std::string, std::string>& config
    )

Creates an executable network from a previously exported network.

Parameters:

modelFileName

Path to the location of the exported file

deviceName

Name of a device to load the executable network on

config

Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation

Returns:

An executable network reference

ExecutableNetwork ImportNetwork(
    std::istream& networkModel,
    const std::string& deviceName,
    const std::map<std::string, std::string>& config
    )

Creates an executable network from a previously exported network.

Parameters:

networkModel

network model stream

deviceName

Name of a device to load the executable network on

config

Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation

Returns:

An executable network reference

ExecutableNetwork ImportNetwork(std::istream& networkModel)

Creates an executable network from a previously exported network.

Deprecated: Use Core::ImportNetwork with an explicit device name.

Parameters:

networkModel

network model stream

Returns:

An executable network reference

ExecutableNetwork ImportNetwork(
    std::istream& networkModel,
    const RemoteContext::Ptr& context,
    const std::map<std::string, std::string>& config
    )

Creates an executable network from a previously exported network within a specified remote context.

Parameters:

networkModel

Network model stream

context

Pointer to RemoteContext object

config

Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation

Returns:

An executable network reference

QueryNetworkResult QueryNetwork(
    const CNNNetwork& network,
    const std::string& deviceName,
    const std::map<std::string, std::string>& config
    ) const

Queries a device whether it supports the specified network with the specified configuration.

Parameters:

deviceName

A name of a device to query

network

Network object to query

config

Optional map of pairs: (config parameter name, config parameter value)

Returns:

An object containing a map of pairs: layer name -> device name supporting this layer.

void SetConfig(const std::map<std::string, std::string>& config, const std::string& deviceName)

Sets a configuration for a device; acceptable keys can be found in ie_plugin_config.hpp.

Parameters:

deviceName

An optional name of a device. If device name is not specified, the config is set for all the registered devices.

config

Map of pairs: (config parameter name, config parameter value)

Parameter GetConfig(const std::string& deviceName, const std::string& name) const

Gets configuration dedicated to device behaviour.

The method is targeted to extract information which can be set via SetConfig method.

Parameters:

deviceName

A name of a device to get a configuration value for.

name

Config key.

Returns:

Value of config corresponding to config key.

Parameter GetMetric(const std::string& deviceName, const std::string& name, const ParamMap& options) const

Gets general runtime metric for dedicated hardware.

The method is needed to request common device properties which are executable-network agnostic, such as device name, temperature, and other device-specific values.

Parameters:

deviceName

A name of a device to get a metric value for.

name

Metric name to request.

options

Optional parameters to get a metric value.

Returns:

Metric value corresponding to metric key.

std::vector<std::string> GetAvailableDevices() const

Returns devices available for neural networks inference.

Returns:

A vector of devices. The devices are returned as { CPU, GPU.0, GPU.1, GNA }. If there is more than one device of a specific type, they are enumerated with a .# suffix.

void RegisterPlugin(const std::string& plugin, const std::string& deviceName)

Registers a new device and a plugin which implements this device inside Inference Engine.

Parameters:

plugin

Path (absolute or relative) or name of a plugin. Depending on the platform, the plugin is wrapped with a shared library suffix and prefix to identify the library's full name

deviceName

A device name to register plugin for

void UnregisterPlugin(const std::string& deviceName)

Unloads a previously loaded plugin with the specified name from Inference Engine. The method is needed to remove the plugin instance and free its resources. If a plugin for the specified device has not been created before, the method throws an exception.

Parameters:

deviceName

Device name identifying plugin to remove from Inference Engine

void RegisterPlugins(const std::string& xmlConfigFile)

Registers plugins to the Inference Engine Core instance using an XML configuration file with plugins description.

The XML file has the following structure:

<ie>
    <plugins>
        <plugin name="" location="">
            <extensions>
                <extension location=""/>
            </extensions>
            <properties>
                <property key="" value=""/>
            </properties>
        </plugin>
    </plugins>
</ie>

  • name identifies the name of the device enabled by the plugin

  • location specifies an absolute path to the dynamic library with the plugin. A path can also be relative to the Inference Engine shared library, which allows having a common config for different systems with different configurations.

  • Properties are set to plugin via the SetConfig method.

  • Extensions are set to plugin via the AddExtension method.

Parameters:

xmlConfigFile

A path to .xml file with plugins to register.

RemoteContext::Ptr CreateContext(const std::string& deviceName, const ParamMap& params)

Creates a new shared context object on the specified accelerator device using specified plugin-specific low-level device API parameters (device handle, pointer, etc.).

Parameters:

deviceName

Name of a device to create new shared context on.

params

Map of device-specific shared context parameters.

Returns:

A shared pointer to a created remote context.

RemoteContext::Ptr GetDefaultContext(const std::string& deviceName)

Gets a pointer to the default (plugin-supplied) shared context object for the specified accelerator device.

Parameters:

deviceName

A name of a device to get the default shared context from.

Returns:

A shared pointer to a default remote context.