InferenceEngine::Core Class Reference

This class represents the Inference Engine Core entity.

#include <ie_core.hpp>

Public Member Functions

 Core (const std::string &xmlConfigFile={})
 Constructs an Inference Engine Core instance using an XML configuration file with plugins description.

std::map< std::string, Version > GetVersions (const std::string &deviceName) const
 Returns plugins version information.

CNNNetwork ReadNetwork (const std::string &modelPath, const std::string &binPath={}) const
 Reads models from IR and ONNX formats.

CNNNetwork ReadNetwork (const std::string &model, const Blob::CPtr &weights) const
 Reads models from IR and ONNX formats.

ExecutableNetwork LoadNetwork (const CNNNetwork &network, const std::string &deviceName, const std::map< std::string, std::string > &config={})
 Creates an executable network from a network object.

void AddExtension (const IExtensionPtr &extension)
 Registers an extension.

ExecutableNetwork LoadNetwork (const CNNNetwork &network, RemoteContext::Ptr context, const std::map< std::string, std::string > &config={})
 Creates an executable network from a network object within a specified remote context.

void AddExtension (IExtensionPtr extension, const std::string &deviceName)
 Registers an extension for the specified plugin.

ExecutableNetwork ImportNetwork (const std::string &modelFileName, const std::string &deviceName, const std::map< std::string, std::string > &config={})
 Creates an executable network from a previously exported network.

ExecutableNetwork ImportNetwork (std::istream &networkModel, const std::string &deviceName={}, const std::map< std::string, std::string > &config={})
 Creates an executable network from a previously exported network.

ExecutableNetwork ImportNetwork (std::istream &networkModel, const RemoteContext::Ptr &context, const std::map< std::string, std::string > &config={})
 Creates an executable network from a previously exported network within a specified remote context.

QueryNetworkResult QueryNetwork (const CNNNetwork &network, const std::string &deviceName, const std::map< std::string, std::string > &config={}) const
 Queries a device whether it supports the specified network with the specified configuration.

void SetConfig (const std::map< std::string, std::string > &config, const std::string &deviceName={})
 Sets configuration for a device; acceptable keys can be found in ie_plugin_config.hpp.

Parameter GetConfig (const std::string &deviceName, const std::string &name) const
 Gets configuration dedicated to device behaviour.

Parameter GetMetric (const std::string &deviceName, const std::string &name) const
 Gets a general runtime metric for dedicated hardware.

std::vector< std::string > GetAvailableDevices () const
 Returns devices available for neural network inference.

void RegisterPlugin (const std::string &pluginName, const std::string &deviceName)
 Registers a new device and the plugin which implements this device inside the Inference Engine.

void UnregisterPlugin (const std::string &deviceName)
 Unloads a previously loaded plugin with the specified name from the Inference Engine, removing the plugin instance and freeing its resources. Throws an exception if the plugin for the specified device has not been created before.

void RegisterPlugins (const std::string &xmlConfigFile)
 Registers plugins to an Inference Engine Core instance using an XML configuration file with plugins description.

RemoteContext::Ptr CreateContext (const std::string &deviceName, const ParamMap &params)
 Creates a new shared context object on the specified accelerator device using plugin-specific low-level device API parameters (device handle, pointer, etc.).

RemoteContext::Ptr GetDefaultContext (const std::string &deviceName)
 Gets a pointer to the default (plugin-supplied) shared context object for the specified accelerator device.

Detailed Description

This class represents the Inference Engine Core entity.

It can safely throw exceptions to the application, where they can be properly handled.
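
For illustration, a minimal usage sketch (the model path and device name are placeholders): a Core is constructed, a model is read and compiled for a device, and exceptions are handled at the application level.

    #include <inference_engine.hpp>
    #include <iostream>

    int main() {
        try {
            InferenceEngine::Core core;
            // "model.xml" is an illustrative path to an IR model
            InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");
            InferenceEngine::ExecutableNetwork executable = core.LoadNetwork(network, "CPU");
        } catch (const std::exception &ex) {
            // Core methods throw exceptions that the application should handle
            std::cerr << "Inference Engine error: " << ex.what() << std::endl;
            return 1;
        }
        return 0;
    }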

Constructor & Destructor Documentation

◆ Core()

InferenceEngine::Core::Core ( const std::string &  xmlConfigFile = {})
explicit

Constructs an Inference Engine Core instance using an XML configuration file with plugins description.

See RegisterPlugins for more details.

Parameters
xmlConfigFile - A path to the .xml file with plugins to load from. If the XML configuration file is not specified, then default Inference Engine plugins are loaded from the default plugin.xml file.
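
A short sketch of both construction modes; the custom XML path is illustrative:

    // Default construction: plugins are discovered via the default XML manifest
    InferenceEngine::Core core;

    // Construction with a custom plugin description file (illustrative path)
    InferenceEngine::Core custom_core("my_plugins.xml");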

Member Function Documentation

◆ AddExtension() [1/2]

void InferenceEngine::Core::AddExtension ( const IExtensionPtr &  extension)

Registers an extension.

Parameters
extension - Pointer to an already loaded extension

◆ AddExtension() [2/2]

void InferenceEngine::Core::AddExtension ( IExtensionPtr  extension,
const std::string &  deviceName 
)

Registers an extension for the specified plugin.

Parameters
extension - Pointer to an already loaded extension
deviceName - Device name to identify the plugin to add the executable extension to
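
A sketch of both overloads, assuming a custom extension library built against the extension API (the library path is illustrative):

    #include <ie_extension.h>

    // InferenceEngine::Extension loads an extension library from the given path
    auto extension = std::make_shared<InferenceEngine::Extension>("libcustom_extension.so");
    core.AddExtension(extension);          // register for all suitable plugins
    core.AddExtension(extension, "CPU");   // or register for one plugin only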

◆ CreateContext()

RemoteContext::Ptr InferenceEngine::Core::CreateContext ( const std::string &  deviceName,
const ParamMap &  params 
)

Creates a new shared context object on the specified accelerator device using plugin-specific low-level device API parameters (device handle, pointer, etc.).

Parameters
deviceName - Name of a device to create the new shared context on.
params - Map of device-specific shared context parameters.
Returns
A shared pointer to the created remote context.
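
The exact ParamMap contents are defined by the target plugin, so the key below is a hypothetical placeholder rather than a real configuration key; consult the plugin's remote context documentation for the actual keys. A sketch:

    // A low-level handle obtained from the device API (e.g. an OpenCL context)
    void *native_handle = nullptr;  // placeholder

    // "HYPOTHETICAL_HANDLE_KEY" is not a real key; each plugin defines its own
    InferenceEngine::ParamMap params = {
        {"HYPOTHETICAL_HANDLE_KEY", InferenceEngine::Parameter(native_handle)}
    };
    InferenceEngine::RemoteContext::Ptr context = core.CreateContext("GPU", params);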

◆ GetAvailableDevices()

std::vector<std::string> InferenceEngine::Core::GetAvailableDevices ( ) const

Returns devices available for neural network inference.

Returns
A vector of devices. The devices are returned as { CPU, FPGA.0, FPGA.1, MYRIAD }. If there is more than one device of a specific type, they are enumerated with the .# suffix.
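
A sketch that lists the devices visible to the runtime:

    for (const std::string &device : core.GetAvailableDevices()) {
        std::cout << device << std::endl;  // e.g. "CPU", "FPGA.0", "MYRIAD"
    }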

◆ GetConfig()

Parameter InferenceEngine::Core::GetConfig ( const std::string &  deviceName,
const std::string &  name 
) const

Gets configuration dedicated to device behaviour.

The method is targeted at extracting information that can be set via the SetConfig method.

Parameters
deviceName - A name of a device to get a configuration value for.
name - A config key for which to return the value.
Returns
Value of the config corresponding to the config key.
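
A sketch, assuming the CPU plugin is available and using the CPU_THREADS_NUM key from ie_plugin_config.hpp:

    #include <ie_plugin_config.hpp>

    // Retrieve the current value of a config key as a Parameter and convert it
    InferenceEngine::Parameter value = core.GetConfig("CPU", CONFIG_KEY(CPU_THREADS_NUM));
    std::string threads_num = value.as<std::string>();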

◆ GetDefaultContext()

RemoteContext::Ptr InferenceEngine::Core::GetDefaultContext ( const std::string &  deviceName)

Gets a pointer to the default (plugin-supplied) shared context object for the specified accelerator device.

Parameters
deviceName - A name of a device to retrieve the default shared context from.
Returns
A shared pointer to the default remote context.
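
A sketch, assuming a device whose plugin supplies remote contexts (the GPU plugin, for instance):

    InferenceEngine::RemoteContext::Ptr context = core.GetDefaultContext("GPU");
    std::cout << context->getDeviceName() << std::endl;  // device the context belongs to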

◆ GetMetric()

Parameter InferenceEngine::Core::GetMetric ( const std::string &  deviceName,
const std::string &  name 
) const

Gets general runtime metric for dedicated hardware.

The method is needed to request common device properties which are executable-network agnostic: the device name, temperature, and other device-specific values.

Parameters
deviceName - A name of a device to get a metric value for.
name - The metric name to request.
Returns
Metric value corresponding to the metric key.
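
A sketch using the FULL_DEVICE_NAME metric defined in ie_plugin_config.hpp:

    #include <ie_plugin_config.hpp>

    std::string full_name =
        core.GetMetric("CPU", METRIC_KEY(FULL_DEVICE_NAME)).as<std::string>();
    std::cout << full_name << std::endl;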

◆ GetVersions()

std::map<std::string, Version> InferenceEngine::Core::GetVersions ( const std::string &  deviceName) const

Returns plugins version information.

Parameters
deviceName - Device name to identify the plugin
Returns
A map of plugin names and their versions
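
A sketch that prints version information; the Version fields used below (description, buildNumber) are assumed from ie_version.hpp:

    std::map<std::string, InferenceEngine::Version> versions = core.GetVersions("CPU");
    for (const auto &item : versions) {
        std::cout << item.first << ": " << item.second.description
                  << " (build " << item.second.buildNumber << ")" << std::endl;
    }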

◆ ImportNetwork() [1/3]

ExecutableNetwork InferenceEngine::Core::ImportNetwork ( const std::string &  modelFileName,
const std::string &  deviceName,
const std::map< std::string, std::string > &  config = {} 
)

Creates an executable network from a previously exported network.

Parameters
deviceName - Name of the device to load the executable network on
modelFileName - Path to the location of the exported file
config - Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
Returns
An executable network reference
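
A round-trip sketch; the file name is illustrative, and export/import must be supported by the target plugin (MYRIAD is used here as an example):

    // Compile once, export the result, and re-import it later without recompiling
    InferenceEngine::ExecutableNetwork executable = core.LoadNetwork(network, "MYRIAD");
    executable.Export("model.blob");
    InferenceEngine::ExecutableNetwork imported = core.ImportNetwork("model.blob", "MYRIAD");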

◆ ImportNetwork() [2/3]

ExecutableNetwork InferenceEngine::Core::ImportNetwork ( std::istream &  networkModel,
const RemoteContext::Ptr &  context,
const std::map< std::string, std::string > &  config = {} 
)

Creates an executable network from a previously exported network within a specified remote context.

Parameters
networkModel - Network model stream
context - Pointer to a RemoteContext object
config - Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
Returns
An executable network reference

◆ ImportNetwork() [3/3]

ExecutableNetwork InferenceEngine::Core::ImportNetwork ( std::istream &  networkModel,
const std::string &  deviceName = {},
const std::map< std::string, std::string > &  config = {} 
)

Creates an executable network from a previously exported network.

Parameters
deviceName - Name of the device to load the executable network on
networkModel - Network model stream
config - Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
Returns
An executable network reference
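
The same round trip can go through a stream, avoiding a temporary file; a sketch (MYRIAD again stands in for a plugin that supports export/import):

    #include <sstream>

    std::stringstream blob_stream;
    executable.Export(blob_stream);  // ExecutableNetwork::Export(std::ostream &)
    InferenceEngine::ExecutableNetwork imported = core.ImportNetwork(blob_stream, "MYRIAD");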

◆ LoadNetwork() [1/2]

ExecutableNetwork InferenceEngine::Core::LoadNetwork ( const CNNNetwork &  network,
const std::string &  deviceName,
const std::map< std::string, std::string > &  config = {} 
)

Creates an executable network from a network object.

Users can create as many networks as they need and use them simultaneously (up to the limitations of the hardware resources).

Parameters
network - CNNNetwork object acquired from Core::ReadNetwork
deviceName - Name of the device to load the network to
config - Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
Returns
An executable network reference
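
A sketch with a per-load configuration (the model path is illustrative; PERF_COUNT comes from ie_plugin_config.hpp):

    #include <ie_plugin_config.hpp>

    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");
    std::map<std::string, std::string> config = {
        {CONFIG_KEY(PERF_COUNT), CONFIG_VALUE(YES)}  // applies to this load only
    };
    InferenceEngine::ExecutableNetwork executable = core.LoadNetwork(network, "CPU", config);
    InferenceEngine::InferRequest request = executable.CreateInferRequest();
    request.Infer();  // synchronous inference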

◆ LoadNetwork() [2/2]

ExecutableNetwork InferenceEngine::Core::LoadNetwork ( const CNNNetwork &  network,
RemoteContext::Ptr  context,
const std::map< std::string, std::string > &  config = {} 
)

Creates an executable network from a network object within a specified remote context.

Parameters
network - CNNNetwork object acquired from Core::ReadNetwork
context - Pointer to a RemoteContext object
config - Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
Returns
An executable network object
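
A sketch, assuming a plugin with remote context support; here the default context is reused, but a context obtained from CreateContext works the same way:

    InferenceEngine::RemoteContext::Ptr context = core.GetDefaultContext("GPU");
    InferenceEngine::ExecutableNetwork executable = core.LoadNetwork(network, context);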

◆ QueryNetwork()

QueryNetworkResult InferenceEngine::Core::QueryNetwork ( const CNNNetwork &  network,
const std::string &  deviceName,
const std::map< std::string, std::string > &  config = {} 
) const

Queries a device whether it supports the specified network with the specified configuration.

Parameters
deviceName - A name of a device to query
network - Network object to query
config - Optional map of pairs: (config parameter name, config parameter value)
Returns
An object containing a map of pairs (layer name -> name of the device supporting this layer).
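
A sketch that inspects which layers the CPU plugin claims to support; supportedLayersMap is the member of QueryNetworkResult holding the result:

    InferenceEngine::QueryNetworkResult result = core.QueryNetwork(network, "CPU");
    for (const auto &item : result.supportedLayersMap) {
        std::cout << item.first << " -> " << item.second << std::endl;  // layer -> device
    }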

◆ ReadNetwork() [1/2]

CNNNetwork InferenceEngine::Core::ReadNetwork ( const std::string &  model,
const Blob::CPtr &  weights 
) const

Reads models from IR and ONNX formats.

Parameters
model - A string with the model in IR or ONNX format
weights - A shared pointer to a constant blob with weights. Reading ONNX models does not support loading weights from data blobs; for ONNX models, the second parameter should contain an empty blob.
Returns
CNNNetwork
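
A sketch reading an ONNX model from memory (the file name is illustrative); since ONNX reading takes no separate weights, an empty blob is passed:

    #include <fstream>
    #include <sstream>

    std::ifstream model_file("model.onnx", std::ios::binary);
    std::stringstream buffer;
    buffer << model_file.rdbuf();  // load the whole model into a string
    InferenceEngine::CNNNetwork network =
        core.ReadNetwork(buffer.str(), InferenceEngine::Blob::CPtr());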

◆ ReadNetwork() [2/2]

CNNNetwork InferenceEngine::Core::ReadNetwork ( const std::string &  modelPath,
const std::string &  binPath = {} 
) const

Reads models from IR and ONNX formats.

Parameters
modelPath - Path to the model
binPath - Path to the data file. For IR format (*.bin):
  • if the path is empty, the reader tries to load a .bin file with the same name as the .xml file;
  • if the .bin file with the same name is not found, the IR is loaded without weights.
ONNX models with data files are not supported.
Returns
CNNNetwork
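
A sketch of both lookup modes (file names are illustrative):

    // Explicit weights path
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml", "model.bin");

    // Empty binPath: "model.bin" is looked up next to "model.xml" automatically
    InferenceEngine::CNNNetwork network2 = core.ReadNetwork("model.xml");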

◆ RegisterPlugin()

void InferenceEngine::Core::RegisterPlugin ( const std::string &  pluginName,
const std::string &  deviceName 
)

Registers a new device and the plugin which implements this device inside the Inference Engine.

Parameters
pluginName - A name of the plugin. Depending on the platform, pluginName is wrapped with a shared library prefix and suffix to identify the full library name
deviceName - A device name to register the plugin for. If the device name is not specified, it is taken from the plugin itself.
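
A sketch with hypothetical plugin and device names (neither exists in a stock distribution):

    // On Linux, "my_plugin" would resolve to libmy_plugin.so; on Windows, my_plugin.dll
    core.RegisterPlugin("my_plugin", "MYDEVICE");
    InferenceEngine::ExecutableNetwork executable = core.LoadNetwork(network, "MYDEVICE");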

◆ RegisterPlugins()

void InferenceEngine::Core::RegisterPlugins ( const std::string &  xmlConfigFile)

Registers plugins to an Inference Engine Core instance using an XML configuration file with plugins description.

The XML file has the following structure:

<ie>
    <plugins>
        <plugin name="" location="">
            <extensions>
                <extension location=""/>
            </extensions>
            <properties>
                <property key="" value=""/>
            </properties>
        </plugin>
    </plugins>
</ie>
  • name identifies the name of the device enabled by the plugin
  • location specifies the absolute path to the dynamic library with the plugin. The path can also be relative to the Inference Engine shared library, which allows having a common configuration for different systems with different setups.
  • Properties are set to the plugin via the SetConfig method.
  • Extensions are set to the plugin via the AddExtension method.
Parameters
xmlConfigFile - A path to the .xml file with plugins to register.
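
A sketch (the configuration file path is illustrative):

    core.RegisterPlugins("custom_plugins.xml");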

◆ SetConfig()

void InferenceEngine::Core::SetConfig ( const std::map< std::string, std::string > &  config,
const std::string &  deviceName = {} 
)

Sets configuration for a device; acceptable keys can be found in ie_plugin_config.hpp.

Parameters
deviceName - An optional name of a device. If the device name is not specified, the config is set for all the registered devices.
config - Map of pairs: (config parameter name, config parameter value)
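
A sketch using the PERF_COUNT key from ie_plugin_config.hpp:

    #include <ie_plugin_config.hpp>

    // Enable performance counters on the CPU plugin only
    core.SetConfig({{CONFIG_KEY(PERF_COUNT), CONFIG_VALUE(YES)}}, "CPU");

    // Without a device name, the setting applies to all registered devices
    core.SetConfig({{CONFIG_KEY(PERF_COUNT), CONFIG_VALUE(NO)}});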

◆ UnregisterPlugin()

void InferenceEngine::Core::UnregisterPlugin ( const std::string &  deviceName)

Unloads a previously loaded plugin with the specified name from the Inference Engine. The method is needed to remove the plugin instance and free its resources. If the plugin for the specified device has not been created before, the method throws an exception.

Parameters
deviceName - Device name identifying the plugin to remove from the Inference Engine

The documentation for this class was generated from the following file:
  • ie_core.hpp