This class represents the Inference Engine Core entity.
#include <ie_core.hpp>
Public Member Functions

Core(const std::string &xmlConfigFile = {})
    Constructs an Inference Engine Core instance using an XML configuration file with plugins description.

std::map<std::string, Version> GetVersions(const std::string &deviceName) const
    Returns plugins version information.

CNNNetwork ReadNetwork(const std::string &modelPath, const std::string &binPath = {}) const
    Reads models from IR and ONNX formats.

CNNNetwork ReadNetwork(const std::string &model, const Blob::CPtr &weights) const
    Reads models from IR and ONNX formats.

ExecutableNetwork LoadNetwork(const CNNNetwork &network, const std::string &deviceName, const std::map<std::string, std::string> &config = {})
    Creates an executable network from a network object.

void AddExtension(const IExtensionPtr &extension)
    Registers an extension.

ExecutableNetwork LoadNetwork(const CNNNetwork &network, RemoteContext::Ptr context, const std::map<std::string, std::string> &config = {})
    Creates an executable network from a network object within a specified remote context.

void AddExtension(IExtensionPtr extension, const std::string &deviceName)
    Registers an extension for the specified plugin.

ExecutableNetwork ImportNetwork(const std::string &modelFileName, const std::string &deviceName, const std::map<std::string, std::string> &config = {})
    Creates an executable network from a previously exported network.

ExecutableNetwork ImportNetwork(std::istream &networkModel, const std::string &deviceName = {}, const std::map<std::string, std::string> &config = {})
    Creates an executable network from a previously exported network.

ExecutableNetwork ImportNetwork(std::istream &networkModel, const RemoteContext::Ptr &context, const std::map<std::string, std::string> &config = {})
    Creates an executable network from a previously exported network within a specified remote context.

QueryNetworkResult QueryNetwork(const CNNNetwork &network, const std::string &deviceName, const std::map<std::string, std::string> &config = {}) const
    Queries a device whether it supports the specified network with the specified configuration.

void SetConfig(const std::map<std::string, std::string> &config, const std::string &deviceName = {})
    Sets configuration for a device; acceptable keys can be found in ie_plugin_config.hpp.

Parameter GetConfig(const std::string &deviceName, const std::string &name) const
    Gets a configuration value dedicated to device behaviour.

Parameter GetMetric(const std::string &deviceName, const std::string &name) const
    Gets a general runtime metric for dedicated hardware.

std::vector<std::string> GetAvailableDevices() const
    Returns devices available for neural network inference.

void RegisterPlugin(const std::string &pluginName, const std::string &deviceName)
    Registers a new device and a plugin which implements this device inside Inference Engine.

void UnregisterPlugin(const std::string &deviceName)
    Unloads a previously loaded plugin with the specified name from Inference Engine.

void RegisterPlugins(const std::string &xmlConfigFile)
    Registers plugins to an Inference Engine Core instance using an XML configuration file with plugins description.

RemoteContext::Ptr CreateContext(const std::string &deviceName, const ParamMap &params)
    Creates a new shared context object on the specified accelerator device using plugin-specific low-level device API parameters (device handle, pointer, etc.).

RemoteContext::Ptr GetDefaultContext(const std::string &deviceName)
    Gets a pointer to the default (plugin-supplied) shared context object for the specified accelerator device.
This class represents the Inference Engine Core entity.
Its methods can safely throw exceptions, which the application is expected to catch and handle.
explicit InferenceEngine::Core::Core(const std::string &xmlConfigFile = {})

Constructs an Inference Engine Core instance using an XML configuration file with plugins description.

See RegisterPlugins for more details.

Parameters:
    xmlConfigFile    A path to the .xml file with plugins to load. If the XML configuration file is not specified, default Inference Engine plugins are loaded from the default plugin.xml file.
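For illustration, a minimal sketch of constructing a Core instance; the custom configuration file name below is hypothetical:

    #include <ie_core.hpp>

    int main() {
        InferenceEngine::Core core;                      // plugins come from the default plugin.xml
        InferenceEngine::Core custom("my_plugins.xml");  // hypothetical custom configuration file
        return 0;
    }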
void InferenceEngine::Core::AddExtension(const IExtensionPtr &extension)

Registers an extension.

Parameters:
    extension    Pointer to an already loaded extension.
void InferenceEngine::Core::AddExtension(IExtensionPtr extension, const std::string &deviceName)

Registers an extension for the specified plugin.

Parameters:
    extension     Pointer to an already loaded extension.
    deviceName    Device name identifying the plugin to add the executable extension to.
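A hedged sketch of registering an extension; the library path is hypothetical, and the Extension helper from ie_extension.h is assumed to match this API version:

    #include <ie_core.hpp>
    #include <ie_extension.h>
    #include <memory>

    int main() {
        InferenceEngine::Core core;
        // Load a custom-layer extension library (path is hypothetical).
        auto extension =
            std::make_shared<InferenceEngine::Extension>("libcustom_extension.so");
        core.AddExtension(extension);         // register for all suitable plugins
        core.AddExtension(extension, "CPU");  // or for the CPU plugin only
        return 0;
    }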
RemoteContext::Ptr InferenceEngine::Core::CreateContext(const std::string &deviceName, const ParamMap &params)

Creates a new shared context object on the specified accelerator device using plugin-specific low-level device API parameters (device handle, pointer, etc.).

Parameters:
    deviceName    Name of the device to create the new shared context on.
    params        Map of device-specific shared context parameters.
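A sketch of creating a shared context. The actual ParamMap keys are plugin-specific (for example, a handle to an existing low-level device context), so the map below is deliberately left schematic:

    #include <ie_core.hpp>

    int main() {
        InferenceEngine::Core core;
        InferenceEngine::ParamMap params = {
            // Plugin-specific keys go here, e.g. an existing low-level device
            // handle; consult the target plugin documentation for valid keys.
        };
        InferenceEngine::RemoteContext::Ptr ctx = core.CreateContext("GPU", params);
        return 0;
    }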
std::vector<std::string> InferenceEngine::Core::GetAvailableDevices() const

Returns devices available for neural network inference.
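For example, listing the devices visible to a Core instance:

    #include <ie_core.hpp>
    #include <iostream>

    int main() {
        InferenceEngine::Core core;
        for (const std::string &device : core.GetAvailableDevices())
            std::cout << device << std::endl;  // e.g. "CPU", "GPU", ...
        return 0;
    }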
Parameter InferenceEngine::Core::GetConfig(const std::string &deviceName, const std::string &name) const

Gets a configuration value dedicated to device behaviour.

The method is targeted to extract information which can be set via the SetConfig method.

Parameters:
    deviceName    A name of the device to get a configuration value for.
    name          A config key; the method returns the value corresponding to this key.
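A sketch of reading a config value back; CPU_THREADS_NUM is one of the keys defined in ie_plugin_config.hpp:

    #include <ie_core.hpp>
    #include <ie_plugin_config.hpp>
    #include <iostream>

    int main() {
        InferenceEngine::Core core;
        // Returns the current (default or previously set) value for this key.
        InferenceEngine::Parameter value =
            core.GetConfig("CPU", CONFIG_KEY(CPU_THREADS_NUM));
        std::cout << value.as<std::string>() << std::endl;
        return 0;
    }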
RemoteContext::Ptr InferenceEngine::Core::GetDefaultContext(const std::string &deviceName)

Gets a pointer to the default (plugin-supplied) shared context object for the specified accelerator device.

Parameters:
    deviceName    A name of the device to get the default shared context from.
Parameter InferenceEngine::Core::GetMetric(const std::string &deviceName, const std::string &name) const

Gets a general runtime metric for dedicated hardware.

The method is needed to request common device properties which are executable-network agnostic, such as the device name, temperature, and other device-specific values.

Parameters:
    deviceName    A name of the device to get a metric value for.
    name          The metric name to request.
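For example, requesting the full device name, one of the standard metrics declared in ie_plugin_config.hpp:

    #include <ie_core.hpp>
    #include <ie_plugin_config.hpp>
    #include <iostream>

    int main() {
        InferenceEngine::Core core;
        std::string fullName =
            core.GetMetric("CPU", METRIC_KEY(FULL_DEVICE_NAME)).as<std::string>();
        std::cout << fullName << std::endl;
        return 0;
    }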
std::map<std::string, Version> InferenceEngine::Core::GetVersions(const std::string &deviceName) const

Returns plugins version information.

Parameters:
    deviceName    Device name to identify the plugin.
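A short sketch printing plugin version information, assuming the Version fields description and buildNumber from this API version:

    #include <ie_core.hpp>
    #include <iostream>

    int main() {
        InferenceEngine::Core core;
        for (const auto &kv : core.GetVersions("CPU"))
            std::cout << kv.first << " : " << kv.second.description
                      << " (build " << kv.second.buildNumber << ")" << std::endl;
        return 0;
    }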
ExecutableNetwork InferenceEngine::Core::ImportNetwork(const std::string &modelFileName, const std::string &deviceName, const std::map<std::string, std::string> &config = {})

Creates an executable network from a previously exported network.

Parameters:
    modelFileName    Path to the location of the exported file.
    deviceName       Name of the device to load the executable network on.
    config           Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation.
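A sketch of the export/import round trip. File names are hypothetical, and Export/ImportNetwork support is plugin-dependent (assumed available on the target device here):

    #include <ie_core.hpp>

    int main() {
        InferenceEngine::Core core;
        // Compile once and export the result (file names are hypothetical).
        auto exec = core.LoadNetwork(core.ReadNetwork("model.xml"), "MYRIAD");
        exec.Export("model.blob");
        // Later, skip compilation by importing the exported file.
        auto imported = core.ImportNetwork("model.blob", "MYRIAD");
        return 0;
    }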
ExecutableNetwork InferenceEngine::Core::ImportNetwork(std::istream &networkModel, const RemoteContext::Ptr &context, const std::map<std::string, std::string> &config = {})

Creates an executable network from a previously exported network within a specified remote context.

Parameters:
    networkModel    Network model stream.
    context         Pointer to a RemoteContext object.
    config          Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation.
ExecutableNetwork InferenceEngine::Core::ImportNetwork(std::istream &networkModel, const std::string &deviceName = {}, const std::map<std::string, std::string> &config = {})

Creates an executable network from a previously exported network.

Parameters:
    networkModel    Network model stream.
    deviceName      Name of the device to load the executable network on.
    config          Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation.
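A sketch using a stream, assuming "model.blob" is a hypothetical file produced earlier by ExecutableNetwork::Export on the same device:

    #include <ie_core.hpp>
    #include <fstream>

    int main() {
        InferenceEngine::Core core;
        // Import support is plugin-dependent; the device here is an assumption.
        std::ifstream blob("model.blob", std::ios::binary);
        auto exec = core.ImportNetwork(blob, "MYRIAD");
        return 0;
    }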
ExecutableNetwork InferenceEngine::Core::LoadNetwork(const CNNNetwork &network, const std::string &deviceName, const std::map<std::string, std::string> &config = {})

Creates an executable network from a network object.

Users can create as many networks as they need and use them simultaneously (up to the limitation of the hardware resources).

Parameters:
    network       CNNNetwork object acquired from Core::ReadNetwork.
    deviceName    Name of the device to load the network to.
    config        Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation.
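The typical read-then-load flow looks as follows; the model file name is hypothetical, and the config entry uses keys from ie_plugin_config.hpp:

    #include <ie_core.hpp>
    #include <ie_plugin_config.hpp>

    int main() {
        InferenceEngine::Core core;
        InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");
        // Load to the CPU plugin; the config enables per-layer performance counters.
        InferenceEngine::ExecutableNetwork exec = core.LoadNetwork(
            network, "CPU", {{CONFIG_KEY(PERF_COUNT), CONFIG_VALUE(YES)}});
        InferenceEngine::InferRequest request = exec.CreateInferRequest();
        (void)request;
        return 0;
    }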
ExecutableNetwork InferenceEngine::Core::LoadNetwork(const CNNNetwork &network, RemoteContext::Ptr context, const std::map<std::string, std::string> &config = {})

Creates an executable network from a network object within a specified remote context.

Parameters:
    network    CNNNetwork object acquired from Core::ReadNetwork.
    context    Pointer to a RemoteContext object.
    config     Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation.
QueryNetworkResult InferenceEngine::Core::QueryNetwork(const CNNNetwork &network, const std::string &deviceName, const std::map<std::string, std::string> &config = {}) const

Queries a device whether it supports the specified network with the specified configuration.

Parameters:
    network       Network object to query.
    deviceName    A name of the device to query.
    config        Optional map of pairs: (config parameter name, config parameter value).
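A sketch of checking layer support before loading; the model file name is hypothetical, and QueryNetworkResult::supportedLayersMap is assumed to map each supported layer name to a device name as in this API version:

    #include <ie_core.hpp>
    #include <iostream>

    int main() {
        InferenceEngine::Core core;
        auto network = core.ReadNetwork("model.xml");  // hypothetical model
        InferenceEngine::QueryNetworkResult res = core.QueryNetwork(network, "CPU");
        for (const auto &kv : res.supportedLayersMap)
            std::cout << kv.first << " -> " << kv.second << std::endl;
        return 0;
    }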
CNNNetwork InferenceEngine::Core::ReadNetwork(const std::string &model, const Blob::CPtr &weights) const

Reads models from IR and ONNX formats.

Parameters:
    model      A string with the model in IR or ONNX format.
    weights    Shared pointer to a constant blob with weights. Reading ONNX models doesn't support loading weights from data blobs; for the ONNX case, the second parameter should contain an empty blob.
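A sketch of the in-memory path for an ONNX model; the helper below is hypothetical, and the weights blob is left empty as the parameter description above requires:

    #include <ie_core.hpp>
    #include <string>

    // Hypothetical helper: replace with code that returns real ONNX model bytes.
    std::string LoadOnnxBytes() { return {}; }

    int main() {
        InferenceEngine::Core core;
        // For ONNX the weights stay inside the model string, so the blob is empty.
        InferenceEngine::CNNNetwork network =
            core.ReadNetwork(LoadOnnxBytes(), InferenceEngine::Blob::CPtr());
        return 0;
    }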
CNNNetwork InferenceEngine::Core::ReadNetwork(const std::string &modelPath, const std::string &binPath = {}) const

Reads models from IR and ONNX formats.

Parameters:
    modelPath    Path to the model.
    binPath      Path to the data file. For IR format (*.bin): if the path is empty, the method tries to read a .bin file with the same name as the .xml file; if such a .bin file is not found, the IR is loaded without weights. For ONNX format, this parameter is not used.
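For example (all paths are hypothetical):

    #include <ie_core.hpp>

    int main() {
        InferenceEngine::Core core;
        auto net1 = core.ReadNetwork("model.xml");                 // model.bin is searched next to the .xml
        auto net2 = core.ReadNetwork("model.xml", "weights.bin");  // explicit data file
        auto net3 = core.ReadNetwork("model.onnx");                // ONNX: no separate data file
        return 0;
    }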
void InferenceEngine::Core::RegisterPlugin(const std::string &pluginName, const std::string &deviceName)

Registers a new device and a plugin which implements this device inside Inference Engine.

Parameters:
    pluginName    A name of the plugin. Depending on the platform, pluginName is wrapped with a shared library suffix and prefix to identify the full library name.
    deviceName    A device name to register the plugin for. If the device name is not specified, it is taken from the plugin itself.
void InferenceEngine::Core::RegisterPlugins(const std::string &xmlConfigFile)

Registers plugins to an Inference Engine Core instance using an XML configuration file with plugins description.

The XML file has the following structure:

    <ie>
        <plugins>
            <plugin name="" location="">
                <extensions>
                    <extension location=""/>
                </extensions>
                <properties>
                    <property key="" value=""/>
                </properties>
            </plugin>
        </plugins>
    </ie>

- name identifies the name of a device enabled by the plugin.
- location specifies the absolute path to the dynamic library with the plugin. A path can also be relative to the Inference Engine shared library, which allows having a common config for different systems with different configurations.
- Properties are set to the plugin via the SetConfig method.
- Extensions are set to the plugin via the AddExtension method.

Parameters:
    xmlConfigFile    A path to the .xml file with plugins to register.
void InferenceEngine::Core::SetConfig(const std::map<std::string, std::string> &config, const std::string &deviceName = {})

Sets configuration for a device; acceptable keys can be found in ie_plugin_config.hpp.

Parameters:
    config        Map of pairs: (config parameter name, config parameter value).
    deviceName    An optional name of a device. If the device name is not specified, the config is set for all registered devices.
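For example, enabling per-layer performance counters for the CPU plugin only, using keys from ie_plugin_config.hpp:

    #include <ie_core.hpp>
    #include <ie_plugin_config.hpp>

    int main() {
        InferenceEngine::Core core;
        core.SetConfig({{CONFIG_KEY(PERF_COUNT), CONFIG_VALUE(YES)}}, "CPU");
        return 0;
    }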
void InferenceEngine::Core::UnregisterPlugin(const std::string &deviceName)

Unloads a previously loaded plugin with the specified name from Inference Engine.

The method is needed to remove a plugin instance and free its resources. If a plugin for the specified device has not been created before, the method throws an exception.

Parameters:
    deviceName    Device name identifying the plugin to remove from Inference Engine.