Class InferenceEngine::Core¶
-
class Core¶
This class represents the Inference Engine Core entity.
It can throw exceptions safely to the application, where they can be properly handled.
Public Functions
-
explicit Core(const std::string &xmlConfigFile = {})¶
Constructs an OpenVINO Core instance with devices and their plugin descriptions.
There are two ways to configure device plugins:
(default) Use an XML configuration file in the case of a dynamic libraries build;
Use a strictly defined configuration in the case of a static libraries build.
- Parameters
xmlConfigFile – Path to the .xml file with plugins to load from. If the XML configuration file is not specified, default OpenVINO Runtime plugins are loaded from:
(dynamic build) the default plugins.xml file located in the same folder as the OpenVINO runtime shared library;
(static build) the statically defined configuration. In this case the path to the .xml file is ignored.
-
std::map<std::string, Version> GetVersions(const std::string &deviceName) const¶
Returns plugin version information.
- Parameters
deviceName – Device name to identify plugin
- Returns
A map of device names to plugin version information, per the return type.
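A short query sketch, assuming a CPU device is available:

#include <inference_engine.hpp>
#include <iostream>

int main() {
    InferenceEngine::Core core;
    // Query the version of the plugin serving the CPU device.
    std::map<std::string, InferenceEngine::Version> versions = core.GetVersions("CPU");
    for (const auto& entry : versions) {
        std::cout << entry.first << " build: " << entry.second.buildNumber << std::endl;
    }
    return 0;
}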
-
CNNNetwork ReadNetwork(const std::wstring &modelPath, const std::wstring &binPath = {}) const¶
Reads models from IR and ONNX formats.
- Parameters
modelPath – path to a model
binPath – path to a data file. For IR format (*.bin):
if the path is empty, will try to read a .bin file with the same name as the .xml one;
if a .bin file with the same name is not found, will load the IR without weights.
For ONNX format (*.onnx): the binPath parameter is not used.
- Returns
CNNNetwork
-
CNNNetwork ReadNetwork(const std::string &modelPath, const std::string &binPath = {}) const¶
Reads models from IR and ONNX formats.
- Parameters
modelPath – path to a model
binPath – path to a data file. For IR format (*.bin):
if the path is empty, will try to read a .bin file with the same name as the .xml one;
if a .bin file with the same name is not found, will load the IR without weights.
For ONNX format (*.onnx): the binPath parameter is not used.
- Returns
CNNNetwork
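A sketch of the file-based overloads; model.xml and model.onnx are hypothetical paths:

#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;
    // IR: with an empty binPath, model.bin next to model.xml is read automatically.
    InferenceEngine::CNNNetwork ir_network = core.ReadNetwork("model.xml");
    // ONNX: the binPath parameter is not used.
    InferenceEngine::CNNNetwork onnx_network = core.ReadNetwork("model.onnx");
    return 0;
}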
-
CNNNetwork ReadNetwork(const std::string &model, const Blob::CPtr &weights) const¶
Reads models from IR and ONNX formats.
Note
The created InferenceEngine::CNNNetwork object shares the weights with the weights object, so do not create weights on temporary data that can later be freed, since the network constant data would then point to invalid memory.
- Parameters
model – string with a model in IR or ONNX format
weights – shared pointer to a constant blob with weights. Reading ONNX models doesn't support loading weights from data blobs. If you are using an ONNX model with external data files, please use the InferenceEngine::Core::ReadNetwork(const std::string& modelPath, const std::string& binPath) const function overload, which takes a filesystem path to the model. For the ONNX case, the second parameter should contain an empty blob.
- Returns
CNNNetwork
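A sketch of reading an ONNX model from memory, following the note above that the weights blob must be empty for ONNX; model.onnx is a hypothetical path:

#include <inference_engine.hpp>
#include <fstream>
#include <sstream>

int main() {
    InferenceEngine::Core core;
    // Load the model file contents into a string.
    std::ifstream file("model.onnx", std::ios::binary);
    std::stringstream buffer;
    buffer << file.rdbuf();
    // For ONNX, pass an empty weights blob.
    InferenceEngine::CNNNetwork network =
        core.ReadNetwork(buffer.str(), InferenceEngine::Blob::CPtr{});
    return 0;
}

For the IR case, a non-empty weights blob must stay alive for the lifetime of the network, as the note above warns.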
-
ExecutableNetwork LoadNetwork(const CNNNetwork &network, const std::map<std::string, std::string> &config = {})¶
Creates an executable network from a network object and uses the AUTO plugin as the default device to load the executable network.
Users can create as many networks as they need and use them simultaneously (up to the limitations of the hardware resources).
- Parameters
network – CNNNetwork object acquired from Core::ReadNetwork
config – Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
- Returns
An executable network reference
-
ExecutableNetwork LoadNetwork(const CNNNetwork &network, const std::string &deviceName, const std::map<std::string, std::string> &config = {})¶
Creates an executable network from a network object.
Users can create as many networks as they need and use them simultaneously (up to the limitations of the hardware resources).
- Parameters
network – CNNNetwork object acquired from Core::ReadNetwork
deviceName – Name of device to load network to
config – Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
- Returns
An executable network reference
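A sketch covering this overload and the device-less overload above, which defaults to the AUTO plugin; model.xml is a hypothetical path:

#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");
    // Explicit device selection.
    InferenceEngine::ExecutableNetwork on_cpu = core.LoadNetwork(network, "CPU");
    // No device name: the AUTO plugin picks a device.
    InferenceEngine::ExecutableNetwork on_auto = core.LoadNetwork(network);
    return 0;
}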
-
ExecutableNetwork LoadNetwork(const std::string &modelPath, const std::map<std::string, std::string> &config = {})¶
Reads a model and creates an executable network from an IR or ONNX file, using the AUTO plugin as the default device to load the executable network.
This can be more efficient than the ReadNetwork + LoadNetwork(CNNNetwork) flow, especially when caching is enabled and a cached model is available.
- Parameters
modelPath – path to model
config – Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
- Returns
An executable network reference
-
ExecutableNetwork LoadNetwork(const std::string &modelPath, const std::string &deviceName, const std::map<std::string, std::string> &config = {})¶
Reads a model and creates an executable network from an IR or ONNX file.
This can be more efficient than the ReadNetwork + LoadNetwork(CNNNetwork) flow, especially when caching is enabled and a cached model is available.
- Parameters
modelPath – path to model
deviceName – Name of device to load network to
config – Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
- Returns
An executable network reference
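A sketch of the single-call flow with model caching enabled; CACHE_DIR is the common caching config key, and the paths are hypothetical:

#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;
    // Enable model caching so a later run can load the compiled blob directly.
    core.SetConfig({{"CACHE_DIR", "model_cache"}});
    // Read and compile in one call; with a warm cache, re-reading and
    // re-compiling the model can be skipped.
    InferenceEngine::ExecutableNetwork exec = core.LoadNetwork("model.xml", "CPU");
    return 0;
}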
-
void AddExtension(const IExtensionPtr &extension)¶
Registers an extension.
- Parameters
extension – Pointer to an already loaded extension
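A sketch assuming a custom-operation extension built as a shared library (libcustom_ops.so is a hypothetical name), loaded via the InferenceEngine::Extension wrapper:

#include <inference_engine.hpp>
#include <ie_extension.h>
#include <memory>

int main() {
    InferenceEngine::Core core;
    // Wrap the extension library and register it with the Core.
    auto extension = std::make_shared<InferenceEngine::Extension>(std::string{"libcustom_ops.so"});
    core.AddExtension(extension);
    return 0;
}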
-
ExecutableNetwork LoadNetwork(const CNNNetwork &network, RemoteContext::Ptr context, const std::map<std::string, std::string> &config = {})¶
Creates an executable network from a network object within a specified remote context.
- Parameters
network – CNNNetwork object acquired from Core::ReadNetwork
context – Pointer to RemoteContext object
config – Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
- Returns
An executable network object
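A sketch that obtains the plugin-supplied default context via GetDefaultContext (documented below) and compiles a network within it; the GPU device and model.xml path are assumptions:

#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");
    // Default remote context of the GPU plugin.
    InferenceEngine::RemoteContext::Ptr context = core.GetDefaultContext("GPU");
    // Compile the network for execution within that context.
    InferenceEngine::ExecutableNetwork exec = core.LoadNetwork(network, context);
    return 0;
}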
-
void AddExtension(IExtensionPtr extension, const std::string &deviceName)¶
Registers an extension for the specified plugin.
- Parameters
extension – Pointer to an already loaded extension
deviceName – Device name identifying the plugin to add an executable extension to
-
ExecutableNetwork ImportNetwork(const std::string &modelFileName, const std::string &deviceName, const std::map<std::string, std::string> &config = {})¶
Creates an executable network from a previously exported network.
- Parameters
modelFileName – Path to the location of the exported file
deviceName – Name of the device to load the executable network on
config – Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
- Returns
An executable network reference
-
ExecutableNetwork ImportNetwork(std::istream &networkModel, const std::string &deviceName, const std::map<std::string, std::string> &config = {})¶
Creates an executable network from a previously exported network.
- Parameters
networkModel – network model stream
deviceName – Name of the device to load the executable network on
config – Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
- Returns
An executable network reference
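A round-trip sketch using ExecutableNetwork::Export; not every plugin supports export/import, so this is illustrative, with model.xml and CPU as assumptions:

#include <inference_engine.hpp>
#include <sstream>

int main() {
    InferenceEngine::Core core;
    InferenceEngine::ExecutableNetwork original =
        core.LoadNetwork(core.ReadNetwork("model.xml"), "CPU");
    // Export the compiled network to a stream...
    std::stringstream blob;
    original.Export(blob);
    // ...and re-create it later without recompiling the model.
    InferenceEngine::ExecutableNetwork imported = core.ImportNetwork(blob, "CPU");
    return 0;
}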
-
ExecutableNetwork ImportNetwork(std::istream &networkModel)¶
Creates an executable network from a previously exported network.
- Deprecated:
Use Core::ImportNetwork with explicit device name
- Parameters
networkModel – network model stream
- Returns
An executable network reference
-
ExecutableNetwork ImportNetwork(std::istream &networkModel, const RemoteContext::Ptr &context, const std::map<std::string, std::string> &config = {})¶
Creates an executable network from a previously exported network within a specified remote context.
- Parameters
networkModel – Network model stream
context – Pointer to RemoteContext object
config – Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
- Returns
An executable network reference
-
QueryNetworkResult QueryNetwork(const CNNNetwork &network, const std::string &deviceName, const std::map<std::string, std::string> &config = {}) const¶
Queries a device on whether it supports the specified network with the specified configuration.
- Parameters
network – Network object to query
deviceName – A name of a device to query
config – Optional map of pairs: (config parameter name, config parameter value)
- Returns
An object containing a map of pairs a layer name -> a device name supporting this layer.
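A sketch that prints the layer-to-device map from the result's supportedLayersMap member; the paths and device name are assumptions:

#include <inference_engine.hpp>
#include <iostream>

int main() {
    InferenceEngine::Core core;
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");
    InferenceEngine::QueryNetworkResult result = core.QueryNetwork(network, "CPU");
    // Each supported layer name maps to the device that can execute it.
    for (const auto& layer : result.supportedLayersMap) {
        std::cout << layer.first << " -> " << layer.second << std::endl;
    }
    return 0;
}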
-
void SetConfig(const std::map<std::string, std::string> &config, const std::string &deviceName = {})¶
Sets a configuration for a device; acceptable keys can be found in ie_plugin_config.hpp.
- Parameters
deviceName – An optional name of a device. If device name is not specified, the config is set for all the registered devices.
config – Map of pairs: (config parameter name, config parameter value)
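A sketch using the PERF_COUNT key from ie_plugin_config.hpp; the device name is an assumption:

#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;
    // Enable performance counters for the CPU device only.
    core.SetConfig({{"PERF_COUNT", "YES"}}, "CPU");
    // Without a device name, the config applies to all registered devices.
    core.SetConfig({{"PERF_COUNT", "NO"}});
    return 0;
}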
-
Parameter GetConfig(const std::string &deviceName, const std::string &name) const¶
Gets a configuration dedicated to device behaviour.
The method is intended to extract information that can be set via the SetConfig method.
- Parameters
deviceName – A name of a device to get a configuration value for.
name – A config key.
- Returns
Value of config corresponding to config key.
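A sketch reading back a previously set value (PERF_COUNT as above); Parameter::as<T>() converts the result:

#include <inference_engine.hpp>
#include <iostream>

int main() {
    InferenceEngine::Core core;
    core.SetConfig({{"PERF_COUNT", "YES"}}, "CPU");
    InferenceEngine::Parameter value = core.GetConfig("CPU", "PERF_COUNT");
    std::cout << value.as<std::string>() << std::endl;  // prints YES
    return 0;
}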
-
Parameter GetMetric(const std::string &deviceName, const std::string &name, const ParamMap &options = {}) const¶
Gets general runtime metric for dedicated hardware.
The method is needed to request common device properties, which are executable-network agnostic, such as the device name, temperature, or other device-specific values.
- Parameters
deviceName – A name of a device to get a metric value for.
name – A metric name to request.
options – Optional parameters to get a metric value.
- Returns
Metric value corresponding to metric key.
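A sketch requesting FULL_DEVICE_NAME, one of the common metric keys; the CPU device name is an assumption:

#include <inference_engine.hpp>
#include <iostream>

int main() {
    InferenceEngine::Core core;
    // A network-agnostic device property.
    std::string name = core.GetMetric("CPU", "FULL_DEVICE_NAME").as<std::string>();
    std::cout << name << std::endl;
    return 0;
}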
-
std::vector<std::string> GetAvailableDevices() const¶
Returns devices available for neural networks inference.
- Returns
A vector of devices. The devices are returned as { CPU, GPU.0, GPU.1, GNA }. If there is more than one device of a specific type, they are enumerated with the .# suffix.
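A minimal enumeration sketch:

#include <inference_engine.hpp>
#include <iostream>

int main() {
    InferenceEngine::Core core;
    for (const std::string& device : core.GetAvailableDevices()) {
        std::cout << device << std::endl;  // e.g. CPU, GPU.0, GPU.1
    }
    return 0;
}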
-
void RegisterPlugin(const std::string &plugin, const std::string &deviceName)¶
Registers a new device and a plugin that implements this device inside Inference Engine.
- Parameters
plugin – Path (absolute or relative) or name of a plugin. Depending on the platform, plugin is wrapped with a shared library suffix and prefix to identify the full library name.
deviceName – A device name to register the plugin for
-
void UnregisterPlugin(const std::string &deviceName)¶
Unloads a previously loaded plugin with the specified name from Inference Engine. The method is needed to remove the plugin instance and free its resources. If the plugin for the specified device has not been created before, the method throws an exception.
- Parameters
deviceName – Device name identifying plugin to remove from Inference Engine
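A register/unregister sketch; the plugin library name and device name are hypothetical, and a plugin instance must have been created (e.g., by first use) before UnregisterPlugin succeeds:

#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;
    // Map a plugin library to a custom device name (hypothetical names).
    core.RegisterPlugin("my_plugin", "MY_DEVICE");
    // First use of the device creates the plugin instance.
    core.GetVersions("MY_DEVICE");
    // Remove the plugin instance and free its resources.
    core.UnregisterPlugin("MY_DEVICE");
    return 0;
}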
-
void RegisterPlugins(const std::string &xmlConfigFile)¶
Registers plugins with an Inference Engine Core instance using an XML configuration file with a plugins description.
The XML file has the following structure:
<ie>
    <plugins>
        <plugin name="" location="">
            <extensions>
                <extension location=""/>
            </extensions>
            <properties>
                <property key="" value=""/>
            </properties>
        </plugin>
    </plugins>
</ie>
name identifies the name of a device enabled by the plugin.
location specifies the absolute path to a dynamic library with the plugin. The path can also be relative to the Inference Engine shared library, which allows a common config to be used across systems with different layouts.
Properties are set to the plugin via the SetConfig method.
Extensions are set to the plugin via the AddExtension method.
- Parameters
xmlConfigFile – A path to .xml file with plugins to register.
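A sketch with a hypothetical configuration file; custom_plugins.xml, MY_DEVICE, and libmy_plugin.so are illustrative names:

// custom_plugins.xml:
// <ie>
//     <plugins>
//         <plugin name="MY_DEVICE" location="libmy_plugin.so"/>
//     </plugins>
// </ie>

#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;
    core.RegisterPlugins("custom_plugins.xml");
    return 0;
}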
-
RemoteContext::Ptr CreateContext(const std::string &deviceName, const ParamMap ¶ms)¶
Creates a new shared context object on the specified accelerator device, using the specified plugin-specific low-level device API parameters (device handle, pointer, etc.).
- Parameters
deviceName – Name of a device to create new shared context on.
params – Map of device-specific shared context parameters.
- Returns
A shared pointer to a created remote context.
-
RemoteContext::Ptr GetDefaultContext(const std::string &deviceName)¶
Gets a pointer to the default (plugin-supplied) shared context object for the specified accelerator device.
- Parameters
deviceName – A name of a device to get the default shared context from.
- Returns
A shared pointer to a default remote context.