InferenceEngine::ICNNNetwork Class Reference [abstract]

This is the main interface to describe the NN topology. More...

#include <ie_icnn_network.hpp>

Inheritance diagram for InferenceEngine::ICNNNetwork: [diagram not shown]
Collaboration diagram for InferenceEngine::ICNNNetwork: [diagram not shown]

Public Types

using  Ptr = std::shared_ptr< ICNNNetwork >
 
using  InputShapes = std::map< std::string, SizeVector >
  Map of pairs: name of corresponding data and its dimension.
 

Public Member Functions

virtual Precision  getPrecision () const noexcept=0
  Returns the main network operating precision. This may be MIXED if not homogeneous. More...
 
virtual void  getOutputsInfo (OutputsDataMap &out) const noexcept=0
  Gets the network output Data node information. The received info is stored in the given OutputsDataMap object. Works for networks with single and multiple outputs. More...
 
virtual void  getInputsInfo (InputsDataMap &inputs) const noexcept=0
  Gets the network input Data node information. The received info is stored in the given InputsDataMap object. Works for networks with single and multiple inputs. This method must be called to discover the input names, which are later used as keys when filling the map of blobs passed to InferenceEngine::IInferencePlugin::Infer(). More...
 
virtual InputInfo::Ptr  getInput (const std::string &inputName) const noexcept=0
  Returns information on certain input pointed by inputName. More...
 
virtual void  getName (char *pName, size_t len) const noexcept=0
  Gets the network name. The name is stored in the given pName string. More...
 
virtual const std::string &  getName () const noexcept=0
  Returns the network name. More...
 
virtual size_t  layerCount () const noexcept=0
  Returns the number of layers in the network as an integer value. More...
 
virtual DataPtr &  getData (const char *dname) noexcept=0
  Returns a smart pointer reference to a Data node given its name. If no Data node with that name exists, returns a reference to a newly created, default-initialized empty Data pointer with the given name. More...
 
virtual void  addLayer (const CNNLayerPtr &layer) noexcept=0
  Inserts a layer into the network. The user is responsible for connecting it to other data elements. More...
 
virtual StatusCode  addOutput (const std::string &layerName, size_t outputIndex=0, ResponseDesc *resp=nullptr) noexcept=0
  Adds output to the layer. More...
 
virtual StatusCode  getLayerByName (const char *layerName, CNNLayerPtr &out, ResponseDesc *resp) const noexcept=0
  Gets network layer with the given name. More...
 
virtual void  setTargetDevice (TargetDevice device) noexcept=0
  Sets the desired device to perform all work on. Some plug-ins might not support certain target devices and may abort execution with an appropriate error message. More...
 
virtual TargetDevice  getTargetDevice () const noexcept=0
  Gets the target device. If setTargetDevice() was not called before, returns eDefault. More...
 
virtual StatusCode  setBatchSize (const size_t size) noexcept
  Changes the inference batch size. More...
 
virtual StatusCode  setBatchSize (size_t size, ResponseDesc *responseDesc) noexcept=0
  Changes the inference batch size. More...
 
virtual size_t  getBatchSize () const noexcept=0
  Gets the inference batch size. More...
 
virtual StatusCode  reshape (const InputShapes &, ResponseDesc *) noexcept
  Run shape inference with new input shapes for the network. More...
 
virtual StatusCode  AddExtension (const IShapeInferExtensionPtr &, ResponseDesc *) noexcept
  Registers extension within the plugin. More...
 
virtual StatusCode  getStats (ICNNNetworkStats **, ResponseDesc *) const noexcept
 
virtual StatusCode  serialize (const std::string &xmlPath, const std::string &binPath, ResponseDesc *resp) const noexcept=0
  Serialize network to IR and weights files. More...
 

Detailed Description

This is the main interface to describe the NN topology.

Member Function Documentation

§ AddExtension()

virtual StatusCode InferenceEngine::ICNNNetwork::AddExtension ( const IShapeInferExtensionPtr &  extension,
ResponseDesc *  resp 
)
inline virtual noexcept

Registers extension within the plugin.

Parameters
extension Pointer to already loaded reader extension with shape propagation implementations
resp Pointer to the response message that holds a description of an error if any occurred
Returns
Status code of the operation. OK if succeeded

§ addLayer()

virtual void InferenceEngine::ICNNNetwork::addLayer ( const CNNLayerPtr &  layer )
pure virtual noexcept

Inserts a layer into the network. The user is responsible for connecting it to other data elements.

Parameters
layer Const reference to a layer smart pointer

§ addOutput()

virtual StatusCode InferenceEngine::ICNNNetwork::addOutput ( const std::string &  layerName,
size_t  outputIndex = 0,
ResponseDesc *  resp = nullptr 
)
pure virtual noexcept

Adds output to the layer.

Parameters
layerName Name of the layer
outputIndex Index of the output
resp Response message
Returns
Status code of the operation

§ getBatchSize()

virtual size_t InferenceEngine::ICNNNetwork::getBatchSize ( ) const
pure virtual noexcept

Gets the inference batch size.

Returns
The size of batch as a size_t value

§ getData()

virtual DataPtr& InferenceEngine::ICNNNetwork::getData ( const char *  dname )
pure virtual noexcept

Returns a smart pointer reference to a Data node given its name. If no Data node with that name exists, returns a reference to a newly created, default-initialized empty Data pointer with the given name.

Parameters
dname Name of the Data node
Returns
Data node smart pointer

§ getInput()

virtual InputInfo::Ptr InferenceEngine::ICNNNetwork::getInput ( const std::string &  inputName ) const
pure virtual noexcept

Returns information on certain input pointed by inputName.

Parameters
inputName Name of input layer to get info on
Returns
A smart pointer to the input information

§ getInputsInfo()

virtual void InferenceEngine::ICNNNetwork::getInputsInfo ( InputsDataMap &  inputs ) const
pure virtual noexcept

Gets the network input Data node information. The received info is stored in the given InputsDataMap object. Works for networks with single and multiple inputs. This method must be called to discover the input names, which are later used as keys when filling the map of blobs passed to InferenceEngine::IInferencePlugin::Infer().

Parameters
inputs Reference to the InputsDataMap object to be filled
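A minimal sketch of the typical call pattern, assuming `network` is an `ICNNNetwork` reference obtained elsewhere (for example, through a CNNNetwork wrapper):

```cpp
#include <ie_icnn_network.hpp>
#include <iostream>

void listInputs(InferenceEngine::ICNNNetwork &network) {
    // The map is filled by the call; its keys are the input names
    // that are later used as blob-map keys for Infer().
    InferenceEngine::InputsDataMap inputs;
    network.getInputsInfo(inputs);
    for (const auto &item : inputs) {
        std::cout << "input: " << item.first << std::endl;
    }
}
```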

§ getLayerByName()

virtual StatusCode InferenceEngine::ICNNNetwork::getLayerByName ( const char *  layerName,
CNNLayerPtr &  out,
ResponseDesc *  resp 
) const
pure virtual noexcept

Gets network layer with the given name.

Parameters
layerName Given name of the layer
out Pointer to the found CNNLayer object with the given name
resp Pointer to the response message that holds a description of an error if any occurred
Returns
Status code of the operation. OK if succeeded
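A sketch of the StatusCode/ResponseDesc error-handling pattern this method follows; `"conv1"` is a hypothetical layer name:

```cpp
#include <ie_icnn_network.hpp>
#include <iostream>

void findLayer(const InferenceEngine::ICNNNetwork &network) {
    InferenceEngine::CNNLayerPtr layer;
    InferenceEngine::ResponseDesc resp;
    // On failure, resp.msg carries a human-readable error description.
    InferenceEngine::StatusCode sts =
        network.getLayerByName("conv1", layer, &resp);
    if (sts != InferenceEngine::OK) {
        std::cerr << "lookup failed: " << resp.msg << std::endl;
    }
}
```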

§ getName() [1/2]

virtual void InferenceEngine::ICNNNetwork::getName ( char *  pName,
size_t  len 
) const
pure virtual noexcept

Gets the network name. The name is stored in the given pName string.

Parameters
pName Buffer that receives the network name as specified in the IR file; it must point to valid memory before this function is invoked
len Size of the pName buffer in bytes; a longer name is truncated to this size

§ getName() [2/2]

virtual const std::string& InferenceEngine::ICNNNetwork::getName ( ) const
pure virtual noexcept

Returns the network name.

Returns
Network name

§ getOutputsInfo()

virtual void InferenceEngine::ICNNNetwork::getOutputsInfo ( OutputsDataMap &  out ) const
pure virtual noexcept

Gets the network output Data node information. The received info is stored in the given OutputsDataMap object. Works for networks with single and multiple outputs.

Parameters
out Reference to the OutputsDataMap object

§ getPrecision()

virtual Precision InferenceEngine::ICNNNetwork::getPrecision ( ) const
pure virtual noexcept

Returns the main network operating precision. This may be MIXED if not homogeneous.

Returns
A precision type

§ getTargetDevice()

virtual TargetDevice InferenceEngine::ICNNNetwork::getTargetDevice ( ) const
pure virtual noexcept

Gets the target device. If setTargetDevice() was not called before, returns eDefault.

Deprecated:
Deprecated, since the TargetDevice enumeration itself is deprecated
Returns
A TargetDevice instance

§ layerCount()

virtual size_t InferenceEngine::ICNNNetwork::layerCount ( ) const
pure virtual noexcept

Returns the number of layers in the network as an integer value.

Returns
The number of layers as an integer value

§ reshape()

virtual StatusCode InferenceEngine::ICNNNetwork::reshape ( const InputShapes &  inputShapes,
ResponseDesc *  resp 
)
inline virtual noexcept

Run shape inference with new input shapes for the network.

Parameters
inputShapes Map of pairs: name of corresponding data and its new dimensions
resp Pointer to the response message that holds a description of an error if any occurred
Returns
Status code of the operation
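A sketch of running shape inference with a new input shape; the input name `"data"` and the NCHW dimensions are hypothetical and must match the actual model:

```cpp
#include <ie_icnn_network.hpp>
#include <iostream>

void resizeInput(InferenceEngine::ICNNNetwork &network) {
    // InputShapes maps an input name to its new SizeVector dimensions.
    InferenceEngine::ICNNNetwork::InputShapes shapes;
    shapes["data"] = InferenceEngine::SizeVector{1, 3, 300, 300};
    InferenceEngine::ResponseDesc resp;
    if (network.reshape(shapes, &resp) != InferenceEngine::OK) {
        std::cerr << "reshape failed: " << resp.msg << std::endl;
    }
}
```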

§ serialize()

virtual StatusCode InferenceEngine::ICNNNetwork::serialize ( const std::string &  xmlPath,
const std::string &  binPath,
ResponseDesc *  resp 
) const
pure virtual noexcept

Serialize network to IR and weights files.

Parameters
xmlPath Path to output IR file.
binPath Path to output weights file.
resp Pointer to the response message that holds a description of an error if any occurred
Returns
Status code of the operation
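A sketch of writing the network back out as IR; the output file names are illustrative:

```cpp
#include <ie_icnn_network.hpp>
#include <iostream>

void saveNetwork(const InferenceEngine::ICNNNetwork &network) {
    InferenceEngine::ResponseDesc resp;
    // Topology goes to the .xml file, weights to the .bin file.
    InferenceEngine::StatusCode sts =
        network.serialize("model.xml", "model.bin", &resp);
    if (sts != InferenceEngine::OK) {
        std::cerr << "serialize failed: " << resp.msg << std::endl;
    }
}
```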

§ setBatchSize() [1/2]

virtual StatusCode InferenceEngine::ICNNNetwork::setBatchSize ( const size_t  size )
inline virtual noexcept

Changes the inference batch size.

Deprecated:
Use ICNNNetwork::setBatchSize(size_t, ResponseDesc*)

§ setBatchSize() [2/2]

virtual StatusCode InferenceEngine::ICNNNetwork::setBatchSize ( size_t  size,
ResponseDesc *  responseDesc 
)
pure virtual noexcept

Changes the inference batch size.

Note
This method has several limitations and is not recommended. Instead, set the batch in the input shapes and call ICNNNetwork::reshape.
Parameters
size Size of batch to set
responseDesc Pointer to the response message that holds a description of an error if any occurred
Returns
Status code of the operation
Note
The current implementation sets the batch size to the first dimension of all layers in the network. Before calling it, make sure that all your layers have the batch in the first dimension; otherwise, the method works incorrectly. This limitation is resolved by the shape inference feature via the InferenceEngine::ICNNNetwork::reshape method. For details, refer to the Shape Inference section of the documentation.
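The reshape-based alternative recommended in the note above can be sketched as follows. This assumes the batch is the first dimension of every input and that `InputInfo::getTensorDesc()` is available, as in contemporary Inference Engine releases:

```cpp
#include <ie_icnn_network.hpp>

// Change the batch through shape inference rather than setBatchSize().
InferenceEngine::StatusCode setBatchViaReshape(
        InferenceEngine::ICNNNetwork &network, size_t batch) {
    InferenceEngine::InputsDataMap inputs;
    network.getInputsInfo(inputs);
    InferenceEngine::ICNNNetwork::InputShapes shapes;
    for (const auto &item : inputs) {
        InferenceEngine::SizeVector dims =
            item.second->getTensorDesc().getDims();
        if (!dims.empty()) dims[0] = batch;  // batch assumed to be dim 0
        shapes[item.first] = dims;
    }
    InferenceEngine::ResponseDesc resp;
    return network.reshape(shapes, &resp);
}
```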

§ setTargetDevice()

virtual void InferenceEngine::ICNNNetwork::setTargetDevice ( TargetDevice  device )
pure virtual noexcept

Sets the desired device to perform all work on. Some plug-ins might not support certain target devices and may abort execution with an appropriate error message.

Deprecated:
Deprecated since TargetDevice is deprecated. Specify target device in InferenceEngine::Core directly.
Parameters
device Device to set as a target

The documentation for this class was generated from the following file: ie_icnn_network.hpp