InferenceEngine::ICNNNetwork Interface Reference [abstract]

This is the main interface to describe the NN topology. More...

#include <ie_icnn_network.hpp>

Inheritance diagram for InferenceEngine::ICNNNetwork: (diagram omitted)
Collaboration diagram for InferenceEngine::ICNNNetwork: (diagram omitted)

Public Types

using Ptr = std::shared_ptr< ICNNNetwork >
 A shared pointer to an ICNNNetwork interface.
 
using InputShapes = std::map< std::string, SizeVector >
 Map of pairs: name of corresponding data and its dimension.
 

Public Member Functions

virtual std::shared_ptr< ngraph::Function > getFunction () noexcept=0
 Returns nGraph function. More...
 
virtual std::shared_ptr< const ngraph::Function > getFunction () const noexcept=0
 Returns constant nGraph function. More...
 
virtual void getOutputsInfo (OutputsDataMap &out) const noexcept=0
 Gets the network output Data node information. The received info is stored in the given OutputsDataMap object. More...
 
virtual void getInputsInfo (InputsDataMap &inputs) const noexcept=0
 Gets the network input Data node information. The received info is stored in the given InputsDataMap object. More...
 
virtual InputInfo::Ptr getInput (const std::string &inputName) const noexcept=0
 Returns information on a certain input specified by inputName. More...
 
virtual const std::string & getName () const noexcept=0
 Returns the network name. More...
 
virtual size_t layerCount () const noexcept=0
 Returns the number of layers in the network as an integer value. More...
 
virtual StatusCode addOutput (const std::string &layerName, size_t outputIndex=0, ResponseDesc *resp=nullptr) noexcept=0
 Adds output to the layer. More...
 
virtual StatusCode setBatchSize (size_t size, ResponseDesc *responseDesc) noexcept=0
 Changes the inference batch size. More...
 
virtual size_t getBatchSize () const noexcept=0
 Gets the inference batch size. More...
 
virtual StatusCode reshape (const InputShapes &inputShapes, ResponseDesc *resp) noexcept
 Run shape inference with new input shapes for the network. More...
 
virtual StatusCode serialize (const std::string &xmlPath, const std::string &binPath, ResponseDesc *resp) const noexcept=0
 Serialize network to IR and weights files. More...
 
virtual ~ICNNNetwork ()
 A virtual destructor.
 

Detailed Description

This is the main interface to describe the NN topology.

Member Function Documentation

◆ addOutput()

virtual StatusCode InferenceEngine::ICNNNetwork::addOutput (const std::string & layerName, size_t outputIndex = 0, ResponseDesc * resp = nullptr) noexcept
pure virtual

Adds output to the layer.

Parameters
    layerName    Name of the layer
    outputIndex  Index of the output
    resp         Response message
Returns
Status code of the operation

◆ getBatchSize()

virtual size_t InferenceEngine::ICNNNetwork::getBatchSize () const noexcept
pure virtual

Gets the inference batch size.

Returns
The size of batch as a size_t value

◆ getFunction() [1/2]

virtual std::shared_ptr<const ngraph::Function> InferenceEngine::ICNNNetwork::getFunction () const noexcept
pure virtual

Returns constant nGraph function.

Returns
constant nGraph function

◆ getFunction() [2/2]

virtual std::shared_ptr<ngraph::Function> InferenceEngine::ICNNNetwork::getFunction () noexcept
pure virtual

Returns nGraph function.

Returns
nGraph function
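
A defensive pattern for getFunction(): the returned pointer may be empty (for example, for a network without an nGraph representation), so client code should check it before touching the graph. The Function struct and the hasNgraphRepresentation helper below are stand-ins for illustration, not part of the API:

```cpp
#include <memory>

// Stand-in for ngraph::Function; the real type comes from the nGraph headers.
struct Function {};

// Hypothetical helper: branch on the result of getFunction() before use.
bool hasNgraphRepresentation(const std::shared_ptr<const Function>& fn) {
    return fn != nullptr;
}
```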

◆ getInput()

virtual InputInfo::Ptr InferenceEngine::ICNNNetwork::getInput (const std::string & inputName) const noexcept
pure virtual

Returns information on a certain input specified by inputName.

Parameters
    inputName  Name of the input layer to get info on
Returns
A smart pointer to the input information

◆ getInputsInfo()

virtual void InferenceEngine::ICNNNetwork::getInputsInfo (InputsDataMap & inputs) const noexcept
pure virtual

Gets the network input Data node information. The received info is stored in the given InputsDataMap object.

Applicable for networks with a single input as well as multiple inputs. This method needs to be called to find out the input names, which are used later when calling InferenceEngine::InferRequest::SetBlob

Parameters
    inputs  Reference to the InputsDataMap object

◆ getName()

virtual const std::string& InferenceEngine::ICNNNetwork::getName () const noexcept
pure virtual

Returns the network name.

Returns
Network name

◆ getOutputsInfo()

virtual void InferenceEngine::ICNNNetwork::getOutputsInfo (OutputsDataMap & out) const noexcept
pure virtual

Gets the network output Data node information. The received info is stored in the given OutputsDataMap object.

Applicable for networks with a single output as well as multiple outputs.

This method needs to be called to find out the output names, which are used later when calling InferenceEngine::InferRequest::GetBlob or InferenceEngine::InferRequest::SetBlob

Parameters
    out  Reference to the OutputsDataMap object
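
The two query methods getInputsInfo() and getOutputsInfo() are typically used together to discover the I/O names before calling SetBlob/GetBlob. The sketch below mimics that call pattern with minimal stand-in types; StubNetwork and ioNames are illustrative, not part of the Inference Engine API:

```cpp
#include <map>
#include <memory>
#include <string>
#include <vector>

// Minimal stand-ins for the Inference Engine types used below; the real
// definitions live in <ie_icnn_network.hpp> and related headers.
struct InputInfo { using Ptr = std::shared_ptr<InputInfo>; };
struct Data {};
using InputsDataMap  = std::map<std::string, InputInfo::Ptr>;
using OutputsDataMap = std::map<std::string, std::shared_ptr<Data>>;

// Hypothetical network exposing the two query methods documented above.
struct StubNetwork {
    void getInputsInfo(InputsDataMap& inputs) const noexcept {
        inputs["data"] = std::make_shared<InputInfo>();
    }
    void getOutputsInfo(OutputsDataMap& out) const noexcept {
        out["prob"] = std::make_shared<Data>();
    }
};

// Collect the I/O names exactly as client code would before calling
// InferRequest::SetBlob / InferRequest::GetBlob.
std::vector<std::string> ioNames(const StubNetwork& net) {
    InputsDataMap inputs;
    OutputsDataMap outputs;
    net.getInputsInfo(inputs);
    net.getOutputsInfo(outputs);
    std::vector<std::string> names;
    for (const auto& p : inputs)  names.push_back(p.first);
    for (const auto& p : outputs) names.push_back(p.first);
    return names;
}
```

Note that both methods write into a caller-provided map rather than returning one, so an empty map must be created first.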

◆ layerCount()

virtual size_t InferenceEngine::ICNNNetwork::layerCount () const noexcept
pure virtual

Returns the number of layers in the network as an integer value.

Returns
The number of layers as an integer value

◆ reshape()

virtual StatusCode InferenceEngine::ICNNNetwork::reshape (const InputShapes & inputShapes, ResponseDesc * resp) noexcept
inline virtual

Run shape inference with new input shapes for the network.

Parameters
    inputShapes  Map of pairs: name of the corresponding data and its new dimensions
    resp         Pointer to the response message that holds a description of an error if any occurred
Returns
Status code of the operation
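
To illustrate the call shape of reshape(), here is a minimal stand-in that accepts a map from input names to new dimensions. The StubNetwork type and its error handling are assumptions for illustration only; the real method also runs shape inference through the whole network:

```cpp
#include <cstddef>
#include <map>
#include <string>
#include <vector>

// Stand-ins for the Inference Engine aliases used by reshape(); the real
// ones come from <ie_icnn_network.hpp>.
using SizeVector  = std::vector<std::size_t>;
using InputShapes = std::map<std::string, SizeVector>;
enum class StatusCode { OK, GENERAL_ERROR };

// Hypothetical network that accepts a new shape per named input.
struct StubNetwork {
    InputShapes shapes{{"data", {1, 3, 224, 224}}};
    StatusCode reshape(const InputShapes& newShapes) noexcept {
        for (const auto& p : newShapes) {
            auto it = shapes.find(p.first);
            if (it == shapes.end()) return StatusCode::GENERAL_ERROR;
            it->second = p.second;  // real shape inference would run here
        }
        return StatusCode::OK;
    }
};
```

A caller builds the InputShapes map with the names obtained from getInputsInfo() and checks the returned status code.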

◆ serialize()

virtual StatusCode InferenceEngine::ICNNNetwork::serialize (const std::string & xmlPath, const std::string & binPath, ResponseDesc * resp) const noexcept
pure virtual

Serialize network to IR and weights files.

Parameters
    xmlPath  Path to the output IR file
    binPath  Path to the output weights file
    resp     Pointer to the response message that holds a description of an error if any occurred
Returns
Status code of the operation
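
By convention the IR xml and the weights bin share a basename (model.xml / model.bin), so the second argument is often derived from the first. A small helper sketch; weightsPathFor is a hypothetical name, not part of the API:

```cpp
#include <string>

// Derive the conventional weights-file path from the IR path, assuming the
// usual model.xml / model.bin naming convention.
std::string weightsPathFor(const std::string& xmlPath) {
    const std::string ext = ".xml";
    if (xmlPath.size() >= ext.size() &&
        xmlPath.compare(xmlPath.size() - ext.size(), ext.size(), ext) == 0)
        return xmlPath.substr(0, xmlPath.size() - ext.size()) + ".bin";
    return xmlPath + ".bin";  // no .xml suffix: just append .bin
}
```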

◆ setBatchSize()

virtual StatusCode InferenceEngine::ICNNNetwork::setBatchSize (size_t size, ResponseDesc * responseDesc) noexcept
pure virtual

Changes the inference batch size.

Note
This method has several limitations, and its use is not recommended. Instead, set the batch in the input shape and call ICNNNetwork::reshape.
Parameters
    size          Size of the batch to set
    responseDesc  Pointer to the response message that holds a description of an error if any occurred
Returns
Status code of the operation
Note
The current implementation sets the batch size as the first dimension of all layers in the network. Before calling this method, make sure that all your layers have the batch in the first dimension; otherwise the method works incorrectly. This limitation is resolved by the shape inference feature via the InferenceEngine::ICNNNetwork::reshape method. For details, refer to the Shape Inference section of the documentation.
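
The recommended alternative from the note, patching the batch dimension and calling reshape(), boils down to replacing the first element of each input shape. A minimal sketch; withBatch is a hypothetical helper, not part of the API:

```cpp
#include <cstddef>
#include <vector>

using SizeVector = std::vector<std::size_t>;

// Replace the first (batch) dimension of a shape. The result would then be
// put into an InputShapes map and passed to ICNNNetwork::reshape().
SizeVector withBatch(SizeVector shape, std::size_t batch) {
    if (!shape.empty()) shape[0] = batch;
    return shape;
}
```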

The documentation for this interface was generated from the following file: ie_icnn_network.hpp