InferenceEngine::ExecutableNetworkInternal Class Reference

Minimum implementation of the IExecutableNetworkInternal interface. Must not be used as a base class in plugins; as base classes, use ExecutableNetworkThreadSafeDefault or ExecutableNetworkThreadSafeAsyncOnly instead. More...

#include <ie_executable_network_internal.hpp>

Inheritance diagram for InferenceEngine::ExecutableNetworkInternal:
Inherits InferenceEngine::IExecutableNetworkInternal. Inherited by InferenceEngine::ExecutableNetworkThreadSafeAsyncOnly and InferenceEngine::ExecutableNetworkThreadSafeDefault.

Public Types

typedef std::shared_ptr< ExecutableNetworkInternal > Ptr
 A shared pointer to ExecutableNetworkInternal object.
 
- Public Types inherited from InferenceEngine::IExecutableNetworkInternal
typedef std::shared_ptr< IExecutableNetworkInternal > Ptr
 A shared pointer to IExecutableNetworkInternal interface.
 

Public Member Functions

virtual void setNetworkInputs (const InferenceEngine::InputsDataMap networkInputs)
 Sets the network inputs info. More...
 
virtual void setNetworkOutputs (const InferenceEngine::OutputsDataMap networkOutputs)
 Sets the network outputs data. More...
 
ConstOutputsDataMap GetOutputsInfo () const override
Gets the executable network output Data node information as a ConstOutputsDataMap. This method needs to be called to find output names for use later when filling the map of blobs passed to InferenceEngine::IInferencePlugin::Infer(). More...
 
ConstInputsDataMap GetInputsInfo () const override
Gets the executable network input Data node information as a ConstInputsDataMap. This method needs to be called to find out input names for use later when filling the map of blobs passed to InferenceEngine::IInferencePlugin::Infer(). More...
 
void Export (const std::string &modelFileName) override
Exports the currently created executable network so that it can be used later with the Import() main API. More...
 
void Export (std::ostream &networkModel) override
Exports the currently created executable network so that it can be used later with the Import() main API. More...
 
void GetExecGraphInfo (ICNNNetwork::Ptr &graphPtr) override
 Get executable graph information from a device. More...
 
void SetPointerToPlugin (IInferencePlugin::Ptr plugin)
 Sets the pointer to plugin internal. More...
 
std::vector< IMemoryStateInternal::Ptr > QueryState () override
 Queries memory states. More...
 
void SetConfig (const std::map< std::string, Parameter > &config, ResponseDesc *) override
 Sets configuration for current executable network. More...
 
void GetConfig (const std::string &, Parameter &, ResponseDesc *) const override
 Gets configuration dedicated to plugin behaviour. More...
 
void GetMetric (const std::string &, Parameter &, ResponseDesc *) const override
 Gets general runtime metric for dedicated hardware. More...
 
void GetContext (RemoteContext::Ptr &, ResponseDesc *) const override
 Gets the remote context. More...
 
- Public Member Functions inherited from InferenceEngine::IExecutableNetworkInternal
virtual ~IExecutableNetworkInternal ()=default
 Destroys the object.
 
virtual void CreateInferRequest (IInferRequest::Ptr &req)=0
Creates an inference request object used to infer the network. Note: the returned request has allocated input and output blobs (these can be changed later). More...
 

Protected Member Functions

virtual void ExportImpl (std::ostream &networkModel)
 Exports an internal hardware-dependent model to a stream. More...
 

Protected Attributes

InferenceEngine::InputsDataMap _networkInputs
Holds information about the network inputs.
 
InferenceEngine::OutputsDataMap _networkOutputs
 Holds information about network outputs data.
 
IInferencePlugin::Ptr _plugin
 A pointer to a IInferencePlugin interface. More...
 

Detailed Description

Minimum implementation of the IExecutableNetworkInternal interface. Must not be used as a base class in plugins; as base classes, use ExecutableNetworkThreadSafeDefault or ExecutableNetworkThreadSafeAsyncOnly instead.
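
The following sketch illustrates the recommended pattern: a plugin-side executable network derives from ExecutableNetworkThreadSafeDefault rather than from this class. The class name TemplateExecutableNetwork is a hypothetical placeholder, the header path may differ between releases, and the CreateInferRequestImpl signature is assumed to match the hook declared by that base class.

    #include <cpp_interfaces/impl/ie_executable_network_thread_safe_default.hpp>
    #include <memory>

    // Hypothetical plugin-side executable network. The thread-safe base class
    // implements CreateInferRequest(); the plugin only supplies the synchronous
    // request implementation through CreateInferRequestImpl().
    class TemplateExecutableNetwork : public InferenceEngine::ExecutableNetworkThreadSafeDefault {
    public:
        InferenceEngine::InferRequestInternal::Ptr CreateInferRequestImpl(
                InferenceEngine::InputsDataMap networkInputs,
                InferenceEngine::OutputsDataMap networkOutputs) override {
            // A real plugin returns its InferRequestInternal implementation here;
            // nullptr is only a placeholder for this sketch.
            return nullptr;
        }
    };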

Member Function Documentation

◆ Export() [1/2]

void InferenceEngine::ExecutableNetworkInternal::Export ( const std::string &  modelFileName)
inline override virtual

Exports the currently created executable network so that it can be used later with the Import() main API.

Parameters
modelFileName - Path to the location of the exported file

Implements InferenceEngine::IExecutableNetworkInternal.

◆ Export() [2/2]

void InferenceEngine::ExecutableNetworkInternal::Export ( std::ostream &  networkModel)
inline override virtual

Exports the currently created executable network so that it can be used later with the Import() main API.

Parameters
networkModel - Reference to the network model output stream

Implements InferenceEngine::IExecutableNetworkInternal.
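
A minimal usage sketch: exporting a compiled network to a stream and to a file. The exeNet variable stands for a hypothetical ExecutableNetworkInternal::Ptr obtained from a plugin, the file name is arbitrary, and the export may fail if the plugin does not override ExportImpl().

    #include <fstream>

    std::ofstream modelFile("exported_model.blob", std::ios::binary);  // hypothetical file name
    exeNet->Export(modelFile);              // stream overload
    exeNet->Export("exported_model.blob");  // file-name overload opens the file itself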

◆ ExportImpl()

virtual void InferenceEngine::ExecutableNetworkInternal::ExportImpl ( std::ostream &  networkModel)
inline protected virtual

Exports an internal hardware-dependent model to a stream.

Note
The function is called from ExecutableNetworkInternal::Export(std::ostream&), which performs the common export steps first and then calls this plugin-dependent implementation.
Parameters
networkModel - A stream to export the network to
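
A sketch of the intended override point inside a hypothetical plugin class; the serialization format, the magic tag, and the _compiledBlob member are all invented for illustration.

    // Called by ExecutableNetworkInternal::Export(std::ostream&) after the common part.
    void ExportImpl(std::ostream& networkModel) override {
        networkModel << "TEMPLATE_BLOB_V1\n";  // hypothetical format tag
        // _compiledBlob: hypothetical std::vector<char> with the device-specific binary.
        networkModel.write(_compiledBlob.data(), static_cast<std::streamsize>(_compiledBlob.size()));
    }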

◆ GetConfig()

void InferenceEngine::ExecutableNetworkInternal::GetConfig ( const std::string &  name,
Parameter &  result,
ResponseDesc *  resp 
) const
inline override virtual

Gets configuration dedicated to plugin behaviour.

Parameters
name - Config key, can be found in ie_plugin_config.hpp
result - Value of config corresponding to the config key
resp - Pointer to the response message that holds a description of an error, if any occurred

Implements InferenceEngine::IExecutableNetworkInternal.
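
A usage sketch, assuming a hypothetical exeNet instance and the standard CONFIG_KEY/CONFIG_VALUE macros from ie_plugin_config.hpp; which keys are actually supported is plugin-specific, and the default implementation in this class may not support the query at all.

    #include <ie_plugin_config.hpp>

    InferenceEngine::Parameter value;
    InferenceEngine::ResponseDesc resp;
    exeNet->GetConfig(CONFIG_KEY(PERF_COUNT), value, &resp);
    std::string perfCount = value.as<std::string>();  // typically CONFIG_VALUE(YES) or CONFIG_VALUE(NO)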

◆ GetContext()

void InferenceEngine::ExecutableNetworkInternal::GetContext ( RemoteContext::Ptr &  pContext,
ResponseDesc *  resp 
) const
inline override virtual

Gets the remote context.

Parameters
pContext - A reference to a remote context
resp - Pointer to the response message that holds a description of an error, if any occurred

Implements InferenceEngine::IExecutableNetworkInternal.

◆ GetExecGraphInfo()

void InferenceEngine::ExecutableNetworkInternal::GetExecGraphInfo ( ICNNNetwork::Ptr &  graphPtr)
inline override virtual

Get executable graph information from a device.

Parameters
graphPtr - Network pointer to store the executable graph information

Implements InferenceEngine::IExecutableNetworkInternal.
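
A brief sketch with a hypothetical exeNet instance; support for execution graph reporting depends on the plugin. The serialize() call is part of the ICNNNetwork interface and dumps the graph to IR files for offline inspection; the file names are arbitrary.

    InferenceEngine::ICNNNetwork::Ptr execGraph;
    exeNet->GetExecGraphInfo(execGraph);
    execGraph->serialize("exec_graph.xml", "exec_graph.bin", nullptr);  // dump for inspection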

◆ GetInputsInfo()

ConstInputsDataMap InferenceEngine::ExecutableNetworkInternal::GetInputsInfo ( ) const
inline override virtual

Gets the executable network input Data node information as a ConstInputsDataMap. This method needs to be called to find out input names for use later when filling the map of blobs passed to InferenceEngine::IInferencePlugin::Infer().

Returns
A ConstInputsDataMap object holding information about the network inputs.

Implements InferenceEngine::IExecutableNetworkInternal.

◆ GetMetric()

void InferenceEngine::ExecutableNetworkInternal::GetMetric ( const std::string &  name,
Parameter &  result,
ResponseDesc *  resp 
) const
inline override virtual

Gets general runtime metric for dedicated hardware.

Parameters
name - Metric name to request
result - Metric value corresponding to the metric key
resp - Pointer to the response message that holds a description of an error, if any occurred

Implements InferenceEngine::IExecutableNetworkInternal.
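
A usage sketch, assuming the standard EXEC_NETWORK_METRIC_KEY macros from ie_plugin_config.hpp and a hypothetical exeNet instance; the exact set of metrics is reported by each plugin.

    #include <ie_plugin_config.hpp>

    InferenceEngine::Parameter metric;
    InferenceEngine::ResponseDesc resp;
    exeNet->GetMetric(EXEC_NETWORK_METRIC_KEY(NETWORK_NAME), metric, &resp);
    std::string networkName = metric.as<std::string>();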

◆ GetOutputsInfo()

ConstOutputsDataMap InferenceEngine::ExecutableNetworkInternal::GetOutputsInfo ( ) const
inline override virtual

Gets the executable network output Data node information as a ConstOutputsDataMap. This method needs to be called to find output names for use later when filling the map of blobs passed to InferenceEngine::IInferencePlugin::Infer().

Returns
A ConstOutputsDataMap object holding information about the network outputs.

Implements InferenceEngine::IExecutableNetworkInternal.
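
A sketch of the discovery step described above for both inputs and outputs: the map keys are the names later used when filling the blob map for inference. The exeNet instance is hypothetical; blob creation itself is omitted.

    #include <iostream>

    InferenceEngine::ConstInputsDataMap inputs = exeNet->GetInputsInfo();
    InferenceEngine::ConstOutputsDataMap outputs = exeNet->GetOutputsInfo();
    for (const auto& in : inputs)
        std::cout << "input:  " << in.first << std::endl;   // name used as the blob-map key
    for (const auto& out : outputs)
        std::cout << "output: " << out.first << std::endl;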

◆ QueryState()

std::vector<IMemoryStateInternal::Ptr> InferenceEngine::ExecutableNetworkInternal::QueryState ( )
inline override virtual

Queries memory states.

Returns
A vector of memory states

Implements InferenceEngine::IExecutableNetworkInternal.
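
A sketch with a hypothetical exeNet instance, assuming IMemoryStateInternal exposes GetName() and Reset() analogously to the public IMemoryState interface; networks without stateful layers simply return an empty vector.

    #include <iostream>

    for (const auto& state : exeNet->QueryState()) {
        std::cout << "memory state: " << state->GetName() << std::endl;  // assumed accessor
        state->Reset();  // assumed to restore the initial state value
    }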

◆ SetConfig()

void InferenceEngine::ExecutableNetworkInternal::SetConfig ( const std::map< std::string, Parameter > &  config,
ResponseDesc *  resp 
) 
inline override virtual

Sets configuration for current executable network.

Parameters
config - Map of pairs: (config parameter name, config parameter value)
resp - Pointer to the response message that holds a description of an error, if any occurred

Implements InferenceEngine::IExecutableNetworkInternal.
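
A usage sketch with the standard CONFIG_KEY/CONFIG_VALUE macros; exeNet is hypothetical, and whether a key can still be changed after the network is loaded is up to the derived plugin class (the base implementation may reject configuration entirely).

    #include <ie_plugin_config.hpp>
    #include <map>
    #include <string>

    std::map<std::string, InferenceEngine::Parameter> config = {
        {CONFIG_KEY(PERF_COUNT), std::string(CONFIG_VALUE(YES))}  // support is plugin-specific
    };
    InferenceEngine::ResponseDesc resp;
    exeNet->SetConfig(config, &resp);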

◆ setNetworkInputs()

virtual void InferenceEngine::ExecutableNetworkInternal::setNetworkInputs ( const InferenceEngine::InputsDataMap  networkInputs)
inline virtual

Sets the network inputs info.

Parameters
[in] networkInputs - The network inputs info

◆ setNetworkOutputs()

virtual void InferenceEngine::ExecutableNetworkInternal::setNetworkOutputs ( const InferenceEngine::OutputsDataMap  networkOutputs)
inline virtual

Sets the network outputs data.

Parameters
[in] networkOutputs - The network outputs

◆ SetPointerToPlugin()

void InferenceEngine::ExecutableNetworkInternal::SetPointerToPlugin ( IInferencePlugin::Ptr  plugin)
inline

Sets the pointer to plugin internal.

Parameters
[in] plugin - The plugin
Note
Needed to correctly handle ownership between objects.
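
A sketch of how a plugin typically wires these setters together when it builds an executable network, for example inside its LoadExeNetworkImpl. TemplateExecutableNetwork is hypothetical, getInputsInfo()/getOutputsInfo() follow the ICNNNetwork out-parameter convention, and shared_from_this() assumes the plugin class enables shared ownership of itself.

    // Inside a hypothetical plugin, after compiling `network` (an ICNNNetwork):
    InferenceEngine::InputsDataMap networkInputs;
    InferenceEngine::OutputsDataMap networkOutputs;
    network.getInputsInfo(networkInputs);
    network.getOutputsInfo(networkOutputs);

    auto exeNetwork = std::make_shared<TemplateExecutableNetwork>(/* compiled model state */);
    exeNetwork->setNetworkInputs(networkInputs);
    exeNetwork->setNetworkOutputs(networkOutputs);
    exeNetwork->SetPointerToPlugin(shared_from_this());  // keeps the plugin alive while the network exists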

Field Documentation

◆ _plugin

IInferencePlugin::Ptr InferenceEngine::ExecutableNetworkInternal::_plugin
protected

A pointer to a IInferencePlugin interface.

Note
Needed to correctly handle ownership between objects.

The documentation for this class was generated from the following file:
ie_executable_network_internal.hpp