class InferenceEngine::IExecutableNetwork

Overview

This is an interface of an executable network.

#include <ie_iexecutable_network.hpp>

class IExecutableNetwork: public std::enable_shared_from_this< IExecutableNetwork >
{
public:
    // typedefs

    typedef std::shared_ptr<IExecutableNetwork> Ptr;

    // methods

    virtual StatusCode GetOutputsInfo(
        ConstOutputsDataMap& out,
        ResponseDesc* resp
        ) const = 0;

    virtual StatusCode GetInputsInfo(
        ConstInputsDataMap& inputs,
        ResponseDesc* resp
        ) const = 0;

    virtual StatusCode CreateInferRequest(
        IInferRequest::Ptr& req,
        ResponseDesc* resp
        ) = 0;

    virtual StatusCode Export(
        const std::string& modelFileName,
        ResponseDesc* resp
        ) = 0;

    virtual StatusCode Export(std::ostream& networkModel, ResponseDesc* resp) = 0;

    virtual StatusCode GetExecGraphInfo(
        ICNNNetwork::Ptr& graphPtr,
        ResponseDesc* resp
        ) = 0;

    virtual StatusCode SetConfig(
        const std::map<std::string, Parameter>& config,
        ResponseDesc* resp
        ) = 0;

    virtual StatusCode GetConfig(
        const std::string& name,
        Parameter& result,
        ResponseDesc* resp
        ) const = 0;

    virtual StatusCode GetMetric(
        const std::string& name,
        Parameter& result,
        ResponseDesc* resp
        ) const = 0;

    virtual StatusCode GetContext(
        RemoteContext::Ptr& pContext,
        ResponseDesc* resp
        ) const = 0;

protected:
};

Detailed Documentation

This is an interface of an executable network.

Typedefs

typedef std::shared_ptr<IExecutableNetwork> Ptr

A smart pointer to the current IExecutableNetwork object.

Methods

virtual StatusCode GetOutputsInfo(
    ConstOutputsDataMap& out,
    ResponseDesc* resp
    ) const = 0

Gets the executable network output Data node information.

The received info is stored in the given InferenceEngine::ConstOutputsDataMap object. This method needs to be called to find output names, which can then be used when calling InferenceEngine::InferRequest::GetBlob or InferenceEngine::InferRequest::SetBlob.

Parameters:

out

Reference to the InferenceEngine::ConstOutputsDataMap object

resp

Optional: pointer to an already allocated object to contain information in case of failure

Returns:

Status code of the operation: InferenceEngine::OK (0) for success
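
A minimal usage sketch (not part of the header), assuming exeNetwork is an already created IExecutableNetwork::Ptr and omitting the usual Inference Engine and standard includes:

// Illustrative sketch: exeNetwork is a hypothetical, valid IExecutableNetwork::Ptr
InferenceEngine::ConstOutputsDataMap outputsInfo;
InferenceEngine::ResponseDesc resp;
InferenceEngine::StatusCode sts = exeNetwork->GetOutputsInfo(outputsInfo, &resp);
if (sts == InferenceEngine::OK) {
    for (const auto& output : outputsInfo) {
        // output.first is the name to pass to InferRequest::GetBlob / SetBlob
        std::cout << "Output: " << output.first << std::endl;
    }
} else {
    std::cerr << "GetOutputsInfo failed: " << resp.msg << std::endl;
}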

virtual StatusCode GetInputsInfo(
    ConstInputsDataMap& inputs,
    ResponseDesc* resp
    ) const = 0

Gets the executable network input Data node information.

The received info is stored in the given InferenceEngine::ConstInputsDataMap object. This method needs to be called to find input names, which can then be used when calling InferenceEngine::InferRequest::SetBlob.

Parameters:

inputs

Reference to InferenceEngine::ConstInputsDataMap object.

resp

Optional: pointer to an already allocated object to contain information in case of failure

Returns:

Status code of the operation: InferenceEngine::OK (0) for success
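
An illustrative sketch under the same assumptions (exeNetwork is a hypothetical, valid IExecutableNetwork::Ptr):

// Illustrative sketch: enumerate input names of the compiled network
InferenceEngine::ConstInputsDataMap inputsInfo;
InferenceEngine::ResponseDesc resp;
if (exeNetwork->GetInputsInfo(inputsInfo, &resp) == InferenceEngine::OK) {
    for (const auto& input : inputsInfo) {
        // input.first is the name to pass to InferRequest::SetBlob
        std::cout << "Input: " << input.first << std::endl;
    }
}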

virtual StatusCode CreateInferRequest(
    IInferRequest::Ptr& req,
    ResponseDesc* resp
    ) = 0

Creates an inference request object used to infer the network.

The created request has allocated input and output blobs (that can be changed later).

Parameters:

req

Shared pointer to the created request object

resp

Optional: pointer to an already allocated object to contain information in case of failure

Returns:

Status code of the operation: InferenceEngine::OK (0) for success
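
An illustrative sketch of creating a request and running a blocking inference on it (exeNetwork is a hypothetical, valid IExecutableNetwork::Ptr):

// Illustrative sketch: the engine pre-allocates input/output blobs for the request
InferenceEngine::IInferRequest::Ptr request;
InferenceEngine::ResponseDesc resp;
InferenceEngine::StatusCode sts = exeNetwork->CreateInferRequest(request, &resp);
if (sts == InferenceEngine::OK && request) {
    sts = request->Infer(&resp);  // synchronous inference on the allocated blobs
}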

virtual StatusCode Export(
    const std::string& modelFileName,
    ResponseDesc* resp
    ) = 0

Exports the current executable network.

Parameters:

modelFileName

Full path to the location of the exported file

resp

Optional: pointer to an already allocated object to contain information in case of failure

Returns:

Status code of the operation: InferenceEngine::OK (0) for success

See also:

Core::ImportNetwork
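
An illustrative sketch; the file name is arbitrary and exeNetwork is a hypothetical, valid IExecutableNetwork::Ptr:

// Illustrative sketch: export the compiled network so it can be re-imported later
InferenceEngine::ResponseDesc resp;
InferenceEngine::StatusCode sts = exeNetwork->Export("compiled_model.blob", &resp);
if (sts != InferenceEngine::OK) {
    std::cerr << "Export failed: " << resp.msg << std::endl;
}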

virtual StatusCode Export(std::ostream& networkModel, ResponseDesc* resp) = 0

Exports the current executable network.

Parameters:

networkModel

Network model output stream

resp

Optional: pointer to an already allocated object to contain information in case of failure

Returns:

Status code of the operation: InferenceEngine::OK (0) for success

See also:

Core::ImportNetwork
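
An illustrative sketch that writes the exported model into an in-memory stream (exeNetwork is a hypothetical, valid IExecutableNetwork::Ptr; <sstream> is assumed to be included):

// Illustrative sketch: export to a std::stringstream instead of a file
std::stringstream modelStream;
InferenceEngine::ResponseDesc resp;
if (exeNetwork->Export(modelStream, &resp) == InferenceEngine::OK) {
    // modelStream now holds the exported model, e.g. for Core::ImportNetwork
}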

virtual StatusCode GetExecGraphInfo(
    ICNNNetwork::Ptr& graphPtr,
    ResponseDesc* resp
    ) = 0

Gets executable graph information from a device.

Deprecated: Use InferenceEngine::ExecutableNetwork::GetExecGraphInfo instead.

Parameters:

graphPtr

Network pointer to store executable graph information

resp

Optional: pointer to an already allocated object to contain information in case of failure

Returns:

Status code of the operation: InferenceEngine::OK (0) for success
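
An illustrative sketch (exeNetwork is a hypothetical, valid IExecutableNetwork::Ptr; new code should prefer the deprecated method's replacement noted above):

// Illustrative sketch: retrieve the runtime graph and print its name
InferenceEngine::ICNNNetwork::Ptr execGraph;
InferenceEngine::ResponseDesc resp;
if (exeNetwork->GetExecGraphInfo(execGraph, &resp) == InferenceEngine::OK && execGraph) {
    std::cout << "Runtime graph: " << execGraph->getName() << std::endl;
}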

virtual StatusCode SetConfig(
    const std::map<std::string, Parameter>& config,
    ResponseDesc* resp
    ) = 0

Sets configuration for the current executable network.

Parameters:

config

Map of pairs: (config parameter name, config parameter value)

resp

Pointer to the response message that holds a description of an error if any occurred

Returns:

Status code of the operation: InferenceEngine::OK if succeeded
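
An illustrative sketch; the PERF_COUNT key/value constants are assumed to come from ie_plugin_config.hpp, whether a given key is accepted depends on the plugin, and exeNetwork is a hypothetical, valid IExecutableNetwork::Ptr:

// Illustrative sketch: enable performance counters via a configuration key
std::map<std::string, InferenceEngine::Parameter> config = {
    {InferenceEngine::PluginConfigParams::KEY_PERF_COUNT, InferenceEngine::PluginConfigParams::YES}
};
InferenceEngine::ResponseDesc resp;
if (exeNetwork->SetConfig(config, &resp) != InferenceEngine::OK) {
    std::cerr << "SetConfig failed: " << resp.msg << std::endl;
}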

virtual StatusCode GetConfig(
    const std::string& name,
    Parameter& result,
    ResponseDesc* resp
    ) const = 0

Gets configuration for the current executable network.

The method extracts information that affects executable network execution. The list of supported configuration keys can be obtained via ExecutableNetwork::GetMetric with the SUPPORTED_CONFIG_KEYS key. Note that some of these keys cannot be changed dynamically; for example, DEVICE_ID cannot be changed if the executable network has already been compiled for a particular device.

Parameters:

name

Config key; can be found in ie_plugin_config.hpp

result

Value of config corresponding to the config key

resp

Pointer to the response message that holds a description of an error if any occurred

Returns:

Status code of the operation: InferenceEngine::OK if succeeded
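
An illustrative sketch that reads back a single configuration value by key (the PERF_COUNT key is assumed from ie_plugin_config.hpp; exeNetwork is a hypothetical, valid IExecutableNetwork::Ptr):

// Illustrative sketch: query the current value of a configuration key
InferenceEngine::Parameter value;
InferenceEngine::ResponseDesc resp;
if (exeNetwork->GetConfig(InferenceEngine::PluginConfigParams::KEY_PERF_COUNT, value, &resp) == InferenceEngine::OK) {
    std::cout << "PERF_COUNT = " << value.as<std::string>() << std::endl;
}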

virtual StatusCode GetMetric(
    const std::string& name,
    Parameter& result,
    ResponseDesc* resp
    ) const = 0

Gets general runtime metric for an executable network.

Examples include the network name, the actual device ID on which the executable network is running, and other properties that cannot be changed dynamically.

Parameters:

name

Metric name to request

result

Metric value corresponding to the metric key

resp

Pointer to the response message that holds a description of an error if any occurred

Returns:

Status code of the operation: InferenceEngine::OK if succeeded
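
An illustrative sketch that queries the network name metric (the METRIC_KEY(NETWORK_NAME) macro is assumed to be available from ie_plugin_config.hpp; exeNetwork is a hypothetical, valid IExecutableNetwork::Ptr):

// Illustrative sketch: read a runtime metric of the compiled network
InferenceEngine::Parameter metric;
InferenceEngine::ResponseDesc resp;
if (exeNetwork->GetMetric(METRIC_KEY(NETWORK_NAME), metric, &resp) == InferenceEngine::OK) {
    std::cout << "Network name: " << metric.as<std::string>() << std::endl;
}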

virtual StatusCode GetContext(
    RemoteContext::Ptr& pContext,
    ResponseDesc* resp
    ) const = 0

Gets shared context used to create an executable network.

Parameters:

pContext

Reference to a shared pointer that will receive the resulting context object

resp

Pointer to the response message that holds a description of an error if any occurred

Returns:

Status code of the operation: InferenceEngine::OK if succeeded
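
An illustrative sketch, only meaningful if the network was created with a remote context (exeNetwork is a hypothetical, valid IExecutableNetwork::Ptr):

// Illustrative sketch: retrieve the shared remote context and print its device name
InferenceEngine::RemoteContext::Ptr context;
InferenceEngine::ResponseDesc resp;
if (exeNetwork->GetContext(context, &resp) == InferenceEngine::OK && context) {
    std::cout << "Context device: " << context->getDeviceName() << std::endl;
}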