interface InferenceEngine::ILayerExecImpl

Overview

This class provides an interface for implementations with custom execution code.

#include <ie_iextension.h>

class ILayerExecImpl: public InferenceEngine::ILayerImpl
{
    // typedefs

    typedef std::shared_ptr<ILayerExecImpl> Ptr;

    // methods

    virtual StatusCode getSupportedConfigurations(
        std::vector<LayerConfig>& conf,
        ResponseDesc* resp
        ) = 0;

    virtual StatusCode init(LayerConfig& config, ResponseDesc* resp) = 0;

    virtual StatusCode execute(
        std::vector<Blob::Ptr>& inputs,
        std::vector<Blob::Ptr>& outputs,
        ResponseDesc* resp
        ) = 0;
};

Inherited Members

public:
    // typedefs

    typedef std::shared_ptr<ILayerImpl> Ptr;

Detailed Documentation

This class provides an interface for implementations with custom execution code.

Deprecated: The Inference Engine API is deprecated and will be removed in the 2024.0 release. For instructions on transitioning to the new API, please refer to https://docs.openvino.ai/latest/openvino_2_0_transition_guide.html
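The skeleton below is a minimal sketch of a custom layer implementation built on this interface. The class name MyReLUImpl, its constructor, and the cached TensorDesc members are assumptions made for this example, not part of the Inference Engine API; only the three overridden methods are required by the interface. Note that ie_iextension.h declares these methods noexcept, so the overrides mirror that.

#include <ie_iextension.h>

#include <vector>

// Hypothetical custom ReLU layer implementation; the class name,
// constructor, and cached descriptors are illustrative only.
class MyReLUImpl: public InferenceEngine::ILayerExecImpl
{
public:
    MyReLUImpl(const InferenceEngine::TensorDesc& inDesc,
               const InferenceEngine::TensorDesc& outDesc)
        : inDesc_(inDesc), outDesc_(outDesc) {}

    InferenceEngine::StatusCode getSupportedConfigurations(
        std::vector<InferenceEngine::LayerConfig>& conf,
        InferenceEngine::ResponseDesc* resp) noexcept override;

    InferenceEngine::StatusCode init(
        InferenceEngine::LayerConfig& config,
        InferenceEngine::ResponseDesc* resp) noexcept override;

    InferenceEngine::StatusCode execute(
        std::vector<InferenceEngine::Blob::Ptr>& inputs,
        std::vector<InferenceEngine::Blob::Ptr>& outputs,
        InferenceEngine::ResponseDesc* resp) noexcept override;

private:
    InferenceEngine::TensorDesc inDesc_;
    InferenceEngine::TensorDesc outDesc_;
};

Method bodies for this sketch are shown with the corresponding method descriptions below.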

Typedefs

typedef std::shared_ptr<ILayerExecImpl> Ptr

A shared pointer to the ILayerExecImpl interface.
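For example, an instance of the hypothetical MyReLUImpl sketched above can be held through this typedef (inDesc and outDesc are assumed TensorDesc objects):

// Store a concrete implementation behind the interface pointer.
InferenceEngine::ILayerExecImpl::Ptr impl =
    std::make_shared<MyReLUImpl>(inDesc, outDesc);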

Methods

virtual StatusCode getSupportedConfigurations(
    std::vector<LayerConfig>& conf,
    ResponseDesc* resp
    ) = 0

Gets all supported configurations for the current layer.

Parameters:

conf - Vector with supported configurations

resp - Response descriptor

Returns:

Status code
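Continuing the hypothetical MyReLUImpl sketch from the overview, one possible implementation reports a single configuration built from the cached tensor descriptors, filling the LayerConfig and DataConfig structures defined in ie_iextension.h:

InferenceEngine::StatusCode MyReLUImpl::getSupportedConfigurations(
    std::vector<InferenceEngine::LayerConfig>& conf,
    InferenceEngine::ResponseDesc* resp) noexcept
{
    // One input and one output edge, described by the cached descriptors.
    InferenceEngine::DataConfig inData;
    inData.desc = inDesc_;
    InferenceEngine::DataConfig outData;
    outData.desc = outDesc_;

    InferenceEngine::LayerConfig layerConfig;
    layerConfig.inConfs.push_back(inData);
    layerConfig.outConfs.push_back(outData);
    layerConfig.dynBatchSupport = false;  // dynamic batching not supported here

    conf.push_back(layerConfig);
    return InferenceEngine::OK;
}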

virtual StatusCode init(LayerConfig& config, ResponseDesc \* resp) = 0

Initializes the implementation.

Parameters:

config - Selected supported configuration

resp - Response descriptor

Returns:

Status code
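A sketch of init for the hypothetical MyReLUImpl: it checks that the selected configuration matches what getSupportedConfigurations reported and writes an error message into the response descriptor otherwise:

#include <cstdio>  // snprintf

InferenceEngine::StatusCode MyReLUImpl::init(
    InferenceEngine::LayerConfig& config,
    InferenceEngine::ResponseDesc* resp) noexcept
{
    // Accept only the single-input, single-output configuration
    // reported by getSupportedConfigurations.
    if (config.inConfs.size() != 1 || config.outConfs.size() != 1) {
        if (resp != nullptr) {
            snprintf(resp->msg, sizeof(resp->msg),
                     "Unexpected number of edges in the selected configuration");
        }
        return InferenceEngine::GENERAL_ERROR;
    }
    return InferenceEngine::OK;
}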

virtual StatusCode execute(
    std::vector<Blob::Ptr>& inputs,
    std::vector<Blob::Ptr>& outputs,
    ResponseDesc* resp
    ) = 0

Executes the layer on the given input blobs and writes the results to the output blobs.

Parameters:

inputs - Vector of blobs with input memory

outputs - Vector of blobs with output memory

resp - Response descriptor

Returns:

Status code
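To complete the hypothetical MyReLUImpl sketch, a minimal execute that applies an element-wise ReLU, assuming a single FP32 input and an output blob of the same size:

InferenceEngine::StatusCode MyReLUImpl::execute(
    std::vector<InferenceEngine::Blob::Ptr>& inputs,
    std::vector<InferenceEngine::Blob::Ptr>& outputs,
    InferenceEngine::ResponseDesc* resp) noexcept
{
    if (inputs.empty() || outputs.empty())
        return InferenceEngine::GENERAL_ERROR;

    // Lock the blob memory and view it as FP32 data.
    const float* src = inputs[0]->cbuffer().as<const float*>();
    float* dst = outputs[0]->buffer().as<float*>();
    if (src == nullptr || dst == nullptr)
        return InferenceEngine::GENERAL_ERROR;

    // size() returns the number of elements in the blob.
    for (size_t i = 0; i < inputs[0]->size(); ++i)
        dst[i] = src[i] > 0.0f ? src[i] : 0.0f;

    return InferenceEngine::OK;
}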