This class provides an optimal thread-safe default implementation. It is recommended as a base class for an Executable Network implementation during plugin development. More...
#include <ie_executable_network_thread_safe_default.hpp>
Public Types | |
typedef std::shared_ptr< ExecutableNetworkThreadSafeDefault > | Ptr |
A shared pointer to an ExecutableNetworkThreadSafeDefault object. | |
Public Types inherited from InferenceEngine::ExecutableNetworkInternal | |
typedef std::shared_ptr< ExecutableNetworkInternal > | Ptr |
A shared pointer to an ExecutableNetworkInternal object. | |
Public Types inherited from InferenceEngine::IExecutableNetworkInternal | |
typedef std::shared_ptr< IExecutableNetworkInternal > | Ptr |
A shared pointer to an IExecutableNetworkInternal interface. | |
Public Member Functions | |
ExecutableNetworkThreadSafeDefault (const ITaskExecutor::Ptr &taskExecutor=std::make_shared< CPUStreamsExecutor >(IStreamsExecutor::Config{"Default"}), const ITaskExecutor::Ptr &callbackExecutor=std::make_shared< CPUStreamsExecutor >(IStreamsExecutor::Config{"Callback"})) | |
Constructs a new instance. More... | |
IInferRequest::Ptr | CreateInferRequest () override |
Provides a default implementation for creating an asynchronous inference request, so that plugins do not need to implement it themselves. More... | |
Public Member Functions inherited from InferenceEngine::ExecutableNetworkInternal | |
virtual void | setNetworkInputs (const InferenceEngine::InputsDataMap networkInputs) |
Sets the network inputs info. More... | |
virtual void | setNetworkOutputs (const InferenceEngine::OutputsDataMap networkOutputs) |
Sets the network outputs data. More... | |
ConstOutputsDataMap | GetOutputsInfo () const override |
Gets the Executable network output Data node information. The received info is stored in the given Data node. This method needs to be called to find output names, which are used later when filling the map of blobs passed to InferenceEngine::IInferencePlugin::Infer(). More... | |
ConstInputsDataMap | GetInputsInfo () const override |
Gets the Executable network input Data node information. The received info is stored in the given InputsDataMap object. This method needs to be called to find input names, which are used later when filling the map of blobs passed to InferenceEngine::IInferencePlugin::Infer(). More... | |
void | Export (const std::string &modelFileName) override |
Exports the currently created executable network so that it can be used later in the Import() main API. More... | |
void | Export (std::ostream &networkModel) override |
Exports the currently created executable network so that it can be used later in the Import() main API. More... | |
CNNNetwork | GetExecGraphInfo () override |
Get executable graph information from a device. More... | |
void | SetPointerToPlugin (IInferencePlugin::Ptr plugin) |
Sets the pointer to plugin internal. More... | |
std::vector< IVariableStateInternal::Ptr > | QueryState () override |
Queries memory states. More... | |
void | SetConfig (const std::map< std::string, Parameter > &config) override |
Sets configuration for current executable network. More... | |
Parameter | GetConfig (const std::string &name) const override |
Gets configuration dedicated to plugin behaviour. More... | |
Parameter | GetMetric (const std::string &name) const override |
Gets general runtime metric for dedicated hardware. More... | |
RemoteContext::Ptr | GetContext () const override |
Gets the remote context. More... | |
Public Member Functions inherited from InferenceEngine::IExecutableNetworkInternal | |
virtual | ~IExecutableNetworkInternal ()=default |
Destroys the object. | |
Protected Member Functions | |
template<typename AsyncInferRequestType = AsyncInferRequestThreadSafeDefault> | |
IInferRequest::Ptr | CreateAsyncInferRequestFromSync () |
Creates an asynchronous inference request from the synchronous request returned by CreateInferRequestImpl. More... | |
virtual InferRequestInternal::Ptr | CreateInferRequestImpl (InputsDataMap networkInputs, OutputsDataMap networkOutputs)=0 |
Creates a synchronous inference request object used to infer the network. More... | |
Protected Member Functions inherited from InferenceEngine::ExecutableNetworkInternal | |
virtual void | ExportImpl (std::ostream &networkModel) |
Exports an internal hardware-dependent model to a stream. More... | |
Protected Attributes | |
ITaskExecutor::Ptr | _taskExecutor = nullptr |
Holds a task executor. | |
ITaskExecutor::Ptr | _callbackExecutor = nullptr |
Holds a callback executor. | |
Protected Attributes inherited from InferenceEngine::ExecutableNetworkInternal | |
InferenceEngine::InputsDataMap | _networkInputs |
Holds information about network inputs.
InferenceEngine::OutputsDataMap | _networkOutputs |
Holds information about network outputs data. | |
IInferencePlugin::Ptr | _plugin |
A pointer to a IInferencePlugin interface. More... | |
This class provides an optimal thread-safe default implementation. It is recommended as a base class for an Executable Network implementation during plugin development.
inline explicit
Constructs a new instance.
[in] | taskExecutor | The task executor used |
[in] | callbackExecutor | The callback executor |
inline protected
Creates an asynchronous inference request from the synchronous request returned by CreateInferRequestImpl.
AsyncInferRequestType | A type of asynchronous inference request used as a wrapper for the synchronous request |
inline override virtual
Provides a default implementation for creating an asynchronous inference request, so that plugins do not need to implement it themselves.
Implements InferenceEngine::IExecutableNetworkInternal.
protected pure virtual
Creates a synchronous inference request object used to infer the network.
networkInputs | An input info map needed to create input blobs |
networkOutputs | An output data map needed to create output blobs |