class InferenceEngine::ExecutableNetworkThreadSafeDefault¶
Overview¶
This class provides a thread-safe default implementation of an executable network. It is recommended as a base class for Executable Network implementations during plugin development. More…
#include <ie_executable_network_thread_safe_default.hpp>
class ExecutableNetworkThreadSafeDefault: public IExecutableNetworkInternal
{
public:
// typedefs
typedef std::shared_ptr<ExecutableNetworkThreadSafeDefault> Ptr;
// construction
ExecutableNetworkThreadSafeDefault(
const ITaskExecutor::Ptr& taskExecutor = std::make_shared<CPUStreamsExecutor>(IStreamsExecutor::Config{ "Default"}),
const ITaskExecutor::Ptr& callbackExecutor = std::make_shared<CPUStreamsExecutor>(IStreamsExecutor::Config{ "Callback"})
);
// methods
IInferRequestInternal::Ptr CreateInferRequest();
};
Detailed Documentation¶
This class provides a thread-safe default implementation of an executable network. It is recommended as a base class for Executable Network implementations during plugin development.
Typedefs¶
typedef std::shared_ptr<ExecutableNetworkThreadSafeDefault> Ptr
A shared pointer to an ExecutableNetworkThreadSafeDefault object.
Construction¶
ExecutableNetworkThreadSafeDefault(
const ITaskExecutor::Ptr& taskExecutor = std::make_shared<CPUStreamsExecutor>(IStreamsExecutor::Config{ "Default"}),
const ITaskExecutor::Ptr& callbackExecutor = std::make_shared<CPUStreamsExecutor>(IStreamsExecutor::Config{ "Callback"})
)
Constructs a new instance.
Parameters:

taskExecutor - The task executor used

callbackExecutor - The callback executor
Methods¶
IInferRequestInternal::Ptr CreateInferRequest()
Provides a default implementation for creating an asynchronous inference request, so plugins do not need to implement it themselves.
Returns:
A shared_ptr to the created asynchronous inference request.