InferenceEngine::AsyncInferRequestThreadSafeDefault Class Reference

Base class with a default implementation of an asynchronous multi-staged inference request. To customize the pipeline stages, a derived class should change the content of the AsyncInferRequestThreadSafeDefault::_pipeline member container, which consists of pairs of tasks and the executors that will run them. Plugins are recommended to use this class as a base for their asynchronous inference request implementations. More...

#include <ie_infer_async_request_thread_safe_default.hpp>

Inheritance diagram for InferenceEngine::AsyncInferRequestThreadSafeDefault:
InferenceEngine::AsyncInferRequestThreadSafeInternal
InferenceEngine::IAsyncInferRequestInternal
InferenceEngine::IInferRequestInternal

Public Types

using Ptr = std::shared_ptr< AsyncInferRequestThreadSafeDefault >
 A shared pointer to AsyncInferRequestThreadSafeDefault.
 
- Public Types inherited from InferenceEngine::AsyncInferRequestThreadSafeInternal
typedef std::shared_ptr< AsyncInferRequestThreadSafeInternal > Ptr
 A shared pointer to an AsyncInferRequestThreadSafeInternal implementation.
 
- Public Types inherited from InferenceEngine::IAsyncInferRequestInternal
typedef std::shared_ptr< IAsyncInferRequestInternal > Ptr
 A shared pointer to an IAsyncInferRequestInternal interface.
 
- Public Types inherited from InferenceEngine::IInferRequestInternal
typedef std::shared_ptr< IInferRequestInternal > Ptr
 A shared pointer to an IInferRequestInternal interface.
 

Public Member Functions

 AsyncInferRequestThreadSafeDefault (const InferRequestInternal::Ptr &request, const ITaskExecutor::Ptr &taskExecutor, const ITaskExecutor::Ptr &callbackExecutor)
 Wraps an InferRequestInternal::Ptr implementation and constructs an AsyncInferRequestThreadSafeDefault::_pipeline where taskExecutor is used to run InferRequestInternal::Infer asynchronously. More...
 
 ~AsyncInferRequestThreadSafeDefault ()
 Destroys the object, stops AsyncInferRequestThreadSafeDefault::_pipeline and waits for it to finish.
 
StatusCode Wait (int64_t millis_timeout) override
 Waits for completion of all pipeline stages. If the pipeline raises an exception, it is rethrown here. More...
 
void SetPointerToPublicInterface (InferenceEngine::IInferRequest::Ptr ptr)
 Sets the pointer to public interface. More...
 
std::vector< InferenceEngine::IVariableStateInternal::Ptr > QueryState () override
 Queries memory states. More...
 
- Public Member Functions inherited from InferenceEngine::AsyncInferRequestThreadSafeInternal
 AsyncInferRequestThreadSafeInternal ()
 Constructs a new instance.
 
void StartAsync () override
 Start inference of specified input(s) in asynchronous mode. More...
 
void GetUserData (void **data) override
 Get arbitrary data for the request. More...
 
void SetUserData (void *data) override
 Set arbitrary data for the request. More...
 
void SetCompletionCallback (IInferRequest::CompletionCallback callback) override
 Set callback function which will be called on success or failure of asynchronous request. More...
 
void Infer () override
 Infers specified input(s) in synchronous mode. More...
 
void GetPerformanceCounts (std::map< std::string, InferenceEngineProfileInfo > &perfMap) const override
 Queries performance measures per layer to get feedback on which layers are the most time consuming. Note: not all plugins may provide meaningful data. More...
 
void SetBlob (const char *name, const Blob::Ptr &data) override
 Set input/output data to infer. More...
 
void SetBlob (const char *name, const Blob::Ptr &data, const PreProcessInfo &info) override
 Sets pre-process for input data. More...
 
void GetBlob (const char *name, Blob::Ptr &data) override
 Get input/output data to infer. More...
 
void GetPreProcess (const char *name, const PreProcessInfo **info) const override
 Gets pre-process for input data. More...
 
void SetBatch (int batch) override
 Sets a new batch size when dynamic batching is enabled in the executable network that created this request. More...
 
- Public Member Functions inherited from InferenceEngine::IAsyncInferRequestInternal
virtual ~IAsyncInferRequestInternal ()=default
 A virtual destructor.
 
- Public Member Functions inherited from InferenceEngine::IInferRequestInternal
virtual ~IInferRequestInternal ()=default
 Destroys the object.
 

Protected Types

using Stage = std::pair< ITaskExecutor::Ptr, Task >
 Each pipeline stage is a Task that is executed by a specified ITaskExecutor implementation.
 
using Pipeline = std::vector< Stage >
 Pipeline is a vector of stages.
 

Protected Member Functions

void RunFirstStage (const Pipeline::iterator itBeginStage, const Pipeline::iterator itEndStage, const ITaskExecutor::Ptr callbackExecutor={})
 Creates and runs the first stage task. If the destructor has not been called, adds a new std::future to the AsyncInferRequestThreadSafeDefault::_futures list that is used to wait for AsyncInferRequestThreadSafeDefault::_pipeline to finish. More...
 
void StopAndWait ()
 Forbids pipeline start and waits for all started pipelines to finish. More...
 
void InferUsingAsync ()
 Implements Infer() using StartAsync() and Wait()
 
void InferUsingSync ()
 Implements Infer() using synchronous pipeline and Wait()
 
void StartAsync_ThreadUnsafe () override
 Starts an asynchronous pipeline thread unsafe. More...
 
void Infer_ThreadUnsafe () override
 Performs inference of the pipeline in synchronous mode. More...
 
void GetPerformanceCounts_ThreadUnsafe (std::map< std::string, InferenceEngineProfileInfo > &perfMap) const override
 Gets the performance counts thread unsafe. More...
 
void SetBlob_ThreadUnsafe (const char *name, const Blob::Ptr &data) override
 Sets the blob thread unsafe. More...
 
void SetBlob_ThreadUnsafe (const char *name, const Blob::Ptr &data, const PreProcessInfo &info) override
 Sets the blob with preprocessing information thread unsafe. More...
 
void GetBlob_ThreadUnsafe (const char *name, Blob::Ptr &data) override
 Gets the input or output blob thread unsafe. More...
 
void GetPreProcess_ThreadUnsafe (const char *name, const PreProcessInfo **info) const override
 Gets the preprocessing information thread unsafe. More...
 
void SetCompletionCallback_ThreadUnsafe (IInferRequest::CompletionCallback callback) override
 Sets the completion callback thread unsafe. More...
 
void GetUserData_ThreadUnsafe (void **data) override
 Gets the user data thread unsafe. More...
 
void SetUserData_ThreadUnsafe (void *data) override
 Sets the user data thread unsafe. More...
 
void SetBatch_ThreadUnsafe (int batch) override
 Sets the dynamic batch thread unsafe. More...
 
- Protected Member Functions inherited from InferenceEngine::AsyncInferRequestThreadSafeInternal
virtual bool isRequestBusy () const
 Determines whether the request is busy. More...
 
virtual bool setIsRequestBusy (bool isBusy)
 Sets the request busy flag. More...
 
void CheckBusy () const
 Checks whether an inference request is busy and calls ThrowBusy if true
 

Protected Attributes

ITaskExecutor::Ptr _requestExecutor
 Used to run inference CPU tasks.
 
ITaskExecutor::Ptr _callbackExecutor
 Used to run the post-inference callback in the asynchronous pipeline.
 
ITaskExecutor::Ptr _syncCallbackExecutor
 Used to run the post-inference callback in the synchronous pipeline.
 
Pipeline _pipeline
 Pipeline variable that should be filled by the derived class.
 
Pipeline _syncPipeline
 Synchronous pipeline variable that should be filled by the derived class.
 

Additional Inherited Members

- Static Protected Member Functions inherited from InferenceEngine::AsyncInferRequestThreadSafeInternal
static void ThrowBusy ()
 Throws an exception that an inference request is busy.
 

Detailed Description

Base class with a default implementation of an asynchronous multi-staged inference request. To customize the pipeline stages, a derived class should change the content of the AsyncInferRequestThreadSafeDefault::_pipeline member container, which consists of pairs of tasks and the executors that will run them. Plugins are recommended to use this class as a base for their asynchronous inference request implementations.

Note
To synchronize the derived context with the pipeline stages, the derived class should call AsyncInferRequestThreadSafeDefault::StopAndWait() in its destructor.
Example
Here is an example of an asynchronous inference request implementation for some accelerator device. It uses five different executors to run the stages of the underlying synchronous inference request.
// Inherits from AsyncInferRequestThreadSafeDefault
class AcceleratorAsyncInferRequest : public AsyncInferRequestThreadSafeDefault {
    // Store the pointer to the synchronous request and five executors
    AcceleratorAsyncInferRequest(const AcceleratorSyncRequest::Ptr& syncRequest,
                                 const ITaskExecutor::Ptr& preprocessExecutor,
                                 const ITaskExecutor::Ptr& writeToDeviceExecutor,
                                 const ITaskExecutor::Ptr& runOnDeviceExecutor,
                                 const ITaskExecutor::Ptr& readFromDeviceExecutor,
                                 const ITaskExecutor::Ptr& postProcessExecutor) :
        AsyncInferRequestThreadSafeDefault(syncRequest, nullptr, nullptr),
        _accSyncRequest{syncRequest},
        _preprocessExecutor{preprocessExecutor},
        _writeToDeviceExecutor{writeToDeviceExecutor},
        _runOnDeviceExecutor{runOnDeviceExecutor},
        _readFromDeviceExecutor{readFromDeviceExecutor},
        _postProcessExecutor{postProcessExecutor}
    {
        // Five pipeline stages of the synchronous infer request are run by different executors
        _pipeline = {
            { _preprocessExecutor, [this] {
                _accSyncRequest->Preprocess();
            }},
            { _writeToDeviceExecutor, [this] {
                _accSyncRequest->WriteToDevice();
            }},
            { _runOnDeviceExecutor, [this] {
                _accSyncRequest->RunOnDevice();
            }},
            { _readFromDeviceExecutor, [this] {
                _accSyncRequest->ReadFromDevice();
            }},
            { _postProcessExecutor, [this] {
                _accSyncRequest->PostProcess();
            }},
        };
    }

    // As all stages use the _accSyncRequest member, we should wait for all stage tasks
    // to finish before the destructor destroys this member.
    ~AcceleratorAsyncInferRequest() {
        StopAndWait();
    }

    AcceleratorSyncRequest::Ptr _accSyncRequest;
    ITaskExecutor::Ptr _preprocessExecutor, _writeToDeviceExecutor, _runOnDeviceExecutor,
                       _readFromDeviceExecutor, _postProcessExecutor;
};

Constructor & Destructor Documentation

◆ AsyncInferRequestThreadSafeDefault()

InferenceEngine::AsyncInferRequestThreadSafeDefault::AsyncInferRequestThreadSafeDefault ( const InferRequestInternal::Ptr &  request,
const ITaskExecutor::Ptr &  taskExecutor,
const ITaskExecutor::Ptr &  callbackExecutor 
)
inline

Wraps an InferRequestInternal::Ptr implementation and constructs an AsyncInferRequestThreadSafeDefault::_pipeline where taskExecutor is used to run InferRequestInternal::Infer asynchronously.

Parameters
[in] request  The synchronous request
[in] taskExecutor  The task executor
[in] callbackExecutor  The callback executor

Member Function Documentation

◆ GetBlob_ThreadUnsafe()

void InferenceEngine::AsyncInferRequestThreadSafeDefault::GetBlob_ThreadUnsafe ( const char *  name,
Blob::Ptr &  data 
)
inlineoverrideprotectedvirtual

Gets the input or output blob thread unsafe.

Note
Used by AsyncInferRequestThreadSafeInternal::GetBlob, which ensures thread safety and then calls this method.
Parameters
[in] name  The name of the input / output data to get a blob for
data  The data

Implements InferenceEngine::AsyncInferRequestThreadSafeInternal.

◆ GetPerformanceCounts_ThreadUnsafe()

void InferenceEngine::AsyncInferRequestThreadSafeDefault::GetPerformanceCounts_ThreadUnsafe ( std::map< std::string, InferenceEngineProfileInfo > &  perfMap) const
inlineoverrideprotectedvirtual

Gets the performance counts thread unsafe.

Note
Used by AsyncInferRequestThreadSafeInternal::GetPerformanceCounts, which ensures thread safety and then calls this method.
Parameters
perfMap  The performance map

Implements InferenceEngine::AsyncInferRequestThreadSafeInternal.

◆ GetPreProcess_ThreadUnsafe()

void InferenceEngine::AsyncInferRequestThreadSafeDefault::GetPreProcess_ThreadUnsafe ( const char *  name,
const PreProcessInfo **  info 
) const
inlineoverrideprotectedvirtual

Gets the preprocessing information thread unsafe.

Note
Used by AsyncInferRequestThreadSafeInternal::GetPreProcess, which ensures thread safety and then calls this method.
Parameters
[in] name  The name of the input / output data to get preprocessing information for
info  The preprocessing information

Implements InferenceEngine::AsyncInferRequestThreadSafeInternal.

◆ GetUserData_ThreadUnsafe()

void InferenceEngine::AsyncInferRequestThreadSafeDefault::GetUserData_ThreadUnsafe ( void **  data)
inlineoverrideprotectedvirtual

Gets the user data thread unsafe.

Note
Used by AsyncInferRequestThreadSafeInternal::GetUserData, which ensures thread safety and then calls this method.
Parameters
data  The user data

Implements InferenceEngine::AsyncInferRequestThreadSafeInternal.

◆ Infer_ThreadUnsafe()

void InferenceEngine::AsyncInferRequestThreadSafeDefault::Infer_ThreadUnsafe ( )
inlineoverrideprotectedvirtual

Performs inference of the pipeline in synchronous mode.

Note
Used by AsyncInferRequestThreadSafeInternal::Infer, which ensures thread safety and then calls this method.

Implements InferenceEngine::AsyncInferRequestThreadSafeInternal.

◆ QueryState()

std::vector<InferenceEngine::IVariableStateInternal::Ptr> InferenceEngine::AsyncInferRequestThreadSafeDefault::QueryState ( )
inlineoverridevirtual

Queries memory states.

Returns
Returns memory states

Implements InferenceEngine::IInferRequestInternal.

◆ RunFirstStage()

void InferenceEngine::AsyncInferRequestThreadSafeDefault::RunFirstStage ( const Pipeline::iterator  itBeginStage,
const Pipeline::iterator  itEndStage,
const ITaskExecutor::Ptr  callbackExecutor = {} 
)
inlineprotected

Creates and runs the first stage task. If the destructor has not been called, adds a new std::future to the AsyncInferRequestThreadSafeDefault::_futures list that is used to wait for AsyncInferRequestThreadSafeDefault::_pipeline to finish.

Parameters
[in] itBeginStage  Iterator to the beginning of the pipeline
[in] itEndStage  Iterator to the end of the pipeline
[in] callbackExecutor  Final or error stage executor

◆ SetBatch_ThreadUnsafe()

void InferenceEngine::AsyncInferRequestThreadSafeDefault::SetBatch_ThreadUnsafe ( int  batch)
inlineoverrideprotectedvirtual

Sets the dynamic batch thread unsafe.

Note
Used by AsyncInferRequestThreadSafeInternal::SetBatch, which ensures thread safety and then calls this method.
Parameters
[in] batch  The dynamic batch value

Implements InferenceEngine::AsyncInferRequestThreadSafeInternal.

◆ SetBlob_ThreadUnsafe() [1/2]

void InferenceEngine::AsyncInferRequestThreadSafeDefault::SetBlob_ThreadUnsafe ( const char *  name,
const Blob::Ptr &  data 
)
inlineoverrideprotectedvirtual

Sets the blob thread unsafe.

Note
Used by AsyncInferRequestThreadSafeInternal::SetBlob, which ensures thread safety and then calls this method.
Parameters
[in] name  The name of the input / output data to set a blob to
[in] data  The blob to set

Implements InferenceEngine::AsyncInferRequestThreadSafeInternal.

◆ SetBlob_ThreadUnsafe() [2/2]

void InferenceEngine::AsyncInferRequestThreadSafeDefault::SetBlob_ThreadUnsafe ( const char *  name,
const Blob::Ptr &  data,
const PreProcessInfo &  info 
)
inlineoverrideprotectedvirtual

Sets the blob with preprocessing information thread unsafe.

Note
Used by AsyncInferRequestThreadSafeInternal::SetBlob, which ensures thread safety and then calls this method.
Parameters
[in] name  The name of the input / output data to set a blob to
[in] data  The blob to set
[in] info  The preprocessing information

Implements InferenceEngine::AsyncInferRequestThreadSafeInternal.

◆ SetCompletionCallback_ThreadUnsafe()

void InferenceEngine::AsyncInferRequestThreadSafeDefault::SetCompletionCallback_ThreadUnsafe ( IInferRequest::CompletionCallback  callback)
inlineoverrideprotectedvirtual

Sets the completion callback thread unsafe.

Note
Used by AsyncInferRequestThreadSafeInternal::SetCompletionCallback, which ensures thread safety and then calls this method.
Parameters
[in] callback  The callback to set

Implements InferenceEngine::AsyncInferRequestThreadSafeInternal.

◆ SetPointerToPublicInterface()

void InferenceEngine::AsyncInferRequestThreadSafeDefault::SetPointerToPublicInterface ( InferenceEngine::IInferRequest::Ptr  ptr)
inline

Sets the pointer to public interface.

Note
Needed to correctly handle ownership between objects.
Parameters
[in] ptr  A shared pointer to the public IInferRequest interface.

◆ SetUserData_ThreadUnsafe()

void InferenceEngine::AsyncInferRequestThreadSafeDefault::SetUserData_ThreadUnsafe ( void *  data)
inlineoverrideprotectedvirtual

Sets the user data thread unsafe.

Note
Used by AsyncInferRequestThreadSafeInternal::SetUserData, which ensures thread safety and then calls this method.
Parameters
data  The user data

Implements InferenceEngine::AsyncInferRequestThreadSafeInternal.

◆ StartAsync_ThreadUnsafe()

void InferenceEngine::AsyncInferRequestThreadSafeDefault::StartAsync_ThreadUnsafe ( )
inlineoverrideprotectedvirtual

Starts an asynchronous pipeline thread unsafe.

Note
Used by AsyncInferRequestThreadSafeInternal::StartAsync, which ensures thread safety and then calls this method.

Implements InferenceEngine::AsyncInferRequestThreadSafeInternal.

◆ StopAndWait()

void InferenceEngine::AsyncInferRequestThreadSafeDefault::StopAndWait ( )
inlineprotected

Forbids pipeline start and waits for all started pipelines to finish.

Note
Should be called in the derived class destructor to wait until the derived context captured by the pipeline tasks is no longer in use.

◆ Wait()

StatusCode InferenceEngine::AsyncInferRequestThreadSafeDefault::Wait ( int64_t  millis_timeout)
inlineoverridevirtual

Waits for completion of all pipeline stages. If the pipeline raises an exception, it is rethrown here.

Parameters
millis_timeout  A timeout in milliseconds to wait, or a special enum value of IInferRequest::WaitMode
Returns
A status code

Implements InferenceEngine::IAsyncInferRequestInternal.


The documentation for this class was generated from the following file: ie_infer_async_request_thread_safe_default.hpp