InferenceEngine::ITaskExecutor Interface Reference [abstract]

Interface for Task Executor. Inference Engine uses the InferenceEngine::ITaskExecutor interface to run all asynchronous internal tasks. Different implementations of task executors can be used for different purposes.

#include <ie_itask_executor.hpp>

Inheritance diagram for InferenceEngine::ITaskExecutor:
Derived classes: InferenceEngine::ImmediateExecutor, InferenceEngine::IStreamsExecutor, and InferenceEngine::CPUStreamsExecutor.

Public Types

using Ptr = std::shared_ptr< ITaskExecutor >
 

Public Member Functions

virtual ~ITaskExecutor ()=default
 Destroys the object.
 
virtual void run (Task task)=0
 Executes an InferenceEngine::Task inside the task executor context.
 
virtual void runAndWait (const std::vector< Task > &tasks)
 Executes all of the tasks and waits for their completion. The default runAndWait() implementation uses the run() pure virtual method and higher-level synchronization primitives from the STL. Each task is wrapped into a std::packaged_task, which returns a std::future. The std::packaged_task calls the task and signals the std::future when the task has finished or when an exception has been thrown from it. The std::future is then used to wait for task execution completion and to extract any task exception.
 

Detailed Description

Interface for Task Executor. Inference Engine uses the InferenceEngine::ITaskExecutor interface to run all asynchronous internal tasks. Different implementations of task executors can be used for different purposes.
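For illustration only (this sketch is not part of the original header documentation), two of the implementations shown in the inheritance list can serve different purposes: InferenceEngine::ImmediateExecutor runs a task synchronously in the calling thread, while InferenceEngine::CPUStreamsExecutor dispatches it to CPU worker threads. Default construction of both executors is assumed here, as in the example below for CPUStreamsExecutor:

// A sketch only: pick an executor implementation that matches the purpose of the task
InferenceEngine::ITaskExecutor::Ptr inlineExecutor =
    std::make_shared<InferenceEngine::ImmediateExecutor>();   // runs tasks in the current thread
InferenceEngine::ITaskExecutor::Ptr streamsExecutor =
    std::make_shared<InferenceEngine::CPUStreamsExecutor>();  // runs tasks on CPU streams
inlineExecutor->run([] { std::cout << "Executed synchronously" << std::endl; });
streamsExecutor->run([] { std::cout << "Executed on a worker thread" << std::endl; });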

Synchronization

It is the InferenceEngine::ITaskExecutor user's responsibility to wait for task execution completion. The C++11 standard way to wait for task completion is to use std::packaged_task or std::promise together with std::future. Here is an example of how to use std::promise to wait for task completion and process the task's exceptions:

// std::promise is a move-only object, so to satisfy the copyable-callable constraint we use std::shared_ptr
auto promise = std::make_shared<std::promise<void>>();
// Once the promise is created we can get the std::future used to wait for the result
auto future = promise->get_future();
// Rather simple task
InferenceEngine::Task task = [] { std::cout << "Some Output" << std::endl; };
// Create an executor
InferenceEngine::ITaskExecutor::Ptr taskExecutor = std::make_shared<InferenceEngine::CPUStreamsExecutor>();
if (taskExecutor == nullptr) {
    // ProcessError(e);
    return;
}
// We capture the task and the promise. When the task is executed in the task executor context
// we manually call the std::promise::set_value() method
taskExecutor->run([task, promise] {
    std::exception_ptr currentException;
    try {
        task();
    } catch(...) {
        // If an exception is thrown, store a pointer to the current exception
        currentException = std::current_exception();
    }
    if (nullptr == currentException) {
        promise->set_value(); // <-- If there are no problems, just call std::promise::set_value()
    } else {
        promise->set_exception(currentException); // <-- If there is an exception, forward it to the std::future object
    }
});
// To wait for task completion we call the std::future::wait method
future.wait(); // The current thread blocks here until std::promise::set_value()
               // or std::promise::set_exception() is called.
// If the future stores an exception, it is rethrown by the std::future::get method
try {
    future.get();
} catch(std::exception& /*e*/) {
    // ProcessError(e);
}
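
The same pattern can also be written with std::packaged_task, the other standard facility mentioned above. The following is a minimal sketch (not taken from the header) that reuses the taskExecutor from the example; because std::packaged_task is move-only and a submitted task must be copyable, it is held through a std::shared_ptr:

// std::packaged_task is move-only, so wrap it in std::shared_ptr to keep the submitted task copyable
auto packagedTask = std::make_shared<std::packaged_task<void()>>(
    [] { std::cout << "Some Output" << std::endl; });
// Obtain the future before submitting the task
auto future = packagedTask->get_future();
taskExecutor->run([packagedTask] {
    (*packagedTask)(); // stores either the result or the thrown exception in the shared state
});
try {
    future.get(); // blocks until the task has run and rethrows any exception it raised
} catch(std::exception& /*e*/) {
    // ProcessError(e);
}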

Member Typedef Documentation

◆ Ptr

A shared pointer to the ITaskExecutor interface.

Member Function Documentation

◆ run()

virtual void InferenceEngine::ITaskExecutor::run ( Task  task)
pure virtual

Execute InferenceEngine::Task inside task executor context.

Parameters
task: A task to start

Implemented in InferenceEngine::ImmediateExecutor, and InferenceEngine::CPUStreamsExecutor.
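
Note that run() only submits the task and by itself gives no completion guarantee; waiting is the caller's responsibility, as shown in the Synchronization section above. A minimal call looks like this:

// run() returns once the task has been handed to the executor;
// synchronize explicitly (e.g. via std::promise/std::future) if completion must be observed
taskExecutor->run([] { std::cout << "Executed inside the executor context" << std::endl; });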

◆ runAndWait()

virtual void InferenceEngine::ITaskExecutor::runAndWait ( const std::vector< Task > &  tasks)
virtual

Executes all of the tasks and waits for their completion. The default runAndWait() implementation uses the run() pure virtual method and higher-level synchronization primitives from the STL. Each task is wrapped into a std::packaged_task, which returns a std::future. The std::packaged_task calls the task and signals the std::future when the task has finished or when an exception has been thrown from it. The std::future is then used to wait for task execution completion and to extract any task exception.

Note
runAndWait() does not copy or capture tasks!
Parameters
tasks: A vector of tasks to execute
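
As an illustration of the behaviour described above, a default implementation along these lines is conceivable. This is a sketch only (the function name runAndWaitSketch is hypothetical, not the library code), assuming the usual <future>, <vector>, and <exception> headers: each task is referenced rather than copied through a std::packaged_task, submitted via run(), and the matching std::future objects are used to wait for completion and to rethrow a task exception.

void runAndWaitSketch(InferenceEngine::ITaskExecutor& executor,
                      const std::vector<InferenceEngine::Task>& tasks) {
    std::vector<std::packaged_task<void()>> packagedTasks;
    std::vector<std::future<void>> futures;
    for (std::size_t i = 0; i < tasks.size(); ++i) {
        // Reference tasks[i] instead of copying it; both vectors outlive the waiting loop below
        packagedTasks.emplace_back([&tasks, i] { tasks[i](); });
        futures.emplace_back(packagedTasks.back().get_future());
    }
    for (std::size_t i = 0; i < tasks.size(); ++i) {
        executor.run([&packagedTasks, i] { packagedTasks[i](); });
    }
    // Wait for every task before returning, remembering only the first exception
    std::exception_ptr firstException;
    for (auto& future : futures) {
        try {
            future.get(); // blocks until the corresponding task has finished; rethrows its exception
        } catch (...) {
            if (nullptr == firstException) firstException = std::current_exception();
        }
    }
    // All tasks have completed here, so the referenced vectors can safely go out of scope
    if (nullptr != firstException) std::rethrow_exception(firstException);
}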

The documentation for this interface was generated from the following file:
ie_itask_executor.hpp