// Copyright (C) 2018-2020 Intel Corporation
// SPDX-License-Identifier: Apache-2.0
//

/**
 * @file ie_itask_executor.hpp
 * @brief A header file for the Inference Engine Task Executor interface
 */

#pragma once

#include <functional>
#include <memory>
#include <vector>

#include "ie_api.h"

namespace InferenceEngine {

/**
 * @brief The Inference Engine Task Executor can use any copyable callable without parameters and return value as a task.
 * It is wrapped into an std::function object.
 * @ingroup ie_dev_api_threading
 */
using Task = std::function<void()>;

/**
 * @interface ITaskExecutor
 * @ingroup ie_dev_api_threading
 * @brief Interface for a task executor.
 * The Inference Engine uses the `InferenceEngine::ITaskExecutor` interface to run all of its asynchronous internal tasks.
 * Different task executor implementations can be used for different purposes:
 * - To improve cache locality of memory-bound CPU tasks, some executors can limit a task's affinity and its maximum concurrency.
 * - An executor with one worker thread can be used to serialize access to an acceleration device.
 * - An immediate task executor can be used to satisfy the `InferenceEngine::ITaskExecutor` interface restrictions while running tasks in the current thread.
 * @note Implementations must guarantee thread safety of all methods.
 * @section Synchronization
 * It is the `InferenceEngine::ITaskExecutor` user's responsibility to wait for task completion.
 * The standard C++11 way to wait for task completion is to use `std::packaged_task` or `std::promise` with `std::future`.
 * Here is an example of how to use `std::promise` to wait for task completion and process a task's exceptions:
 * @snippet example_itask_executor.cpp itask_executor:define_pipeline
 */
class INFERENCE_ENGINE_API_CLASS(ITaskExecutor) {
public:
    /**
     * @brief A shared pointer to the ITaskExecutor interface
     */
    using Ptr = std::shared_ptr<ITaskExecutor>;

    /**
     * @brief Destroys the object.
     */
    virtual ~ITaskExecutor() = default;

    /**
     * @brief Executes an InferenceEngine::Task inside the task executor context
     * @param task A task to start
     */
    virtual void run(Task task) = 0;

    /**
     * @brief Executes all of the tasks and waits for their completion.
     * The default runAndWait() implementation uses the run() pure virtual method
     * and higher-level synchronization primitives from the STL.
     * Each task is wrapped into an std::packaged_task, which returns an std::future.
     * std::packaged_task calls the task and signals through the std::future that the task has finished,
     * or that an exception was thrown from the task.
     * The std::future is then used to wait for task completion and to extract the task's exception.
     * @note runAndWait() does not copy or capture tasks!
     * @param tasks A vector of tasks to execute
     */
    virtual void runAndWait(const std::vector<Task>& tasks);
};

}  // namespace InferenceEngine