InferenceEngine::CPUStreamsExecutor Class Reference

CPU Streams executor implementation. The executor splits the CPU into groups of threads that can be pinned to cores or NUMA nodes. It uses custom threads to pull tasks from a single queue.

#include <ie_cpu_streams_executor.hpp>

Inheritance diagram for InferenceEngine::CPUStreamsExecutor: CPUStreamsExecutor derives from InferenceEngine::IStreamsExecutor, which in turn derives from InferenceEngine::ITaskExecutor.

Public Types

using Ptr = std::shared_ptr< CPUStreamsExecutor >
 A shared pointer to a CPUStreamsExecutor object.
 
- Public Types inherited from InferenceEngine::IStreamsExecutor
enum  ThreadBindingType : std::uint8_t {
  NONE ,
  CORES ,
  NUMA
}
 Defines the thread binding type.
 
using Ptr = std::shared_ptr< IStreamsExecutor >
 
- Public Types inherited from InferenceEngine::ITaskExecutor
using Ptr = std::shared_ptr< ITaskExecutor >
 

Public Member Functions

 CPUStreamsExecutor (const Config &config={})
 Constructor.
 
 ~CPUStreamsExecutor () override
 A class destructor.
 
void run (Task task) override
 Executes an InferenceEngine::Task inside the task executor context.
 
void Execute (Task task) override
 Executes the task in the current thread, using the streams executor configuration and constraints.
 
int GetStreamId () override
 Returns the index of the current stream.
 
int GetNumaNodeId () override
 Returns the ID of the current NUMA node.
 
- Public Member Functions inherited from InferenceEngine::IStreamsExecutor
 ~IStreamsExecutor () override
 A virtual destructor.
 
- Public Member Functions inherited from InferenceEngine::ITaskExecutor
virtual ~ITaskExecutor ()=default
 Destroys the object.
 
virtual void runAndWait (const std::vector< Task > &tasks)
 Executes all of the tasks and waits for their completion. The default runAndWait() implementation uses the pure virtual run() method and higher-level synchronization primitives from the STL. Each task is wrapped into an std::packaged_task, which returns an std::future. The std::packaged_task calls the task and signals the std::future that the task has finished or that an exception was thrown from the task. The std::future is then used to wait for task completion and to extract any task exception. A usage sketch appears in the Detailed Description below.
 

Detailed Description

CPU Streams executor implementation. The executor splits the CPU into groups of threads that can be pinned to cores or NUMA nodes. It uses custom threads to pull tasks from a single queue.
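
A minimal usage sketch (not part of the original header documentation). It assumes InferenceEngine::Task can be constructed from a lambda, i.e. that it is an std::function<void()>-like callable wrapper, and that the include path matches the one shown above:

    #include <ie_cpu_streams_executor.hpp>

    #include <iostream>
    #include <vector>

    int main() {
        // Executor with the default Config (see the constructor documentation below).
        InferenceEngine::CPUStreamsExecutor executor;

        // Build a batch of independent tasks and block until all of them finish;
        // runAndWait() is inherited from InferenceEngine::ITaskExecutor.
        std::vector<InferenceEngine::Task> tasks;
        for (int i = 0; i < 4; ++i) {
            tasks.emplace_back([i] { std::cout << "task " << i << " done\n"; });
        }
        executor.runAndWait(tasks);
        return 0;
    }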

Constructor & Destructor Documentation

◆ CPUStreamsExecutor()

InferenceEngine::CPUStreamsExecutor::CPUStreamsExecutor ( const Config & config = {} )
explicit

Constructor.

Parameters
config: Stream executor parameters
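
For illustration only (not from the header), a hedged sketch of constructing the executor with an explicit configuration. The positional Config arguments shown here (name, number of streams, threads per stream, thread binding type) follow the IStreamsExecutor::Config declaration in recent Inference Engine releases and may differ in your version; check ie_istreams_executor.hpp for the exact signature:

    #include <ie_cpu_streams_executor.hpp>

    int main() {
        using InferenceEngine::CPUStreamsExecutor;
        using InferenceEngine::IStreamsExecutor;

        // Two streams, one thread per stream, stream threads pinned to NUMA nodes.
        // (Argument order/meaning is an assumption; see the note above.)
        CPUStreamsExecutor executor{IStreamsExecutor::Config{
            "MyStreamsExecutor",                          // executor name
            2,                                            // number of streams
            1,                                            // threads per stream
            IStreamsExecutor::ThreadBindingType::NUMA}};  // thread binding type
        (void)executor;
        return 0;
    }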

Member Function Documentation

◆ Execute()

void InferenceEngine::CPUStreamsExecutor::Execute ( Task task )
override virtual

Executes the task in the current thread, using the streams executor configuration and constraints.

Parameters
task: A task to start

Implements InferenceEngine::IStreamsExecutor.
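
A hedged sketch (not from the header) of what the synchronous behaviour means in practice: the work happens in the calling thread, so its side effects are visible as soon as Execute() returns, with no extra synchronization:

    #include <ie_cpu_streams_executor.hpp>

    #include <iostream>

    int main() {
        InferenceEngine::CPUStreamsExecutor executor;

        int result = 0;
        // Executed synchronously by the current thread, under the executor's
        // configuration and constraints.
        executor.Execute([&result] { result = 42; });
        std::cout << "result = " << result << '\n';  // safe: the task has already run
        return 0;
    }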

◆ GetNumaNodeId()

int InferenceEngine::CPUStreamsExecutor::GetNumaNodeId ( )
override virtual

Returns the ID of the current NUMA node.

Returns
The ID of the current NUMA node. Throws an exception if not called from a stream thread.

Implements InferenceEngine::IStreamsExecutor.
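
A hedged sketch (not from the header): GetNumaNodeId() is only valid on a stream thread, i.e. inside a task submitted to the executor; the NUMA-local resource selection hinted at in the comment is hypothetical:

    #include <ie_cpu_streams_executor.hpp>

    #include <iostream>
    #include <vector>

    int main() {
        InferenceEngine::CPUStreamsExecutor executor;

        std::vector<InferenceEngine::Task> tasks;
        tasks.emplace_back([&executor] {
            // Valid here: this lambda runs on one of the executor's stream threads.
            const int numaNode = executor.GetNumaNodeId();
            // Hypothetical use: select NUMA-local scratch buffers or allocators
            // keyed by numaNode.
            std::cout << "running on NUMA node " << numaNode << '\n';
        });
        executor.runAndWait(tasks);
        return 0;
    }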

◆ GetStreamId()

int InferenceEngine::CPUStreamsExecutor::GetStreamId ( )
override virtual

Returns the index of the current stream.

Returns
The index of the current stream. Throws an exception if not called from a stream thread.

Implements InferenceEngine::IStreamsExecutor.
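
A hedged sketch (not from the header) of reading the stream index from inside tasks. Each task records which stream executed it into its own slot, so no extra synchronization is needed; the two-argument Config below (name, number of streams) relies on the same assumed constructor as in the CPUStreamsExecutor() example above:

    #include <ie_cpu_streams_executor.hpp>

    #include <iostream>
    #include <vector>

    int main() {
        using InferenceEngine::IStreamsExecutor;
        InferenceEngine::CPUStreamsExecutor executor{
            IStreamsExecutor::Config{"StreamIdDemo", 2}};  // assumed Config constructor

        std::vector<int> streamOfTask(8, -1);
        std::vector<InferenceEngine::Task> tasks;
        for (int i = 0; i < 8; ++i) {
            tasks.emplace_back([&executor, &streamOfTask, i] {
                // Valid here: the task runs on a stream thread.
                streamOfTask[i] = executor.GetStreamId();
            });
        }
        executor.runAndWait(tasks);

        for (int i = 0; i < 8; ++i)
            std::cout << "task " << i << " ran on stream " << streamOfTask[i] << '\n';
        return 0;
    }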

◆ run()

void InferenceEngine::CPUStreamsExecutor::run ( Task task )
override virtual

Executes an InferenceEngine::Task inside the task executor context.

Parameters
task: A task to start

Implements InferenceEngine::ITaskExecutor.
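
A hedged sketch (not from the header) of run()'s asynchronous nature: the task is queued and later executed by a stream thread, so the caller has to synchronize explicitly (here via std::promise/std::future) or use runAndWait() before relying on the result:

    #include <ie_cpu_streams_executor.hpp>

    #include <future>
    #include <iostream>

    int main() {
        InferenceEngine::CPUStreamsExecutor executor;

        std::promise<int> resultPromise;
        auto resultFuture = resultPromise.get_future();

        // run() returns immediately; the lambda is executed later by a stream thread.
        executor.run([&resultPromise] { resultPromise.set_value(6 * 7); });

        // Block until the task has produced its result.
        std::cout << "result = " << resultFuture.get() << '\n';
        return 0;
    }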


The documentation for this class was generated from the following file:
ie_cpu_streams_executor.hpp