An internal API for a synchronous inference request, to be implemented by a plugin and used in the InferRequestBase forwarding mechanism. More...
#include <ie_iinfer_request_internal.hpp>
Public Types | |
| using | Ptr = std::shared_ptr< IInferRequestInternal > |
| A shared pointer to a IInferRequestInternal interface. | |
| using | Callback = std::function< void(std::exception_ptr)> |
| Alias for callback type. | |
Public Member Functions | |
| IInferRequestInternal (const InputsDataMap &networkInputs, const OutputsDataMap &networkOutputs) | |
| Constructs a new instance. More... | |
| virtual void | Infer () |
| Infers specified input(s) in synchronous mode. More... | |
| virtual void | InferImpl () |
| The minimal infer function to be implemented by plugins. It infers specified input(s) in synchronous mode. More... | |
| virtual void | Cancel () |
| Cancel current inference request execution. | |
| virtual std::map< std::string, InferenceEngineProfileInfo > | GetPerformanceCounts () const |
| Queries performance measures per layer to get feedback of what is the most time consuming layer. Note: not all plugins may provide meaningful data. More... | |
| virtual void | SetBlob (const std::string &name, const Blob::Ptr &data) |
| Set input/output data to infer. More... | |
| virtual Blob::Ptr | GetBlob (const std::string &name) |
| Get input/output data to infer. More... | |
| virtual void | SetBlob (const std::string &name, const Blob::Ptr &data, const PreProcessInfo &info) |
| Sets pre-process for input data. More... | |
| virtual const PreProcessInfo & | GetPreProcess (const std::string &name) const |
| Gets pre-process for input data. More... | |
| virtual void | SetBatch (int batch) |
| Sets new batch size when dynamic batching is enabled in executable network that created this request. More... | |
| virtual std::vector< std::shared_ptr< IVariableStateInternal > > | QueryState () |
| Queries memory states. More... | |
| virtual void | StartAsync () |
| Start inference of specified input(s) in asynchronous mode. More... | |
| virtual void | StartAsyncImpl () |
| The minimal asynchronous inference function to be implemented by plugins. It starts inference of specified input(s) in asynchronous mode. More... | |
| virtual StatusCode | Wait (int64_t millis_timeout) |
| Waits for the result to become available. Blocks until specified millis_timeout has elapsed or the result becomes available, whichever comes first. More... | |
| virtual void | SetCallback (Callback callback) |
| Set callback function which will be called on success or failure of asynchronous request. More... | |
| void | checkBlob (const Blob::Ptr &blob, const std::string &name, bool isInput, const SizeVector &refDims={}) const |
Check that blob is valid. Throws an exception if it's not. More... | |
| virtual void | checkBlobs () |
| Checks that all blobs are valid. Throws an exception if any is not. | |
| void | setPointerToExecutableNetworkInternal (const std::shared_ptr< IExecutableNetworkInternal > &exeNetwork) |
| Sets the pointer to executable network internal. More... | |
| void * | GetUserData () noexcept |
| Gets the pointer to userData. Deprecated: the method will be removed. More... | |
| void | SetUserData (void *userData) noexcept |
| Sets the pointer to userData. Deprecated: the method will be removed. More... | |
Protected Member Functions | |
| ~IInferRequestInternal () | |
| Destroys the object. | |
| void | execDataPreprocessing (InferenceEngine::BlobMap &preprocessedBlobs, bool serial=false) |
| Checks and executes input data pre-processing if needed. More... | |
| bool | findInputAndOutputBlobByName (const std::string &name, InputInfo::Ptr &foundInput, DataPtr &foundOutput) const |
| Helper function to find input or output blob by name. More... | |
| bool | preProcessingRequired (const InputInfo::Ptr &info, const Blob::Ptr &userBlob, const Blob::Ptr &deviceBlob=nullptr) |
| Checks whether pre-processing step is required for a given input. More... | |
| void | addInputPreProcessingFor (const std::string &name, Blob::Ptr const &from, const Blob::Ptr &to) |
| Adds a pre-processing step for the input name, converting the blob from into the blob to. | |
Protected Attributes | |
| InferenceEngine::InputsDataMap | _networkInputs |
| Holds information about network inputs info. | |
| InferenceEngine::OutputsDataMap | _networkOutputs |
| Holds information about network outputs data. | |
| InferenceEngine::BlobMap | _inputs |
| A map of user passed blobs for network inputs. | |
| InferenceEngine::BlobMap | _deviceInputs |
| A map of actual network inputs, in plugin specific format. | |
| InferenceEngine::BlobMap | _outputs |
| A map of user passed blobs for network outputs. | |
| std::map< std::string, PreProcessDataPtr > | _preProcData |
| A map of pre-process data per input. | |
| int | m_curBatch = -1 |
| Current batch value used in dynamic batching. | |
| std::shared_ptr< IExecutableNetworkInternal > | _exeNetwork |
| A shared pointer to the IExecutableNetworkInternal object that created this request. More... | |
| Callback | _callback |
| A callback. | |
An internal API for a synchronous inference request, to be implemented by a plugin and used in the InferRequestBase forwarding mechanism.
| InferenceEngine::IInferRequestInternal::IInferRequestInternal | ( | const InputsDataMap & | networkInputs, |
| const OutputsDataMap & | networkOutputs | ||
| ) |
Constructs a new instance.
| [in] | networkInputs | The network inputs info |
| [in] | networkOutputs | The network outputs data |
| void InferenceEngine::IInferRequestInternal::checkBlob | ( | const Blob::Ptr & | blob, |
| const std::string & | name, | ||
| bool | isInput, | ||
| const SizeVector & | refDims = {} |
||
| ) | const |
Check that blob is valid. Throws an exception if it's not.
| [in] | blob | The blob to check |
| [in] | name | The name of the input or output, depending on whether the blob is an input or an output |
| [in] | isInput | Indicates if is input |
| [in] | refDims | The reference dims, empty if not specified |
|
protected |
Checks and executes input data pre-processing if needed.
| preprocessedBlobs | Input blobs to perform pre-processing on |
| serial | Whether to execute pre-processing serially in a single thread rather than in parallel |
|
protected |
Helper function to find input or output blob by name.
| name | A name of input or output blob. |
| foundInput | A pointer to input information if found. |
| foundOutput | A pointer to output DataPtr if found. |
True - if the loaded network has an input with the provided name, false - if it has an output with the provided name
| [not_found] | exception if there is no input or output layer with the given name |
|
virtual |
Get input/output data to infer.
| name | - a name of input or output blob. |
Returns a reference to the input or output blob. The type of Blob must correspond to the network input precision and size.
Implemented in InferenceEngine::AsyncInferRequestThreadSafeDefault.
|
virtual |
Queries performance measures per layer to get feedback of what is the most time consuming layer. Note: not all plugins may provide meaningful data.
Implemented in InferenceEngine::AsyncInferRequestThreadSafeDefault.
|
virtual |
Gets pre-process for input data.
| name | Name of input blob. |
Returns a constant reference to the pre-process info for the named input.
Implemented in InferenceEngine::AsyncInferRequestThreadSafeDefault.
|
noexcept |
Gets the pointer to userData.
|
virtual |
Infers specified input(s) in synchronous mode.
Implemented in InferenceEngine::AsyncInferRequestThreadSafeDefault.
|
virtual |
The minimal infer function to be implemented by plugins. It infers specified input(s) in synchronous mode.
|
protected |
|
virtual |
Queries memory states.
Implemented in InferenceEngine::AsyncInferRequestThreadSafeDefault.
|
virtual |
Sets new batch size when dynamic batching is enabled in executable network that created this request.
| batch | - new batch size to be used by all the following inference calls for this request. |
Implemented in InferenceEngine::AsyncInferRequestThreadSafeDefault.
|
virtual |
Set input/output data to infer.
| name | - a name of input or output blob. |
| data | - a reference to input or output blob. The type of Blob must correspond to the network input precision and size. |
Implemented in InferenceEngine::AsyncInferRequestThreadSafeDefault.
|
virtual |
Sets pre-process for input data.
| name | Name of input blob. |
| data | - a reference to input or output blob. The type of Blob must correspond to the network input precision and size. |
| info | Preprocess info for blob. |
Implemented in InferenceEngine::AsyncInferRequestThreadSafeDefault.
|
virtual |
Set callback function which will be called on success or failure of asynchronous request.
| callback | - a function of type Callback, called on completion of the asynchronous request; it receives a std::exception_ptr that is null on success and holds the exception on failure. |
Implemented in InferenceEngine::AsyncInferRequestThreadSafeDefault.
| void InferenceEngine::IInferRequestInternal::setPointerToExecutableNetworkInternal | ( | const std::shared_ptr< IExecutableNetworkInternal > & | exeNetwork | ) |
Sets the pointer to executable network internal.
| [in] | exeNetwork | The executable network |
|
noexcept |
Sets the pointer to userData.
| [in] | userData | Pointer to user data |
|
virtual |
Start inference of specified input(s) in asynchronous mode.
Implemented in InferenceEngine::AsyncInferRequestThreadSafeDefault.
|
virtual |
The minimal asynchronous inference function to be implemented by plugins. It starts inference of specified input(s) in asynchronous mode.
|
virtual |
Waits for the result to become available. Blocks until specified millis_timeout has elapsed or the result becomes available, whichever comes first.
| millis_timeout | - maximum duration in milliseconds to block for |
Implemented in InferenceEngine::AsyncInferRequestThreadSafeDefault.
|
protected |
A shared pointer to the IExecutableNetworkInternal object that created this request.