Namespaces | Data Structures | Typedefs | Enumerations | Functions | Variables
InferenceEngine Namespace Reference

Inference Engine API. More...

Namespaces

  Builder
  Neural network builder API.
 
  CLDNNConfigParams
  GPU plugin configuration.
 
  DLIAConfigParams
  DLIA plugin configuration.
 
  DliaMetrics
  DLIA plugin metrics.
 
  GNAConfigParams
  GNA plugin configuration.
 
  HeteroConfigParams
  Heterogeneous plugin configuration.
 
  Metrics
  Metrics
 
  MultiDeviceConfigParams
  Multi Device plugin configuration.
 
  PluginConfigParams
  Generic plugin configuration.
 
  VPUConfigParams
  VPU plugin configuration.
 

Data Structures

class   BatchNormalizationLayer
  This class represents a Batch Normalization Layer. More...
 
class   BinaryConvolutionLayer
  This class represents a standard binary convolution layer. More...
 
class   Blob
  This class represents a universal container in the Inference Engine. More...
 
class   BlockingDesc
  This class describes blocking layouts. More...
 
class   BroadcastLayer
  This class represents a standard Broadcast layer. Broadcast modifies input tensor dimensions according to its parameters. More...
 
class   ClampLayer
  This class represents a Clamp activation layer. It clamps all tensor elements into the range [min_value, max_value]. More...
 
class   CNNLayer
  This is a base abstraction Layer - all DNN Layers inherit from this class. More...
 
class   CNNNetReader
  This is a wrapper class used to build and parse a network from the given IR. All the methods here can throw exceptions. More...
 
class   CNNNetwork
  This class contains all the information about the Neural Network and the related binary information. More...
 
class   CompoundBlob
  This class represents a blob that contains other blobs. More...
 
class   ConcatLayer
  This class represents a concatenation layer. It takes several data elements as input and merges them into one using the supplied axis. More...
 
class   Connection
  This class is the main object to describe the Inference Engine connection. More...
 
class   ConvolutionLayer
  This class represents a standard 3D Convolution Layer. More...
 
class   Core
  This class represents Inference Engine Core entity. It can throw exceptions safely for the application, where it is properly handled. More...
 
class   CropLayer
  This class represents a standard crop layer. More...
 
class   Data
  This class represents the main Data representation node. More...
 
struct   DataConfig
  This structure describes data configuration. More...
 
class   DeconvolutionLayer
  This class represents a standard deconvolution layer. More...
 
class   DeformableConvolutionLayer
  This class represents a standard deformable convolution layer. More...
 
class   DepthToSpaceLayer
  This class represents a standard Depth To Space layer. Depth To Space picks data from the input tensor according to its parameters. More...
 
class   EltwiseLayer
  This class represents an element-wise operation layer. More...
 
class   ExecutableNetwork
  This class is a wrapper over IExecutableNetwork. More...
 
class   Extension
  This class is a C++ helper to work with objects created using extensions. More...
 
class   FillLayer
  This class represents a standard Fill layer. Fill modifies the input tensor according to its parameters. More...
 
class   FullyConnectedLayer
  This class represents a fully connected layer. More...
 
class   GatherLayer
  This class represents a standard Gather layer. Gather takes slices from the Dictionary according to the Indexes. More...
 
class   GemmLayer
  This class represents a general matrix multiplication operation layer. The formula is: dst := alpha*src1*src2 + beta*src3. More...
 
class   GeneralError
  This class represents StatusCode::GENERIC_ERROR exception. More...
 
class   GRNLayer
  This class represents standard GRN Layer. More...
 
class   GRUCell
  GRU Cell layer. More...
 
class   IAllocator
  Allocator concept used for memory management; it is used as part of the Blob. More...
 
class   ICNNNetReader
  This class is the main interface to build and parse a network from a given IR. More...
 
class   ICNNNetwork
  This is the main interface to describe the NN topology. More...
 
class   ICNNNetworkStats
  This is the interface to describe the NN topology scoring statistics. More...
 
class   IErrorListener
  This class represents a custom error listener. Plugin consumers can provide it via InferenceEngine::SetLogCallback. More...
 
class   IExecutableNetwork
  This is an interface of an executable network. More...
 
class   IExtension
  This class is the main extension interface. More...
 
class   IInferRequest
  This is an interface of asynchronous infer request. More...
 
class   ILayer
  This class is the main interface to describe the Inference Engine layer. All methods here are constant and do not throw exceptions. More...
 
class   ILayerExecImpl
  This class provides interface for the implementation with the custom execution code. More...
 
class   ILayerImpl
  This class provides interface for extension implementations. More...
 
class   ILayerImplFactory
  This class provides interface for extension factories. More...
 
class   IMemoryState
  Manages data for reset operations. More...
 
class   INetwork
  This class is the main interface to describe the Inference Engine network. More...
 
class   INetwotkIterator
 
class   InferenceEngine
  This class is a C++ API wrapper for IInferencePlugin. It can throw exceptions safely for the application, where it is properly handled. More...
 
struct   InferenceEngineProfileInfo
  Represents basic inference profiling information per layer. If the layer is executed using tiling, the sum time per each tile is indicated as the total execution time. Due to parallel execution, the total execution time for all layers might be greater than the total inference time. More...
 
class   InferNotStarted
  This class represents StatusCode::INFER_NOT_STARTED exception. More...
 
class   InferRequest
  This class is a wrapper of IInferRequest to provide setters/getters of input/output which operates with BlobMaps. It can throw exceptions safely for the application, where it is properly handled. More...
 
class   InputInfo
  This class contains information about each input of the network. More...
 
class   IShapeInferExtension
  This class is the reader extension interface to provide implementation for shape propagation. More...
 
class   IShapeInferImpl
  This class provides interface for the implementation with the custom execution code. More...
 
struct   LayerConfig
  This structure describes Layer configuration. More...
 
struct   LayerParams
  This structure holds internal common Layer parameters used as parsing arguments. More...
 
class   LockedMemory
  This class represents locked memory for read/write access. More...
 
class   LockedMemory< const T >
  This class is for read-only segments. More...
 
class   LockedMemory< void >
  This class is for <void*> data and allows casting to any pointers. More...
 
class   LSTMCell
  LSTM Cell layer. More...
 
class   MathLayer
  This class represents a standard Math layer. Math modifies the input tensor according to its parameters. More...
 
class   MemoryBlob
  This class implements a container object that represents a tensor in memory (host and remote/accelerated) More...
 
class   MemoryState
  C++ exception-based error reporting wrapper for the API class IMemoryState. More...
 
class   MVNLayer
  This class represents standard MVN Layer. More...
 
class   NetworkNodeStats
  This class implements a container which stores statistics for a layer. More...
 
class   NetworkNotLoaded
  This class represents StatusCode::NETWORK_NOT_LOADED exception. More...
 
class   NonMaxSuppressionLayer
  This class represents a standard NonMaxSuppression layer. More...
 
class   NormLayer
  This class represents a Linear Response Normalization (LRN) Layer. More...
 
class   NotAllocated
  This class represents StatusCode::NOT_ALLOCATED exception. More...
 
class   NotFound
  This class represents StatusCode::NOT_FOUND exception. More...
 
class   NotImplemented
  This class represents StatusCode::NOT_IMPLEMENTED exception. More...
 
class   NV12Blob
  Represents a blob that contains two planes (Y and UV) in NV12 color format. More...
 
class   OneHotLayer
  This class represents a OneHot layer. It converts the input into a OneHot representation. More...
 
class   OutOfBounds
  This class represents StatusCode::OUT_OF_BOUNDS exception. More...
 
class   PadLayer
  This class represents a standard Pad layer. It adds paddings to the input tensor. More...
 
class   Parameter
  This class represents an object to work with different parameters. More...
 
class   ParameterMismatch
  This class represents StatusCode::PARAMETER_MISMATCH exception. More...
 
class   PoolingLayer
  This class represents a standard pooling layer. More...
 
class   PowerLayer
  This class represents a standard Power Layer Formula is: output = (offset + scale * input) ^ power. More...
 
class   Precision
  This class holds precision value and provides precision related operations. More...
 
struct   PrecisionTrait
  Particular precision traits. More...
 
class   PReLULayer
  This class represents a Parametric ReLU (PReLU) activation layer. More...
 
struct   PreProcessChannel
  This structure stores info about pre-processing of network inputs (scale, mean image, ...) More...
 
class   PreProcessInfo
  This class stores pre-process information for the input. More...
 
struct   PrimitiveInfo
  Structure with information about Primitive. More...
 
class   PropertyVector
 
class   QuantizeLayer
  This class represents a quantization operation layer: element-wise linear quantization of floating point input values into a discrete set of floating point values. More...
 
struct   QueryNetworkResult
  Response structure encapsulating information about supported layers. More...
 
class   RangeLayer
  This class represents a standard Range layer. Range modifies the input tensor dimensions according to its parameters. More...
 
class   ReduceLayer
  This class represents the standard Reduce layers. Reduce modifies the input tensor according to its parameters. More...
 
class   ReLU6Layer
  This class represents a ReLU6 activation layer. It clamps all tensor elements into the range [0, 6.0]. More...
 
class   ReLULayer
  This class represents a Rectified Linear activation layer. More...
 
class   RequestBusy
  This class represents StatusCode::REQUEST_BUSY exception. More...
 
class   ReshapeLayer
  This class represents a standard reshape layer. More...
 
struct   ResponseDesc
  Represents detailed information for an error. More...
 
class   ResultNotReady
  This class represents StatusCode::RESULT_NOT_READY exception. More...
 
class   ReverseSequenceLayer
  This class represents a standard Reverse Sequence layer. Reverse Sequence modifies the input tensor according to its parameters. More...
 
class   RNNCell
  RNN Cell layer. More...
 
class   RNNCellBase
  Base class for recurrent cell layers. More...
 
class   RNNSequenceLayer
  Sequence of recurrent cells. More...
 
struct   ROI
  This structure describes ROI data. More...
 
class   ScaleShiftLayer
  This class represents a Layer which performs Scale and Shift. More...
 
class   ScatterLayer
  This class represents a standard Scatter layer. More...
 
class   SelectLayer
  This class represents a Select layer. The Select layer takes elements from the second (“then”) or the third (“else”) input based on a condition mask (“cond”) provided in the first input. The “cond” tensor is broadcast to the “then” and “else” tensors. The output tensor shape is equal to the broadcast shape of “cond”, “then”, and “else”. More...
 
class   ShapeInferExtension
  This class is a C++ helper to work with objects created using extensions. More...
 
class   ShuffleChannelsLayer
  This class represents a standard Shuffle Channels layer. Shuffle Channels picks data from the input tensor according to its parameters. More...
 
class   SoftMaxLayer
  This class represents standard softmax Layer. More...
 
class   SpaceToDepthLayer
  This class represents a standard Space To Depth layer. Space To Depth picks data from the input tensor according to its parameters. More...
 
class   SparseFillEmptyRowsLayer
  This class represents a SparseFillEmptyRows layer. SparseFillEmptyRows fills empty rows in a sparse tensor. More...
 
class   SparseSegmentReduceLayer
  This class represents the SparseSegmentMean(SqrtN, Sum) layers. A SparseSegmentMean(SqrtN, Sum) layer reduces data along sparse segments of a tensor. More...
 
class   SplitLayer
  This class represents a layer that evenly splits the input into the supplied outputs. More...
 
class   StridedSliceLayer
  This class represents a standard Strided Slice layer. Strided Slice picks data from the input tensor according to its parameters. More...
 
class   TBlob
  Represents real host memory allocated for a Tensor/Blob per C type. More...
 
class   TensorDesc
  This class defines Tensor description. More...
 
struct   TensorInfo
  This structure describes tensor information. More...
 
class   TensorIterator
  This class represents TensorIterator layer. More...
 
class   TileLayer
  This class represents a standard Tile Layer. More...
 
class   TopKLayer
  This class represents a standard TopK layer. TopK picks the top K values from the input tensor according to its parameters. More...
 
class   Unexpected
  This class represents StatusCode::UNEXPECTED exception. More...
 
class   UniqueLayer
  This class represents a Unique layer. The Unique operation searches for unique elements in a 1-D input. More...
 
union   UserValue
  This union holds the user values to enable binding of data per graph node. More...
 
struct   Version
  Represents version information that describes plugins and the inference engine runtime library. More...
 
class   WeightableLayer
  This class represents a layer with Weights and/or Biases (e.g. Convolution/Fully Connected, etc.) More...
 

Typedefs

using  BlobMap = std::map< std::string, Blob::Ptr >
  This is a convenient type for working with a map containing pairs(string, pointer to a Blob instance).
 
using  SizeVector = std::vector< size_t >
  Represents tensor size. The order is opposite to the order in Caffe*: (w,h,n,b) where the most frequently changing element in memory is first.
 
using  CNNLayerPtr = std::shared_ptr< CNNLayer >
  A smart pointer to the CNNLayer.
 
using  CNNLayerWeakPtr = std::weak_ptr< CNNLayer >
  A smart weak pointer to the CNNLayer.
 
using  DataPtr = std::shared_ptr< Data >
  Smart pointer to Data.
 
using  CDataPtr = std::shared_ptr< const Data >
  Smart pointer to constant Data.
 
using  DataWeakPtr = std::weak_ptr< Data >
  Smart weak pointer to Data.
 
using  OutputsDataMap = std::map< std::string, DataPtr >
  A collection that contains string as key, and Data smart pointer as value.
 
using  NetworkNodeStatsPtr = std::shared_ptr< NetworkNodeStats >
  A shared pointer to the NetworkNodeStats object.
 
using  NetworkNodeStatsWeakPtr = std::weak_ptr< NetworkNodeStats >
  A smart pointer to the NetworkNodeStats object.
 
using  NetworkStatsMap = std::map< std::string, NetworkNodeStatsPtr >
  A map of pairs: name of a layer and related statistics.
 
using  ConstOutputsDataMap = std::map< std::string, CDataPtr >
  A collection that contains string as key, and const Data smart pointer as value.
 
using  IExtensionPtr = std::shared_ptr< IExtension >
 
using  IShapeInferExtensionPtr = std::shared_ptr< IShapeInferExtension >
 
using  InputsDataMap = std::map< std::string, InputInfo::Ptr >
  A collection that contains string as key, and InputInfo smart pointer as value.
 
using  ConstInputsDataMap = std::map< std::string, InputInfo::CPtr >
  A collection that contains string as key, and const InputInfo smart pointer as value.
 
using  GenericLayer = class CNNLayer
  Alias for CNNLayer object.
 
using  idx_t = size_t
  A type of network objects indexes. More...
 
using  InferenceEnginePluginPtr = InferenceEngine::details::SOPointer< IInferencePlugin >
  A C++ helper to work with objects created by the plugin. Implements different interfaces.
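
A minimal sketch of how these typedefs are typically combined (this assumes a CNNNetwork object named network and an InferRequest named request already exist; getInputsInfo, GetBlob and getTensorDesc come from the CNNNetwork, InferRequest and InputInfo classes documented on this page):

    // Map every input name (InputsDataMap) to the blob bound to it in an infer request (BlobMap).
    InferenceEngine::InputsDataMap inputs = network.getInputsInfo();
    InferenceEngine::BlobMap blobs;
    for (const auto &item : inputs) {
        blobs[item.first] = request.GetBlob(item.first);
    }
    // SizeVector holds the dimensions of the first input tensor.
    InferenceEngine::SizeVector dims = inputs.begin()->second->getTensorDesc().getDims();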
 

Enumerations

enum   LockOp { LOCK_FOR_READ = 0, LOCK_FOR_WRITE }
  Allocator handle mapping type.
 
enum   Layout : uint8_t {
  ANY = 0, NCHW = 1, NHWC = 2, NCDHW = 3,
  NDHWC = 4, OIHW = 64, GOIHW = 65, OIDHW = 66,
  GOIDHW = 67, SCALAR = 95, C = 96, CHW = 128,
  HW = 192, NC = 193, CN = 194, BLOCKED = 200
}
  Layouts that the inference engine supports.
 
enum   ColorFormat : uint32_t {
  RAW = 0u, RGB, BGR, RGBX,
  BGRX, NV12
}
  Extra information about input color format for preprocessing. More...
 
enum   StatusCode : int {
  OK = 0, GENERAL_ERROR = -1, NOT_IMPLEMENTED = -2, NETWORK_NOT_LOADED = -3,
  PARAMETER_MISMATCH = -4, NOT_FOUND = -5, OUT_OF_BOUNDS = -6, UNEXPECTED = -7,
  REQUEST_BUSY = -8, RESULT_NOT_READY = -9, NOT_ALLOCATED = -10, INFER_NOT_STARTED = -11,
  NETWORK_NOT_READ = -12
}
  This enum contains codes for all possible return values of the interface functions.
 
enum   eDIMS_AXIS : uint8_t { X_AXIS = 0, Y_AXIS, Z_AXIS }
 
enum   MeanVariant { MEAN_IMAGE, MEAN_VALUE, NONE }
  Defines available types of mean. More...
 
enum   ResizeAlgorithm { NO_RESIZE = 0, RESIZE_BILINEAR, RESIZE_AREA }
  Represents the list of supported resize algorithms.
 

Functions

template<class T >
std::shared_ptr< T >  make_so_pointer (const file_name_t &name)=delete
  Creates a special shared_pointer wrapper for the given type from a specific shared module. More...
 
InferenceEngine::IAllocator *  CreateDefaultAllocator () noexcept
  Creates the default implementation of the Inference Engine allocator per plugin. More...
 
template<typename T , typename std::enable_if<!std::is_pointer< T >::value &&!std::is_reference< T >::value, int >::type = 0, typename std::enable_if< std::is_base_of< Blob, T >::value, int >::type = 0>
std::shared_ptr< T >  as (const Blob::Ptr &blob) noexcept
  Helper cast function to work with shared Blob objects. More...
 
template<typename T , typename std::enable_if<!std::is_pointer< T >::value &&!std::is_reference< T >::value, int >::type = 0, typename std::enable_if< std::is_base_of< Blob, T >::value, int >::type = 0>
std::shared_ptr< const T >  as (const Blob::CPtr &blob) noexcept
  Helper cast function to work with shared Blob objects. More...
 
template<typename Type >
InferenceEngine::TBlob< Type >::Ptr  make_shared_blob (const TensorDesc &tensorDesc)
  Creates a blob with the given tensor descriptor. More...
 
template<typename Type >
InferenceEngine::TBlob< Type >::Ptr  make_shared_blob (const TensorDesc &tensorDesc, Type *ptr, size_t size=0)
  Creates a blob with the given tensor descriptor from the pointer to the pre-allocated memory. More...
 
template<typename Type >
InferenceEngine::TBlob< Type >::Ptr  make_shared_blob (const TensorDesc &tensorDesc, const std::shared_ptr< InferenceEngine::IAllocator > &alloc)
  Creates a blob with the given tensor descriptor and allocator. More...
 
template<typename TypeTo >
InferenceEngine::TBlob< TypeTo >::Ptr  make_shared_blob (const TBlob< TypeTo > &arg)
  Creates a copy of given TBlob instance. More...
 
template<typename T , typename... Args, typename std::enable_if< std::is_base_of< Blob, T >::value, int >::type = 0>
std::shared_ptr< T >  make_shared_blob (Args &&... args)
  Creates a Blob object of the specified type. More...
 
Blob::Ptr  make_shared_blob (const Blob::Ptr &inputBlob, const ROI &roi)
  Creates a blob describing given ROI object based on the given blob with pre-allocated memory. More...
 
std::ostream &  operator<< (std::ostream &out, const Layout &p)
 
std::ostream &  operator<< (std::ostream &out, const ColorFormat &fmt)
 
class   Context
  This class implements the Context object, which stores registered extensions and shape inference implementations. Deprecated: Use ngraph API instead. More...
 
template<>
std::shared_ptr< IShapeInferExtension >  make_so_pointer (const file_name_t &name)
  Creates a special shared_pointer wrapper for the given type from a specific shared module. More...
 
template<>
std::shared_ptr< IExtension >  make_so_pointer (const file_name_t &name)
  Creates a special shared_pointer wrapper for the given type from a specific shared module. More...
 
ICNNNetReader *  CreateCNNNetReader () noexcept
  Creates a CNNNetReader instance. More...
 
StatusCode  CreateExtension (IExtension *&ext, ResponseDesc *resp) noexcept
  Creates the default instance of the extension. More...
 
StatusCode  CreateShapeInferExtension (IShapeInferExtension *&ext, ResponseDesc *resp) noexcept
  Creates the default instance of the shape infer extension. More...
 
class   PortInfo
  This class contains a pair of layerId and port index. Deprecated: Use ngraph API instead; the NN Builder API will be removed in R2. More...
 
class   PortData
  This class describes port data. Deprecated: Use ngraph API instead; the NN Builder API will be removed in R2.
 
class   Port
  This class is the main object to describe the Inference Engine port. Deprecated: Use ngraph API instead; the NN Builder API will be removed in R2. More...
 
template<typename F >
void  parallel_nt (int nthr, const F &func)
 
template<typename F >
void  parallel_nt_static (int nthr, const F &func)
 
template<typename I , typename F >
void  parallel_sort (I begin, I end, const F &comparator)
 
template<typename T0 , typename R , typename F >
R  parallel_sum (const T0 &D0, const R &input, const F &func)
 
template<typename T0 , typename T1 , typename R , typename F >
R  parallel_sum2d (const T0 &D0, const T1 &D1, const R &input, const F &func)
 
template<typename T0 , typename T1 , typename T2 , typename R , typename F >
R  parallel_sum3d (const T0 &D0, const T1 &D1, const T2 &D2, const R &input, const F &func)
 
template<typename T >
T  parallel_it_init (T start)
 
template<typename T , typename Q , typename R , typename... Args>
T  parallel_it_init (T start, Q &x, const R &X, Args &&... tuple)
 
bool  parallel_it_step ()
 
template<typename Q , typename R , typename... Args>
bool  parallel_it_step (Q &x, const R &X, Args &&... tuple)
 
template<typename T , typename Q >
void  splitter (const T &n, const Q &team, const Q &tid, T &n_start, T &n_end)
 
template<typename T0 , typename F >
void  for_1d (const int &ithr, const int &nthr, const T0 &D0, const F &func)
 
template<typename T0 , typename F >
void  parallel_for (const T0 &D0, const F &func)
 
template<typename T0 , typename T1 , typename F >
void  for_2d (const int &ithr, const int &nthr, const T0 &D0, const T1 &D1, const F &func)
 
template<typename T0 , typename T1 , typename F >
void  parallel_for2d (const T0 &D0, const T1 &D1, const F &func)
 
template<typename T0 , typename T1 , typename T2 , typename F >
void  for_3d (const int &ithr, const int &nthr, const T0 &D0, const T1 &D1, const T2 &D2, const F &func)
 
template<typename T0 , typename T1 , typename T2 , typename F >
void  parallel_for3d (const T0 &D0, const T1 &D1, const T2 &D2, const F &func)
 
template<typename T0 , typename T1 , typename T2 , typename T3 , typename F >
void  for_4d (const int &ithr, const int &nthr, const T0 &D0, const T1 &D1, const T2 &D2, const T3 &D3, const F &func)
 
template<typename T0 , typename T1 , typename T2 , typename T3 , typename F >
void  parallel_for4d (const T0 &D0, const T1 &D1, const T2 &D2, const T3 &D3, const F &func)
 
template<typename T0 , typename T1 , typename T2 , typename T3 , typename T4 , typename F >
void  for_5d (const int &ithr, const int &nthr, const T0 &D0, const T1 &D1, const T2 &D2, const T3 &D3, const T4 &D4, const F &func)
 
template<typename T0 , typename T1 , typename T2 , typename T3 , typename T4 , typename F >
void  parallel_for5d (const T0 &D0, const T1 &D1, const T2 &D2, const T3 &D3, const T4 &D4, const F &func)
 
StatusCode  CreatePluginEngine (IInferencePlugin *&plugin, ResponseDesc *resp) noexcept
  Creates the default instance of the interface (per plugin) More...
 
std::string  fileNameToString (const file_name_t &str)
  Conversion from a possibly-wide character string to a single-byte string.
 
file_name_t  stringToFileName (const std::string &str)
  Conversion from single-byte character string to a possibly-wide one.
 
const Version *  GetInferenceEngineVersion () noexcept
  Gets the current Inference Engine version. More...
 
template<class T >
void  TopResults (unsigned int n, TBlob< T > &input, std::vector< unsigned > &output)
  Gets the top n results from a tblob. Deprecated: InferenceEngine utility functions are not a part of the public API and will be removed in R2. More...
 
void  TopResults (unsigned int n, Blob &input, std::vector< unsigned > &output)
  Gets the top n results from a blob. Deprecated: InferenceEngine utility functions are not a part of the public API and will be removed in R2. More...
 
template<typename data_t >
void  copyFromRGB8 (uint8_t *RGB8, size_t RGB8_size, InferenceEngine::TBlob< data_t > *blob)
  Copies an 8-bit RGB image to the blob. Throws an exception in case of dimensions or input size mismatch. Deprecated: InferenceEngine utility functions are not a part of the public API and will be removed in R2. More...
 
void  ConvertImageToInput (unsigned char *imgBufRGB8, size_t lengthbytesSize, Blob &input)
  Splits the RGB channels into either an I16 blob or a float blob. The image buffer is assumed to be packed with no support for strides. Deprecated: InferenceEngine utility functions are not a part of the public API and will be removed in R2. More...
 
template<typename T >
void  copyToFloat (float *dst, const InferenceEngine::Blob *src)
  Copies data from a certain precision to float. Deprecated: InferenceEngine utility functions are not a part of the public API and will be removed in R2. More...
 

Variables

constexpr const int  MAX_DIMS_NUMBER = 12
 

Detailed Description

Inference Engine API.

Typedef Documentation

§ idx_t

using InferenceEngine::idx_t = typedef size_t

A type of network objects indexes.

Deprecated:
Use ngraph API instead.

Enumeration Type Documentation

§ ColorFormat

Extra information about input color format for preprocessing.

Enumerator
RAW 

Plain blob (default), no extra color processing required.

RGB 

RGB color format.

BGR 

BGR color format, default in DLDT.

RGBX 

RGBX color format with X ignored during inference.

BGRX 

BGRX color format with X ignored during inference.

NV12 

NV12 color format represented as compound Y+UV blob.
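
A short hedged sketch of where ColorFormat is typically set (this assumes an InputInfo::Ptr named inputInfo obtained from the network and a plugin that supports color conversion during preprocessing):

    // Declare that the application feeds NV12 data; preprocessing converts it to the network format.
    inputInfo->getPreProcess().setColorFormat(InferenceEngine::ColorFormat::NV12);
    // A resize algorithm is usually configured together with the color format.
    inputInfo->getPreProcess().setResizeAlgorithm(InferenceEngine::RESIZE_BILINEAR);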

§ MeanVariant

Defines available types of mean.

Enumerator
MEAN_IMAGE 

mean value is specified for each input pixel

MEAN_VALUE 

mean value is specified for each input channel

NONE 

no mean value specified
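
A minimal sketch of configuring MEAN_VALUE preprocessing (this assumes an InputInfo::Ptr named inputInfo for a 3-channel input; init, operator[], meanValue and setVariant come from the PreProcessInfo and PreProcessChannel classes; the mean values are illustrative):

    InferenceEngine::PreProcessInfo &preProcess = inputInfo->getPreProcess();
    preProcess.init(3);                  // allocate per-channel preprocessing info
    preProcess[0]->meanValue = 104.0f;   // example per-channel mean values
    preProcess[1]->meanValue = 117.0f;
    preProcess[2]->meanValue = 123.0f;
    preProcess.setVariant(InferenceEngine::MEAN_VALUE);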

Function Documentation

§ as() [1/2]

template<typename T , typename std::enable_if<!std::is_pointer< T >::value &&!std::is_reference< T >::value, int >::type = 0, typename std::enable_if< std::is_base_of< Blob, T >::value, int >::type = 0>
std::shared_ptr<T> InferenceEngine::as ( const Blob::Ptr &  blob )
noexcept

Helper cast function to work with shared Blob objects.

Returns
shared_ptr to the type T. Returned shared_ptr shares ownership of the object with the input Blob::Ptr
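
A hedged usage sketch (this assumes blob is a Blob::Ptr that wraps host memory and that the MemoryBlob locked-memory API, rmap, is available in this version):

    // Downcast a generic Blob::Ptr to MemoryBlob to access the locked-memory API.
    if (auto memBlob = InferenceEngine::as<InferenceEngine::MemoryBlob>(blob)) {
        auto holder = memBlob->rmap();                   // read-only LockedMemory
        const float *data = holder.as<const float *>();
        // ... read data ...
    } else {
        // as() returned nullptr: blob is not a MemoryBlob
    }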

§ as() [2/2]

template<typename T , typename std::enable_if<!std::is_pointer< T >::value &&!std::is_reference< T >::value, int >::type = 0, typename std::enable_if< std::is_base_of< Blob, T >::value, int >::type = 0>
std::shared_ptr<const T> InferenceEngine::as ( const Blob::CPtr &  blob )
noexcept

Helper cast function to work with shared Blob objects.

Returns
shared_ptr to the type const T. Returned shared_ptr shares ownership of the object with the input Blob::CPtr

§ ConvertImageToInput()

void InferenceEngine::ConvertImageToInput ( unsigned char *  imgBufRGB8,
size_t  lengthbytesSize,
Blob &  input 
)
inline

Splits the RGB channels into either an I16 blob or a float blob. The image buffer is assumed to be packed with no support for strides.

Deprecated:
InferenceEngine utility functions are not a part of public API
Parameters
imgBufRGB8 Packed 24bit RGB image (3 bytes per pixel: R-G-B)
lengthbytesSize Size in bytes of the RGB image. It is equal to amount of pixels times 3 (number of channels)
input Blob to contain the split image (to 3 channels)

§ copyFromRGB8()

template<typename data_t >
void InferenceEngine::copyFromRGB8 ( uint8_t *  RGB8,
size_t  RGB8_size,
InferenceEngine::TBlob< data_t > *  blob 
)

Copies an 8-bit RGB image to the blob. Throws an exception in case of dimensions or input size mismatch.

Deprecated:
InferenceEngine utility functions are not a part of public API
Template Parameters
data_t Type of the target blob
Parameters
RGB8 8-bit RGB image
RGB8_size Size of the image
blob Target blob to write image to

§ copyToFloat()

template<typename T >
void InferenceEngine::copyToFloat ( float *  dst,
const InferenceEngine::Blob *  src 
)

Copies data from a certain precision to float.

Deprecated:
InferenceEngine utility functions are not a part of public API
Parameters
dst Pointer to an output float buffer, must be allocated before the call
src Source blob to take data from

§ CreateCNNNetReader()

ICNNNetReader* InferenceEngine::CreateCNNNetReader ( )
noexcept

Creates a CNNNetReader instance.

Returns
An object that implements the ICNNNetReader interface
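
Most applications use the CNNNetReader wrapper class (which presumably relies on this factory internally) rather than calling it directly; a minimal sketch with placeholder file names:

    InferenceEngine::CNNNetReader reader;
    reader.ReadNetwork("model.xml");     // placeholder IR file names
    reader.ReadWeights("model.bin");
    InferenceEngine::CNNNetwork network = reader.getNetwork();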

§ CreateDefaultAllocator()

InferenceEngine::IAllocator* InferenceEngine::CreateDefaultAllocator ( )
noexcept

Creates the default implementation of the Inference Engine allocator per plugin.

Returns
The Inference Engine IAllocator* instance

§ CreateExtension()

StatusCode InferenceEngine::CreateExtension ( IExtension *&  ext,
ResponseDesc *  resp 
)
noexcept

Creates the default instance of the extension.

Parameters
ext Extension interface
resp Response description
Returns
Status code
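
A hedged sketch of how a custom extension library might export this entry point (SampleExtension is a placeholder IExtension implementation, not part of this API):

    InferenceEngine::StatusCode InferenceEngine::CreateExtension(InferenceEngine::IExtension *&ext,
                                                                 InferenceEngine::ResponseDesc *resp) noexcept {
        (void)resp;  // unused in this sketch
        try {
            ext = new SampleExtension();  // hypothetical IExtension implementation
            return InferenceEngine::StatusCode::OK;
        } catch (...) {
            return InferenceEngine::StatusCode::GENERAL_ERROR;
        }
    }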

§ CreatePluginEngine()

StatusCode InferenceEngine::CreatePluginEngine ( IInferencePlugin *&  plugin,
ResponseDesc *  resp 
)
noexcept

Creates the default instance of the interface (per plugin)

Parameters
plugin Pointer to the plugin
resp Pointer to the response message that holds a description of an error if any occurred
Returns
Status code of the operation. OK if succeeded

§ CreateShapeInferExtension()

StatusCode InferenceEngine::CreateShapeInferExtension ( IShapeInferExtension *&  ext,
ResponseDesc *  resp 
)
noexcept

Creates the default instance of the shape infer extension.

Parameters
ext Shape Infer Extension interface
resp Response description
Returns
Status code

§ GetInferenceEngineVersion()

const Version* InferenceEngine::GetInferenceEngineVersion ( )
noexcept

Gets the current Inference Engine version.

Returns
The current Inference Engine version
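
A short usage sketch (printing requires <iostream>; the apiVersion and buildNumber fields are assumed from the Version structure of this API generation):

    const InferenceEngine::Version *version = InferenceEngine::GetInferenceEngineVersion();
    if (version != nullptr) {
        std::cout << "API " << version->apiVersion.major << "." << version->apiVersion.minor
                  << ", build " << version->buildNumber << std::endl;
    }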

§ Context

class InferenceEngine::Context

This class implements the Context object, which stores registered extensions and shape inference implementations.

Deprecated:
Use ngraph API instead.

Registers extension within the context

Parameters
ext Pointer to already loaded extension

Registers Shape Infer implementation within the Context

Parameters
type Layer type
impl Shape Infer implementation

Returns the shape infer implementation by layer type

Parameters
type Layer type
Returns
Shape Infer implementation

§ make_shared_blob() [1/6]

template<typename Type >
InferenceEngine::TBlob<Type>::Ptr InferenceEngine::make_shared_blob ( const TensorDesc &  tensorDesc )
inline

Creates a blob with the given tensor descriptor.

Template Parameters
Type Type of the shared pointer to be created
Parameters
tensorDesc Tensor descriptor for Blob creation
Returns
A shared pointer to the newly created blob of the given type
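
A minimal allocation sketch (the tensor shape and precision are illustrative):

    // Describe a 1x3x224x224 FP32 tensor in NCHW layout and create a typed blob for it.
    InferenceEngine::TensorDesc desc(InferenceEngine::Precision::FP32,
                                     {1, 3, 224, 224},
                                     InferenceEngine::Layout::NCHW);
    InferenceEngine::TBlob<float>::Ptr blob = InferenceEngine::make_shared_blob<float>(desc);
    blob->allocate();   // memory is allocated explicitly after construction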

§ make_shared_blob() [2/6]

template<typename Type >
InferenceEngine::TBlob<Type>::Ptr InferenceEngine::make_shared_blob ( const TensorDesc &  tensorDesc,
Type *  ptr,
size_t  size = 0 
)
inline

Creates a blob with the given tensor descriptor from the pointer to the pre-allocated memory.

Template Parameters
Type Type of the shared pointer to be created
Parameters
tensorDesc TensorDesc for Blob creation
ptr Pointer to the pre-allocated memory
size Length of the pre-allocated array
Returns
A shared pointer to the newly created blob of the given type

§ make_shared_blob() [3/6]

template<typename Type >
InferenceEngine::TBlob<Type>::Ptr InferenceEngine::make_shared_blob ( const TensorDesc &  tensorDesc,
const std::shared_ptr< InferenceEngine::IAllocator > &  alloc 
)
inline

Creates a blob with the given tensor descriptor and allocator.

Template Parameters
Type Type of the shared pointer to be created
Parameters
tensorDesc Tensor descriptor for Blob creation
alloc Shared pointer to IAllocator to use in the blob
Returns
A shared pointer to the newly created blob of the given type

§ make_shared_blob() [4/6]

template<typename TypeTo >
InferenceEngine::TBlob<TypeTo>::Ptr InferenceEngine::make_shared_blob ( const TBlob< TypeTo > &  arg )
inline

Creates a copy of given TBlob instance.

Template Parameters
TypeTo Type of the shared pointer to be created
Parameters
arg given pointer to blob
Returns
A shared pointer to the newly created blob of the given type

§ make_shared_blob() [5/6]

template<typename T , typename... Args, typename std::enable_if< std::is_base_of< Blob, T >::value, int >::type = 0>
std::shared_ptr<T> InferenceEngine::make_shared_blob ( Args &&...  args )

Creates a Blob object of the specified type.

Parameters
args Constructor arguments for the Blob object
Returns
A shared pointer to the newly created Blob object

§ make_shared_blob() [6/6]

Blob::Ptr InferenceEngine::make_shared_blob ( const Blob::Ptr &  inputBlob,
const ROI &  roi 
)

Creates a blob describing given ROI object based on the given blob with pre-allocated memory.

Parameters
inputBlob original blob with pre-allocated memory.
roi A ROI object inside of the original blob.
Returns
A shared pointer to the newly created blob.
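
A hedged sketch of creating a region-of-interest view (this assumes a Blob::Ptr named inputBlob and the ROI fields id, posX, posY, sizeX and sizeY; the coordinates are illustrative):

    InferenceEngine::ROI roi = {0 /*id*/, 32 /*posX*/, 32 /*posY*/, 128 /*sizeX*/, 128 /*sizeY*/};
    // The ROI blob is a view into inputBlob's memory; the data is not copied.
    InferenceEngine::Blob::Ptr roiBlob = InferenceEngine::make_shared_blob(inputBlob, roi);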

§ make_so_pointer() [1/3]

template<class T >
std::shared_ptr<T> InferenceEngine::make_so_pointer ( const file_name_t &  name )
inlinedelete

Creates a special shared_pointer wrapper for the given type from a specific shared module.

Parameters
name Name of the shared library file
Returns
shared_pointer A wrapper for the given type from a specific shared module
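
A hedged usage sketch (the library file name is a placeholder; only the IExtension and IShapeInferExtension specializations are defined, the generic template is deleted; AddExtension assumes an InferenceEngine::Core object named core):

    auto extension = InferenceEngine::make_so_pointer<InferenceEngine::IExtension>(
        InferenceEngine::stringToFileName("libcustom_extension.so"));   // placeholder library name
    core.AddExtension(extension, "CPU");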

§ make_so_pointer() [2/3]

template<>
std::shared_ptr<IShapeInferExtension> InferenceEngine::make_so_pointer ( const file_name_t &  name )
inlinedelete

Creates a special shared_pointer wrapper for the given type from a specific shared module.

Parameters
name Name of the shared library file
Returns
shared_pointer A wrapper for the given type from a specific shared module

§ make_so_pointer() [3/3]

template<>
std::shared_ptr<IExtension> InferenceEngine::make_so_pointer ( const file_name_t &  name )
inlinedelete

Creates a special shared_pointer wrapper for the given type from a specific shared module.

Parameters
name Name of the shared library file
Returns
shared_pointer A wrapper for the given type from a specific shared module

§ PortData

class InferenceEngine::PortData

This class describes port data.

Deprecated:
Use ngraph API instead. The NN Builder API will be removed in R2.

A shared pointer to the PortData object.

Default constructor

Creates port data with precision and shape

Parameters
shape Dimensions
precision Precision

virtual destructor

Returns data

Returns
Blob with data

Sets data

Parameters
data Blob with data

Returns data parameters

Returns
Map of parameters

Sets new shapes for data

Parameters
shape New shapes

§ Port

class InferenceEngine::Port

This class is the main object to describe the Inference Engine port.

Deprecated:
Use ngraph API instead. The NN Builder API will be removed in R2.

Default constructor of a port object.

Constructor of a port object with shapes.

Parameters
shapes port shapes
precision Port precision

Virtual destructor

Copy constructor.

Parameters
port object to copy

Compares the given Port with the current one

Parameters
rhs Port to compare with
Returns
true if the given Port is equal to the current one, false - otherwise

Compares the given Port with the current one

Parameters
rhs Port to compare with
Returns
true if the given Port is NOT equal to the current one, false - otherwise

Returns a constant reference to a vector with shapes. Shapes should be initialized if shape is empty.

Returns
constant reference to shapes

Sets new shapes for current port

Parameters
shape New shapes

Returns a constant reference to parameters

Returns
Map with parameters

Sets new parameters for current port

Parameters
params New parameters

Sets the new parameter for current port

Parameters
name Name of parameter
param New value

Returns port data

Returns
Port data

Sets new port data for current port

Parameters
data Port data

§ PortInfo

class InferenceEngine::PortInfo

This class contains a pair of layerId and port index.

Deprecated:
Use ngraph API instead. The NN Builder API will be removed in R2.

The constructor creates a PortInfo object for port 0

Parameters
layerID Layer id

The constructor creates a PortInfo object

Parameters
layerID Layer id
portID Port id

Get layer id

Returns
Layer id

Get port id

Returns
Port id

Compares the given PortInfo object with the current one

Parameters
portInfo PortInfo object to compare with
Returns
true if the given PortInfo object is equal to the current one, false - otherwise

Checks if the given PortInfo object is not equal to the current one

Parameters
portInfo PortInfo object to compare with
Returns
true if the given PortInfo object is not equal to the current one, false - otherwise

§ TopResults() [1/2]

template<class T >
void InferenceEngine::TopResults ( unsigned int  n,
TBlob< T > &  input,
std::vector< unsigned > &  output 
)
inline

Gets the top n results from a tblob.

Deprecated:
InferenceEngine utility functions are not a part of public API
Parameters
n Top n count
input 1D tblob that contains probabilities
output Vector of indexes for the top n places
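
A hedged classification-style sketch (this assumes probs is a TBlob<float>::Ptr holding the output probabilities):

    std::vector<unsigned> topIndexes;
    InferenceEngine::TopResults(5u, *probs, topIndexes);
    // topIndexes now holds the indexes of the 5 largest values in probs.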

§ TopResults() [2/2]

void InferenceEngine::TopResults ( unsigned int  n,
Blob &  input,
std::vector< unsigned > &  output 
)
inline

Gets the top n results from a blob.

Deprecated:
InferenceEngine utility functions are not a part of public API
Parameters
n Top n count
input 1D blob that contains probabilities
output Vector of indexes for the top n places