Class InferenceEngine::BatchedBlob

class BatchedBlob : public InferenceEngine::CompoundBlob

This class represents a blob that contains other blobs, one blob per batch element.

A plugin that supports BatchedBlob inputs should report BATCHED_BLOB in its OPTIMIZATION_CAPABILITIES metric.
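
For illustration, a minimal sketch of how an application might verify this capability through the InferenceEngine::Core API before using BatchedBlob inputs; the device name passed to the function is only an example, and the capability is compared as the plain string "BATCHED_BLOB":

#include <algorithm>
#include <string>
#include <vector>

#include <inference_engine.hpp>

// Returns true if the given device reports BATCHED_BLOB among its
// optimization capabilities (example check, not part of this class).
bool SupportsBatchedBlob(InferenceEngine::Core &core, const std::string &device) {
    const auto caps = core.GetMetric(device, METRIC_KEY(OPTIMIZATION_CAPABILITIES))
                          .as<std::vector<std::string>>();
    return std::find(caps.begin(), caps.end(), std::string("BATCHED_BLOB")) != caps.end();
}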

Public Types

using Ptr = std::shared_ptr<BatchedBlob>

A smart pointer to the BatchedBlob object.

using CPtr = std::shared_ptr<const BatchedBlob>

A smart pointer to the const BatchedBlob object.

Public Functions

explicit BatchedBlob(const std::vector<Blob::Ptr> &blobs)

Constructs a batched blob from a vector of blobs.

All passed blobs must meet the following requirements:

  • all blobs have equal tensor descriptors;

  • blob layouts must be one of: NCHW, NHWC, NCDHW, NDHWC, NC, CN, C, CHW, HWC;

  • the batch dimension must be equal to 1 or not defined (C, CHW, HWC layouts).

The resulting blob's tensor descriptor is constructed from the tensor descriptors of the passed blobs by setting the batch dimension to blobs.size().

Parameters

blobs – A vector of blobs that is copied to this object
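
A minimal usage sketch of this copy overload, assuming two FP32 NCHW input blobs of an illustrative 1x3x224x224 shape:

#include <memory>
#include <vector>

#include <inference_engine.hpp>

using namespace InferenceEngine;

BatchedBlob::Ptr MakeBatchOfTwo() {
    // Two single-image blobs with identical tensor descriptors (batch = 1).
    TensorDesc desc(Precision::FP32, {1, 3, 224, 224}, Layout::NCHW);
    Blob::Ptr image0 = make_shared_blob<float>(desc);
    Blob::Ptr image1 = make_shared_blob<float>(desc);
    image0->allocate();
    image1->allocate();

    // The vector is copied; the resulting descriptor gets batch = blobs.size() == 2.
    std::vector<Blob::Ptr> blobs{image0, image1};
    return std::make_shared<BatchedBlob>(blobs);
}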

explicit BatchedBlob(std::vector<Blob::Ptr> &&blobs)

Constructs a batched blob from a vector of blobs.

All passed blobs must meet the following requirements:

  • all blobs have equal tensor descriptors;

  • blob layouts must be one of: NCHW, NHWC, NCDHW, NDHWC, NC, CN, C, CHW, HWC;

  • the batch dimension must be equal to 1 or not defined (C, CHW, HWC layouts).

The resulting blob's tensor descriptor is constructed from the tensor descriptors of the passed blobs by setting the batch dimension to blobs.size().

Parameters

blobs – A vector of blobs that is moved to this object
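
The move overload is used the same way when the caller no longer needs the source vector; a brief sketch (the blobs passed in are assumed to satisfy the requirements above):

#include <memory>
#include <utility>
#include <vector>

#include <inference_engine.hpp>

using namespace InferenceEngine;

// Build a BatchedBlob without copying the vector of shared pointers.
BatchedBlob::Ptr MakeBatch(std::vector<Blob::Ptr> blobs) {
    // `blobs` is taken by value and then moved into the BatchedBlob;
    // after the move it is left in a valid but unspecified state.
    return std::make_shared<BatchedBlob>(std::move(blobs));
}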