The sections below contain a detailed list of changes made to the Inference Engine API in recent releases.
2020.1
New API
Integration with ngraph API:
Offline compilation: import / export to std::stream:
RemoteBlob accelerator memory sharing API:
GNA firmware model image generation:
MemoryBlob mapping of memory to the user space:
Memory interoperability on acceleration devices, general classes and GPU helper functions:
- InferenceEngine::RemoteBlob class
- InferenceEngine::RemoteContext class
- InferenceEngine::Core::CreateContext(const std::string& deviceName, const ParamMap& params) method
- InferenceEngine::Core::GetDefaultContext(const std::string& deviceName) method
- InferenceEngine::make_shared_blob(const TensorDesc& desc, RemoteContext::Ptr ctx) function
- InferenceEngine::gpu::make_shared_blob_nv12(size_t height, size_t width, RemoteContext::Ptr ctx, VASurfaceID nv12_surf) function
- InferenceEngine::gpu::make_shared_context(Core& core, std::string deviceName, VADisplay device) function
- InferenceEngine::gpu::make_shared_blob(const TensorDesc& desc, RemoteContext::Ptr ctx, VASurfaceID surface, uint32_t plane = 0) function
- InferenceEngine::gpu::make_shared_blob_nv12(RemoteContext::Ptr ctx, cl::Image2D& nv12_image_plane_y, cl::Image2D& nv12_image_plane_uv) function
- InferenceEngine::gpu::make_shared_context(Core& core, std::string deviceName, cl_context ctx) function
- InferenceEngine::gpu::make_shared_blob(const TensorDesc& desc, ClContext::Ptr ctx) function
- InferenceEngine::gpu::make_shared_blob(const TensorDesc& desc, RemoteContext::Ptr ctx, cl::Buffer& buffer) function
- InferenceEngine::gpu::make_shared_blob(const TensorDesc& desc, RemoteContext::Ptr ctx, cl_mem buffer) function
- InferenceEngine::gpu::make_shared_blob(const TensorDesc& desc, RemoteContext::Ptr ctx, cl::Image2D& image) function
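The new classes and helpers above let an application share device memory with the GPU plugin instead of copying through host buffers. The sketch below shows one possible flow with the OpenCL variants; the model path, input name, and the `ocl_ctx`/`cl_buffer` handles are placeholders for application-specific values, not part of the API.

```cpp
#include <inference_engine.hpp>
#include <gpu/gpu_context_api_ocl.hpp>

// Sketch: run inference on a user-owned OpenCL buffer without extra
// host<->device copies, using the remote context/blob API listed above.
void InferOnSharedBuffer(cl_context ocl_ctx, cl_mem cl_buffer) {
    InferenceEngine::Core core;
    auto network = core.ReadNetwork("model.xml");  // placeholder path

    // Wrap the application-owned cl_context into a RemoteContext
    // and compile the network against it:
    auto remote_ctx = InferenceEngine::gpu::make_shared_context(core, "GPU", ocl_ctx);
    auto exec_net = core.LoadNetwork(network, remote_ctx);

    // Wrap the existing cl_mem buffer as a remote input blob:
    InferenceEngine::TensorDesc desc(InferenceEngine::Precision::FP32,
                                     {1, 3, 224, 224},
                                     InferenceEngine::Layout::NCHW);
    auto remote_blob = InferenceEngine::gpu::make_shared_blob(desc, remote_ctx, cl_buffer);

    auto request = exec_net.CreateInferRequest();
    request.SetBlob("input", remote_blob);  // "input" is a placeholder name
    request.Infer();
}
```

The same pattern applies to the VA-API and `cl::Image2D` overloads; only the handle type passed to `make_shared_context` / `make_shared_blob` changes.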
Deprecated API
Inference Engine NN Builder API:
Plugin API:
- InferenceEngine::InferencePlugin C++ plugin wrapper class
- InferenceEngine::IInferencePlugin plugin interface
- InferenceEngine::PluginDispatcher class
- InferenceEngine::InferenceEnginePluginPtr typedef
- InferenceEngine::ICNNNetReader reader interface
- InferenceEngine::CNNNetReader class
Blob API:
- Blob::element_size() const noexcept method
- Blob::buffer() noexcept method
- Blob::cbuffer() noexcept method
- MemoryBlob::buffer() noexcept method
- MemoryBlob::cbuffer() noexcept method
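With the `buffer()`/`cbuffer()` methods deprecated, raw data access moves to the MemoryBlob mapping API introduced in this release. A minimal migration sketch, assuming a memory-backed blob:

```cpp
#include <inference_engine.hpp>

// Sketch: replace the deprecated Blob::buffer() with MemoryBlob::wmap().
void FillInput(const InferenceEngine::Blob::Ptr& blob) {
    // Old (deprecated): float* data = blob->buffer().as<float*>();
    auto mblob = InferenceEngine::as<InferenceEngine::MemoryBlob>(blob);
    if (!mblob) return;  // not a memory-backed blob

    // wmap() returns a LockedMemory holder that unmaps automatically
    // when it goes out of scope; rmap() is the read-only counterpart.
    auto holder = mblob->wmap();
    float* data = holder.as<float*>();
    for (size_t i = 0; i < mblob->size(); ++i)
        data[i] = 0.0f;
}
```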
Removed API
Removed all Inference Engine API that was deprecated in the 2019 R2 release.
2019 R3
New API
New supported layers:
FPGA plugin streaming support:
Removed API
2019 R2
New API
Inference Engine Core API:
Query API extensions to InferenceEngine::ExecutableNetwork and InferenceEngine::IExecutableNetwork:
Metrics and values for Query API:
Common API:
New supported primitives:
Extensions to Blob creation API:
Deprecated API
The methods listed below are deprecated and will be removed in the 2019 R4 release:
Common API:
- InferenceEngine::InputInfo::getInputPrecision method
- InferenceEngine::InputInfo::setInputPrecision method
- InferenceEngine::InputInfo::getDims method
- InferenceEngine::CNNLayer::GetParamsAsBool method
- InferenceEngine::CNNNetwork::CNNNetwork(ICNNNetwork* actual) constructor
- InferenceEngine::CNNNetwork::setTargetDevice method
- HETERO_CONFIG_KEY(DUMP_DLA_MESSAGES) config key
- InferenceEngine::ILayerImplFactory::getShapes method
- InferenceEngine::IShapeInferImpl::inferShapes(const std::vector<SizeVector>&, const std::map<std::string, std::string>&, const std::map<std::string, Blob::Ptr>&, std::vector<SizeVector>&, ResponseDesc*) method
- InferenceEngine::Data::setBatchSize method
- InferenceEngine::QueryNetworkResult::supportedLayers field
- InferenceEngine::ICNNNetwork::setBatchSize(const size_t size) method
- InferenceEngine::Blob::Resize method
- InferenceEngine::Blob::Reshape method
- InferenceEngine::TBlob::set method
InferenceEngine::IInferencePlugin and InferenceEngine::InferencePlugin obsolete methods:
- InferenceEngine::InferencePlugin::LoadNetwork(ICNNNetwork &network) method
- InferenceEngine::InferencePlugin::Infer method
- InferenceEngine::InferencePlugin::GetPerformanceCounts method
- InferenceEngine::InferencePlugin::QueryNetwork(const ICNNNetwork &network, QueryNetworkResult &res) const method
- InferenceEngine::IInferencePlugin::LoadNetwork(ICNNNetwork &network, ResponseDesc *resp) method
- InferenceEngine::IInferencePlugin::Infer(const Blob &input, Blob &result, ResponseDesc *resp) method
- InferenceEngine::IInferencePlugin::Infer(const BlobMap &input, BlobMap &result, ResponseDesc *resp) method
- InferenceEngine::IInferencePlugin::GetPerformanceCounts method
- InferenceEngine::IInferencePlugin::QueryNetwork(const ICNNNetwork& network, QueryNetworkResult& res) const method
Fields in InferenceEngine::Data class are replaced with appropriate methods:
- InferenceEngine::Data::precision field
- InferenceEngine::Data::layout field
- InferenceEngine::Data::dims field
- InferenceEngine::Data::creatorLayer field
- InferenceEngine::Data::name field
- InferenceEngine::Data::inputTo field
- InferenceEngine::Data::userObject field
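A migration sketch for the field-to-method change, assuming an existing `Data` object; each deprecated field has a corresponding getter, and the mutable ones have setters:

```cpp
#include <inference_engine.hpp>

// Sketch: replace direct InferenceEngine::Data field access with the
// accessor methods that supersede the deprecated public fields.
void InspectData(InferenceEngine::Data& data) {
    // Old (deprecated): data.precision, data.layout, data.dims, data.name
    InferenceEngine::Precision p = data.getPrecision();
    InferenceEngine::Layout l = data.getLayout();
    InferenceEngine::SizeVector dims = data.getDims();
    std::string name = data.getName();

    // Setters replace direct writes to the fields:
    data.setPrecision(InferenceEngine::Precision::FP32);
    data.setLayout(InferenceEngine::Layout::NCHW);
}
```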
Heterogeneous plugin:
- InferenceEngine::IHeteroDeviceLoader class
- InferenceEngine::IHeteroInferencePlugin class
- InferenceEngine::HeteroPluginPtr class
- InferenceEngine::InferencePlugin::operator HeteroPluginPtr conversion operator
Blob creation API with dimensions in reverse order:
- InferenceEngine::Blob::Blob(Precision p) constructor
- InferenceEngine::Blob::Blob(Precision p, Layout l) constructor
- InferenceEngine::Blob::Blob(Precision p, const SizeVector &dims) constructor
- InferenceEngine::Blob::Blob(Precision p, Layout l, const SizeVector &dims) constructor
- InferenceEngine::TBlob::TBlob(Precision p, Layout l) constructor
- InferenceEngine::TBlob::TBlob(Precision p, Layout l, const SizeVector& dims) constructor
- InferenceEngine::TBlob::TBlob(Precision p, Layout l, const SizeVector& dims, T* ptr, size_t data_size) constructor
- InferenceEngine::TBlob::TBlob(Precision p, Layout l, const SizeVector &dims, std::shared_ptr<IAllocator> alloc) constructor
- InferenceEngine::Blob::type() method
- InferenceEngine::Blob::precision() method
- InferenceEngine::Blob::layout() method
- InferenceEngine::Blob::dims() method
- InferenceEngine::make_shared_blob(Precision p, Layout l, const SizeVector &dims) function
- InferenceEngine::make_shared_blob(Precision p, const SizeVector &dims) function
- InferenceEngine::make_shared_blob(Precision p, Layout l, const TArg &arg) function
- InferenceEngine::make_shared_blob(Precision p, const TArg &arg) function
- InferenceEngine::make_shared_blob(TBlob<TypeTo> &&arg) function
- InferenceEngine::make_shared_blob(Precision p, Layout l) function
- InferenceEngine::make_shared_blob(Precision p, Layout l, SizeVector dims, const std::vector<TypeTo> &arg) function
- InferenceEngine::make_shared_blob(Precision p, Layout l, const std::vector<TypeTo> &arg) function
- InferenceEngine::make_shared_blob(Precision p, const std::vector<TypeTo> &arg) function
- InferenceEngine::make_shared_blob(Precision p, Layout l, const SizeVector &dims, TypeTo * ptr, size_t size) function
- InferenceEngine::make_shared_blob(Precision p, const SizeVector &dims, TypeTo * ptr, size_t size) function
- InferenceEngine::I_N variable
- InferenceEngine::I_C variable
- InferenceEngine::I_H variable
- InferenceEngine::I_W variable
- InferenceEngine::LayoutOffsetCounter class
- InferenceEngine::ConvertLayout function
API working with device enumeration:
- InferenceEngine::TargetDevice enumeration
- InferenceEngine::TargetDeviceInfo class
- InferenceEngine::getDeviceName function
- InferenceEngine::FindPluginRequest class
- InferenceEngine::FindPluginResponse class
- InferenceEngine::findPlugin(const FindPluginRequest &req, FindPluginResponse &result, ResponseDesc *resp) function
- InferenceEngine::ICNNNetwork::setTargetDevice method
- InferenceEngine::ICNNNetwork::getTargetDevice method
- InferenceEngine::PluginDispatcher::getPluginByDevice method
- InferenceEngine::PluginDispatcher::getSuitablePlugin method
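The device-enumeration API above is superseded by the Inference Engine Core API, where devices are addressed by name strings rather than a TargetDevice enumeration. A minimal sketch of the replacement flow; the model path and device name are placeholders:

```cpp
#include <inference_engine.hpp>
#include <iostream>

// Sketch: Core-based device discovery replacing TargetDevice,
// FindPluginRequest/Response, and PluginDispatcher::getPluginByDevice.
int main() {
    InferenceEngine::Core core;

    // Enumerate available devices by name, e.g. "CPU", "GPU":
    for (const std::string& device : core.GetAvailableDevices())
        std::cout << device << std::endl;

    // Loading a network targets a device by name instead of
    // calling the deprecated ICNNNetwork::setTargetDevice, e.g.:
    //   auto exec = core.LoadNetwork(network, "CPU");
    return 0;
}
```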