Deprecated List

Global IE_DEFINE_EXTENSION_CREATE_FUNCTION (ExtensionType)

The Inference Engine API is deprecated and will be removed in the 2024.0 release. For instructions on transitioning to the new API, please refer to https://docs.openvino.ai/latest/openvino_2_0_transition_guide.html
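
For orientation, here is a minimal migration sketch, assuming the OpenVINO 2.0 headers are available; the model path and device name are placeholders. The deprecated InferenceEngine::Core workflow maps onto ov::Core as follows:

```cpp
#include <openvino/openvino.hpp>

int main() {
    // Deprecated 1.0 flow, for comparison:
    //   InferenceEngine::Core ie;
    //   auto network = ie.ReadNetwork("model.xml");
    //   auto exec    = ie.LoadNetwork(network, "CPU");
    //   auto request = exec.CreateInferRequest();

    // OpenVINO 2.0 equivalent:
    ov::Core core;
    std::shared_ptr<ov::Model> model = core.read_model("model.xml");  // placeholder path
    ov::CompiledModel compiled = core.compile_model(model, "CPU");    // placeholder device
    ov::InferRequest request = compiled.create_infer_request();
    request.infer();
}
```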

Global InferenceEngine::Blob::element_size () const =0

Cast to MemoryBlob and use its API instead. The Blob class can represent a compound blob, which does not refer to a single contiguous memory region.
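
As an illustrative fragment (the blob is assumed to come from an existing InferRequest), the cast looks like:

```cpp
// Obtain the blob elsewhere, e.g. via InferRequest::GetBlob("input").
InferenceEngine::Blob::Ptr blob;

// as<MemoryBlob>() returns nullptr for compound blobs, which do not
// expose a single contiguous memory region.
if (auto mem_blob = InferenceEngine::as<InferenceEngine::MemoryBlob>(blob)) {
    size_t elem_size = mem_blob->element_size();   // safe on MemoryBlob
    auto mapped = mem_blob->rmap();                // read-only locked view
    const float* data = mapped.as<const float*>();
}
```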

Global InferenceEngine::Blob::product (const SizeVector &dims) noexcept

Cast to MemoryBlob and use its API instead.

Global InferenceEngine::Blob::properProduct (const SizeVector &dims) noexcept

Cast to MemoryBlob and use its API instead.

Global InferenceEngine::Core::ImportNetwork (std::istream &networkModel)

Use Core::ImportNetwork with an explicit device name instead
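
A sketch of the preferred overload (the file path and device name are placeholders):

```cpp
#include <fstream>

InferenceEngine::Core ie;
std::ifstream blob_file("model.blob", std::ios::binary);  // placeholder path

// Deprecated: ie.ImportNetwork(blob_file);
// Preferred: name the target device explicitly.
InferenceEngine::ExecutableNetwork exec = ie.ImportNetwork(blob_file, "CPU");
```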

Class InferenceEngine::Data

The Inference Engine API is deprecated and will be removed in the 2024.0 release. For instructions on transitioning to the new API, please refer to https://docs.openvino.ai/latest/openvino_2_0_transition_guide.html

Global InferenceEngine::Data::reshape (const std::initializer_list< size_t > &dims, Layout layout)

Use InferenceEngine::Data::reshape(const SizeVector&, Layout) instead
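
For example (the shape and layout are placeholders; `data` is assumed to be an existing InferenceEngine::DataPtr):

```cpp
// Deprecated: data->reshape({1, 3, 224, 224}, InferenceEngine::Layout::NCHW);
// Preferred: pass a SizeVector explicitly.
InferenceEngine::SizeVector dims{1, 3, 224, 224};
data->reshape(dims, InferenceEngine::Layout::NCHW);
```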

Class InferenceEngine::DataConfig

The Inference Engine API is deprecated and will be removed in the 2024.0 release. For instructions on transitioning to the new API, please refer to https://docs.openvino.ai/latest/openvino_2_0_transition_guide.html

Class InferenceEngine::Extension

The Inference Engine API is deprecated and will be removed in the 2024.0 release. For instructions on transitioning to the new API, please refer to https://docs.openvino.ai/latest/openvino_2_0_transition_guide.html

Class InferenceEngine::ICNNNetwork

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::addOutput (const std::string &layerName, size_t outputIndex=0, ResponseDesc *resp=nullptr) noexcept=0

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::getBatchSize () const =0

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::getFunction () noexcept=0

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::getFunction () const noexcept=0

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::getInput (const std::string &inputName) const noexcept=0

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::getInputsInfo (InputsDataMap &inputs) const noexcept=0

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::getName () const noexcept=0

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::getOutputsInfo (OutputsDataMap &out) const noexcept=0

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::getOVNameForTensor (std::string &ov_name, const std::string &orig_name, ResponseDesc *resp) const noexcept

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::InputShapes

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::layerCount () const =0

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::Ptr

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::reshape (const InputShapes &inputShapes, ResponseDesc *resp) noexcept

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::reshape (const std::map< std::string, ngraph::PartialShape > &partialShapes, ResponseDesc *resp) noexcept

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::serialize (std::ostream &xmlStream, Blob::Ptr &binData, ResponseDesc *resp) const noexcept=0

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::serialize (const std::string &xmlPath, const std::string &binPath, ResponseDesc *resp) const noexcept=0

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::serialize (std::ostream &xmlStream, std::ostream &binStream, ResponseDesc *resp) const noexcept=0

Use InferenceEngine::CNNNetwork wrapper instead

Global InferenceEngine::ICNNNetwork::setBatchSize (size_t size, ResponseDesc *responseDesc) noexcept=0

Use InferenceEngine::CNNNetwork wrapper instead
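
The ICNNNetwork methods listed above are all reachable through the CNNNetwork wrapper; a hedged sketch, where the paths, layer name, and input name are placeholders:

```cpp
InferenceEngine::Core ie;
InferenceEngine::CNNNetwork network = ie.ReadNetwork("model.xml");  // placeholder path

auto inputs  = network.getInputsInfo();          // replaces ICNNNetwork::getInputsInfo
auto outputs = network.getOutputsInfo();         // replaces ICNNNetwork::getOutputsInfo
size_t batch = network.getBatchSize();           // replaces ICNNNetwork::getBatchSize
network.addOutput("layer_name");                 // replaces ICNNNetwork::addOutput
network.reshape({{"input", {1, 3, 640, 640}}});  // replaces ICNNNetwork::reshape
network.serialize("model_out.xml", "model_out.bin");  // replaces ICNNNetwork::serialize
```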

Global InferenceEngine::IExecutableNetwork::GetExecGraphInfo (ICNNNetwork::Ptr &graphPtr, ResponseDesc *resp) noexcept=0

Use InferenceEngine::ExecutableNetwork::GetExecGraphInfo instead

Class InferenceEngine::IExtension

The Inference Engine API is deprecated and will be removed in the 2024.0 release. For instructions on transitioning to the new API, please refer to https://docs.openvino.ai/latest/openvino_2_0_transition_guide.html

Class InferenceEngine::IInferRequest

Use the InferenceEngine::InferRequest C++ wrapper instead

Class InferenceEngine::ILayerExecImpl

The Inference Engine API is deprecated and will be removed in the 2024.0 release. For instructions on transitioning to the new API, please refer to https://docs.openvino.ai/latest/openvino_2_0_transition_guide.html

Class InferenceEngine::ILayerImpl

The Inference Engine API is deprecated and will be removed in the 2024.0 release. For instructions on transitioning to the new API, please refer to https://docs.openvino.ai/latest/openvino_2_0_transition_guide.html

Global InferenceEngine::INFERENCE_ENGINE_1_0_DEPRECATED (ExecutableNetwork)

Will be removed. Use operator bool

Use ExecutableNetwork::CreateInferRequest

Global InferenceEngine::INFERENCE_ENGINE_1_0_DEPRECATED (CNNNetwork)

Don’t use this constructor. It will be removed soon

InferenceEngine::ICNNNetwork interface is deprecated

Global InferenceEngine::INFERENCE_ENGINE_1_0_DEPRECATED (ExecutableNetwork)

The method will be removed

Class InferenceEngine::LayerConfig

The Inference Engine API is deprecated and will be removed in the 2024.0 release. For instructions on transitioning to the new API, please refer to https://docs.openvino.ai/latest/openvino_2_0_transition_guide.html

Global InferenceEngine::make_so_pointer (const std::wstring &name)

The Inference Engine API is deprecated and will be removed in the 2024.0 release. For instructions on transitioning to the new API, please refer to https://docs.openvino.ai/latest/openvino_2_0_transition_guide.html

Global InferenceEngine::make_so_pointer (const std::string &name)

The Inference Engine API is deprecated and will be removed in the 2024.0 release. For instructions on transitioning to the new API, please refer to https://docs.openvino.ai/latest/openvino_2_0_transition_guide.html

Global InferenceEngine::PluginConfigParams::KEY_DUMP_EXEC_GRAPH_AS_DOT

Use the serialize method of the CNNNetwork returned by InferenceEngine::ExecutableNetwork::GetExecGraphInfo instead
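
Instead of the config key, the execution graph can be dumped roughly like this (`exec_network` is assumed to be an existing ExecutableNetwork; the output path is a placeholder):

```cpp
// Deprecated: setting KEY_DUMP_EXEC_GRAPH_AS_DOT in the plugin config.
// Preferred: serialize the graph returned by GetExecGraphInfo.
InferenceEngine::CNNNetwork exec_graph = exec_network.GetExecGraphInfo();
exec_graph.serialize("exec_graph.xml");
```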

Class InferenceEngine::Version::ApiVersion

Use the IE_VERSION_[MAJOR|MINOR|PATCH] definitions and the buildNumber property instead

Class ngraph::CoordinateIterator

Global ngraph::CoordinateTransformBasic::index (const Coordinate &c) const

Global ngraph::maximum_value (const Output< Node > &value)

Use evaluate_upper_bound instead

Class ov::AllocatorImpl

This class will be removed in the 2024.0 release

Global ov::Core::add_extension (const std::shared_ptr< InferenceEngine::IExtension > &extension)

This method is deprecated. Use the other Core::add_extension overloads instead.

Global ov::IPlugin::add_extension (const std::shared_ptr< InferenceEngine::IExtension > &extension)

This method allows loading legacy Inference Engine extensions and will be removed in the 2024.0 release

Global ov::Model::evaluate (const ov::HostTensorVector &output_tensors, const ov::HostTensorVector &input_tensors) const

Use evaluate with ov::Tensor instead

Global ov::Model::evaluate (const ov::HostTensorVector &output_tensors, const ov::HostTensorVector &input_tensors, ov::EvaluationContext &evaluation_context) const

Use evaluate with ov::Tensor instead

Global ov::Node::evaluate (const ov::HostTensorVector &output_values, const ov::HostTensorVector &input_values, const EvaluationContext &evaluationContext) const

Use evaluate with ov::Tensor instead

Global ov::Node::evaluate (const ov::HostTensorVector &output_values, const ov::HostTensorVector &input_values) const

Use evaluate with ov::Tensor instead
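
A sketch of the ov::Tensor-based overload (element types and shapes are placeholders; `model` is assumed to be an existing std::shared_ptr<ov::Model>):

```cpp
// Deprecated: evaluate() overloads taking ov::HostTensorVector.
// Preferred: the ov::TensorVector overloads.
ov::TensorVector inputs{ov::Tensor(ov::element::f32, ov::Shape{1, 3, 224, 224})};
ov::TensorVector outputs{ov::Tensor(ov::element::f32, ov::Shape{1, 1000})};
model->evaluate(outputs, inputs);
```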

Global ov::op::util::VariableValue::get_state () const

This method is deprecated and will be removed in the 2024.0 release. Use the ov::Tensor-based overload instead

Global ov::op::util::VariableValue::get_value () const

This method is deprecated and will be removed in the 2024.0 release. Use the ov::Tensor-based overload instead

Global ov::op::util::VariableValue::set_state (const ov::Tensor &value)

This method is deprecated and will be removed in the 2024.0 release. Use the ov::Tensor-based overload instead

Global ov::op::util::VariableValue::set_value (const ngraph::HostTensorPtr &value)

This method is deprecated and will be removed in the 2024.0 release. Use the ov::Tensor-based overload instead

Global ov::op::util::VariableValue::VariableValue (const ov::Tensor &value, bool reset)

This constructor is deprecated and will be removed in the 2024.0 release. Use the ov::Tensor-based overload instead

Global ov::op::util::VariableValue::VariableValue (ngraph::HostTensorPtr value, bool reset)

This constructor is deprecated and will be removed in the 2024.0 release. Use the ov::Tensor-based overload instead

Global ov::op::util::VariableValue::VariableValue (ngraph::HostTensorPtr value)

This constructor is deprecated and will be removed in the 2024.0 release. Use the ov::Tensor-based overload instead