This is a header file with common inference engine definitions.
#include <algorithm>
#include <cstdlib>
#include <details/ie_exception.hpp>
#include <memory>
#include <ostream>
#include <string>
#include <vector>
#include "ie_unicode.hpp"
Data Structures

union InferenceEngine::UserValue
    This union holds user values to enable binding of data per graph node.
struct InferenceEngine::InferenceEngineProfileInfo
    Represents basic inference profiling information per layer.
struct InferenceEngine::ResponseDesc
    Represents detailed information about an error.
class InferenceEngine::GeneralError
    This class represents the StatusCode::GENERAL_ERROR exception.
class InferenceEngine::NotImplemented
    This class represents the StatusCode::NOT_IMPLEMENTED exception.
class InferenceEngine::NetworkNotLoaded
    This class represents the StatusCode::NETWORK_NOT_LOADED exception.
class InferenceEngine::ParameterMismatch
    This class represents the StatusCode::PARAMETER_MISMATCH exception.
class InferenceEngine::NotFound
    This class represents the StatusCode::NOT_FOUND exception.
class InferenceEngine::OutOfBounds
    This class represents the StatusCode::OUT_OF_BOUNDS exception.
class InferenceEngine::Unexpected
    This class represents the StatusCode::UNEXPECTED exception.
class InferenceEngine::RequestBusy
    This class represents the StatusCode::REQUEST_BUSY exception.
class InferenceEngine::ResultNotReady
    This class represents the StatusCode::RESULT_NOT_READY exception.
class InferenceEngine::NotAllocated
    This class represents the StatusCode::NOT_ALLOCATED exception.
class InferenceEngine::InferNotStarted
    This class represents the StatusCode::INFER_NOT_STARTED exception.
class InferenceEngine::NetworkNotRead
    This class represents the StatusCode::NETWORK_NOT_READ exception.
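Each exception class above corresponds to one negative StatusCode value. The sketch below illustrates that mapping with simplified stand-in definitions (plain std::runtime_error subclasses and a cut-down status enum) rather than the real classes, which derive from the exception machinery in details/ie_exception.hpp; throwForStatus and classify are hypothetical helpers, not part of the header:

```cpp
#include <stdexcept>
#include <string>

// Simplified stand-ins for the documented exception classes.
struct GeneralError : std::runtime_error { using std::runtime_error::runtime_error; };
struct NotFound     : std::runtime_error { using std::runtime_error::runtime_error; };
struct RequestBusy  : std::runtime_error { using std::runtime_error::runtime_error; };

// Mirrors a subset of InferenceEngine::StatusCode (see the Enumerations section).
enum StatusCode : int { OK = 0, GENERAL_ERROR = -1, NOT_FOUND = -5, REQUEST_BUSY = -8 };

// Hypothetical helper: raise the exception class matching a status code.
void throwForStatus(StatusCode sc, const std::string& msg) {
    switch (sc) {
        case OK:           return;                   // success, nothing to throw
        case NOT_FOUND:    throw NotFound(msg);
        case REQUEST_BUSY: throw RequestBusy(msg);
        default:           throw GeneralError(msg);  // catch-all error
    }
}

// Demonstrates that each exception type can be caught separately.
StatusCode classify(StatusCode sc) {
    try {
        throwForStatus(sc, "demo");
    } catch (const NotFound&)     { return NOT_FOUND; }
      catch (const RequestBusy&)  { return REQUEST_BUSY; }
      catch (const GeneralError&) { return GENERAL_ERROR; }
    return OK;
}
```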
Typedefs

using InferenceEngine::SizeVector = std::vector<size_t>
    Represents tensor size.
using InferenceEngine::CNNLayerPtr = std::shared_ptr<CNNLayer>
    A smart pointer to the CNNLayer.
using InferenceEngine::CNNLayerWeakPtr = std::weak_ptr<CNNLayer>
    A smart weak pointer to the CNNLayer.
using InferenceEngine::DataPtr = std::shared_ptr<Data>
    A smart pointer to Data.
using InferenceEngine::CDataPtr = std::shared_ptr<const Data>
    A smart pointer to constant Data.
using InferenceEngine::DataWeakPtr = std::weak_ptr<Data>
    A smart weak pointer to Data.
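Since SizeVector is just an alias for std::vector<size_t>, a tensor's total element count falls out of std::accumulate directly. A minimal sketch, with the alias re-declared locally rather than pulled from the header:

```cpp
#include <cstddef>
#include <functional>
#include <numeric>
#include <vector>

// Same alias as InferenceEngine::SizeVector.
using SizeVector = std::vector<std::size_t>;

// Total number of elements in a tensor with the given dimensions.
std::size_t elementCount(const SizeVector& dims) {
    return std::accumulate(dims.begin(), dims.end(),
                           static_cast<std::size_t>(1),
                           std::multiplies<std::size_t>());
}
```

For a batch of one 3-channel 224x224 image in NCHW order, elementCount({1, 3, 224, 224}) yields 150528.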
Enumerations

enum InferenceEngine::Layout : uint8_t { ANY = 0, NCHW = 1, NHWC = 2, NCDHW = 3, NDHWC = 4, OIHW = 64, GOIHW = 65, OIDHW = 66, GOIDHW = 67, SCALAR = 95, C = 96, CHW = 128, HW = 192, NC = 193, CN = 194, BLOCKED = 200 }
    Layouts that the inference engine supports.
enum InferenceEngine::ColorFormat : uint32_t { RAW = 0u, RGB, BGR, RGBX, BGRX, NV12, I420 }
    Extra information about input color format for preprocessing.
enum InferenceEngine::StatusCode : int { OK = 0, GENERAL_ERROR = -1, NOT_IMPLEMENTED = -2, NETWORK_NOT_LOADED = -3, PARAMETER_MISMATCH = -4, NOT_FOUND = -5, OUT_OF_BOUNDS = -6, UNEXPECTED = -7, REQUEST_BUSY = -8, RESULT_NOT_READY = -9, NOT_ALLOCATED = -10, INFER_NOT_STARTED = -11, NETWORK_NOT_READ = -12 }
    This enum contains codes for all possible return values of the interface functions.
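Note that OK is zero and every failure code is negative, so a plain `code < 0` test distinguishes errors from success. A small sketch with the enum re-declared locally (values copied from the table above; statusName is a hypothetical helper, not part of the header):

```cpp
#include <string>

// Values copied from InferenceEngine::StatusCode above.
enum StatusCode : int {
    OK = 0, GENERAL_ERROR = -1, NOT_IMPLEMENTED = -2, NETWORK_NOT_LOADED = -3,
    PARAMETER_MISMATCH = -4, NOT_FOUND = -5, OUT_OF_BOUNDS = -6, UNEXPECTED = -7,
    REQUEST_BUSY = -8, RESULT_NOT_READY = -9, NOT_ALLOCATED = -10,
    INFER_NOT_STARTED = -11, NETWORK_NOT_READ = -12
};

// Any non-zero (negative) code is a failure.
bool isError(StatusCode sc) { return sc < 0; }

// Human-readable name for a subset of codes (hypothetical helper).
std::string statusName(StatusCode sc) {
    switch (sc) {
        case OK:               return "OK";
        case NOT_FOUND:        return "NOT_FOUND";
        case REQUEST_BUSY:     return "REQUEST_BUSY";
        case NETWORK_NOT_READ: return "NETWORK_NOT_READ";
        default:               return "ERROR";
    }
}
```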
Functions

std::ostream & InferenceEngine::operator<< (std::ostream &out, const Layout &p)
std::ostream & InferenceEngine::operator<< (std::ostream &out, const ColorFormat &fmt)
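These stream operators print a readable name for a Layout or ColorFormat value. A minimal sketch of the idea, with a cut-down local Layout enum standing in for the real one:

```cpp
#include <cstdint>
#include <ostream>
#include <sstream>
#include <string>

// Cut-down stand-in for InferenceEngine::Layout (values from the table above).
enum Layout : std::uint8_t { ANY = 0, NCHW = 1, NHWC = 2 };

// Prints the enumerator name, falling back to the numeric value.
std::ostream& operator<<(std::ostream& out, const Layout& p) {
    switch (p) {
        case ANY:  return out << "ANY";
        case NCHW: return out << "NCHW";
        case NHWC: return out << "NHWC";
        default:   return out << static_cast<int>(p);
    }
}

// Convenience wrapper used below.
std::string layoutToString(Layout p) {
    std::ostringstream oss;
    oss << p;  // invokes the operator<< above
    return oss.str();
}
```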
Detailed Description

This is a header file with common inference engine definitions.
using InferenceEngine::SizeVector = std::vector<size_t>
Represents tensor size.
The order is opposite to the order in Caffe*: (w,h,n,b) where the most frequently changing element in memory is first.
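In other words, the last dimension in a SizeVector is the innermost (fastest-varying) one in memory. Row-major strides make this concrete (a sketch with the alias re-declared locally):

```cpp
#include <cstddef>
#include <vector>

// Same alias as InferenceEngine::SizeVector.
using SizeVector = std::vector<std::size_t>;

// Row-major strides: the last dimension has stride 1 (it varies fastest).
SizeVector rowMajorStrides(const SizeVector& dims) {
    SizeVector strides(dims.size(), 1);
    for (std::size_t i = dims.size(); i-- > 1; )
        strides[i - 1] = strides[i] * dims[i];
    return strides;
}
```

For NCHW dims {1, 3, 224, 224} this gives strides {150528, 50176, 224, 1}: neighbouring elements along W are adjacent in memory, which is exactly the Caffe-style (w, h, n, b) order reversed.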
enum InferenceEngine::ColorFormat : uint32_t
Extra information about input color format for preprocessing.
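For instance, the interleaved formats differ in channels per pixel, while NV12 and I420 are planar YUV formats and RAW is passed through unconverted. A hypothetical helper with the enum re-declared locally (values copied from the table above):

```cpp
#include <cstdint>

// Values copied from InferenceEngine::ColorFormat above.
enum ColorFormat : std::uint32_t { RAW = 0u, RGB, BGR, RGBX, BGRX, NV12, I420 };

// Channels per pixel for the interleaved formats; planar YUV formats
// (NV12, I420) and RAW have no fixed interleaved channel count, so 0 here.
int channelCount(ColorFormat fmt) {
    switch (fmt) {
        case RGB: case BGR:   return 3;  // three interleaved channels
        case RGBX: case BGRX: return 4;  // three channels plus a padding byte
        default:              return 0;  // RAW or planar YUV
    }
}
```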
enum InferenceEngine::Layout : uint8_t
Layouts that the inference engine supports.
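The layout name encodes the tensor rank it describes (NCDHW → 5, NCHW → 4, CHW → 3, NC → 2, C → 1, SCALAR → 0). A hypothetical helper sketching that mapping with a cut-down local enum (values from the table above; expectedRank is not part of the header):

```cpp
#include <cstdint>

// Cut-down stand-in for InferenceEngine::Layout (values from the table above).
enum Layout : std::uint8_t {
    NCHW = 1, NHWC = 2, NCDHW = 3, NDHWC = 4,
    SCALAR = 95, C = 96, CHW = 128, HW = 192, NC = 193, CN = 194
};

// Tensor rank implied by the layout name (hypothetical helper).
int expectedRank(Layout l) {
    switch (l) {
        case NCDHW: case NDHWC:    return 5;  // 5-D activations
        case NCHW:  case NHWC:     return 4;  // 4-D activations
        case CHW:                  return 3;
        case HW: case NC: case CN: return 2;
        case C:                    return 1;
        case SCALAR:               return 0;  // zero-dimensional tensor
    }
    return -1;  // unknown layout
}
```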