InferenceEngine::Metrics Namespace Reference

Metrics More...

Variables

static constexpr auto METRIC_AVAILABLE_DEVICES = "AVAILABLE_DEVICES"
 Metric to get a std::vector<std::string> of available device IDs. String value is "AVAILABLE_DEVICES".
 
static constexpr auto METRIC_SUPPORTED_METRICS = "SUPPORTED_METRICS"
 Metric to get a std::vector<std::string> of supported metrics. String value is "SUPPORTED_METRICS". More...
 
static constexpr auto METRIC_SUPPORTED_CONFIG_KEYS = "SUPPORTED_CONFIG_KEYS"
 Metric to get a std::vector<std::string> of supported config keys. String value is "SUPPORTED_CONFIG_KEYS". More...
 
static constexpr auto METRIC_FULL_DEVICE_NAME = "FULL_DEVICE_NAME"
 Metric to get a std::string value representing a full device name. String value is "FULL_DEVICE_NAME".
 
static constexpr auto METRIC_OPTIMIZATION_CAPABILITIES = "OPTIMIZATION_CAPABILITIES"
 Metric to get a std::vector<std::string> of optimization options per device. String value is "OPTIMIZATION_CAPABILITIES". More...
 
static constexpr auto FP32 = "FP32"
 
static constexpr auto BF16 = "BF16"
 
static constexpr auto FP16 = "FP16"
 
static constexpr auto INT8 = "INT8"
 
static constexpr auto BIN = "BIN"
 
static constexpr auto WINOGRAD = "WINOGRAD"
 
static constexpr auto METRIC_RANGE_FOR_STREAMS = "RANGE_FOR_STREAMS"
 Metric to provide information about a range for streams on platforms where streams are supported. More...
 
static constexpr auto METRIC_RANGE_FOR_ASYNC_INFER_REQUESTS = "RANGE_FOR_ASYNC_INFER_REQUESTS"
 Metric to provide a hint for a range for the number of async infer requests. If the device supports streams, the metric provides a range for the number of infer requests per stream. More...
 
static constexpr auto METRIC_NUMBER_OF_WAITING_INFER_REQUESTS = "NUMBER_OF_WAITING_INFER_REQUESTS"
 Metric to get an unsigned int value of the number of waiting infer requests. More...
 
static constexpr auto METRIC_NUMBER_OF_EXEC_INFER_REQUESTS = "NUMBER_OF_EXEC_INFER_REQUESTS"
 Metric to get an unsigned int value of the number of infer requests in the execution stage. More...
 
static constexpr auto METRIC_NETWORK_NAME = "NETWORK_NAME"
 Metric to get the name of a network. String value is "NETWORK_NAME".
 
static constexpr auto METRIC_DEVICE_THERMAL = "DEVICE_THERMAL"
 Metric to get a float value of the device thermal state. String value is "DEVICE_THERMAL".
 
static constexpr auto METRIC_OPTIMAL_NUMBER_OF_INFER_REQUESTS = "OPTIMAL_NUMBER_OF_INFER_REQUESTS"
 Metric to get an unsigned integer value of the optimal number of infer requests for an executable network.
 

Detailed Description

Metrics

Variable Documentation

§ METRIC_NUMBER_OF_EXEC_INFER_REQUESTS

constexpr auto InferenceEngine::Metrics::METRIC_NUMBER_OF_EXEC_INFER_REQUESTS = "NUMBER_OF_EXEC_INFER_REQUESTS"
static

Metric to get an unsigned int value of the number of infer requests in the execution stage.

String value is "NUMBER_OF_EXEC_INFER_REQUESTS". This can be used as an executable network metric as well

§ METRIC_NUMBER_OF_WAITING_INFER_REQUESTS

constexpr auto InferenceEngine::Metrics::METRIC_NUMBER_OF_WAITING_INFER_REQUESTS = "NUMBER_OF_WAITING_INFER_REQUESTS"
static

Metric to get an unsigned int value of the number of waiting infer requests.

String value is "NUMBER_OF_WAITNING_INFER_REQUESTS". This can be used as an executable network metric as well

§ METRIC_OPTIMIZATION_CAPABILITIES

constexpr auto InferenceEngine::Metrics::METRIC_OPTIMIZATION_CAPABILITIES = "OPTIMIZATION_CAPABILITIES"
static

Metric to get a std::vector<std::string> of optimization options per device. String value is "OPTIMIZATION_CAPABILITIES".

The possible values:

  • "FP32" - device can support FP32 models
  • "BF16" - device can support BF16 computations for models
  • "FP16" - device can support FP16 models
  • "INT8" - device can support models with INT8 layers
  • "BIN" - device can support models with BIN layers
  • "WINOGRAD" - device can support models where convolution implemented via Winograd transformations

§ METRIC_RANGE_FOR_ASYNC_INFER_REQUESTS

constexpr auto InferenceEngine::Metrics::METRIC_RANGE_FOR_ASYNC_INFER_REQUESTS = "RANGE_FOR_ASYNC_INFER_REQUESTS"
static

Metric to provide a hint for a range for the number of async infer requests. If the device supports streams, the metric provides a range for the number of infer requests per stream.

Metric returns a value of std::tuple<unsigned int, unsigned int, unsigned int> type, where:

  • First value is the lower bound.
  • Second value is the upper bound.
  • Third value is the step inside this range.

String value for the metric name is "RANGE_FOR_ASYNC_INFER_REQUESTS".
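
A minimal sketch of reading the returned tuple, assuming the CPU device:

```cpp
#include <inference_engine.hpp>

#include <iostream>
#include <tuple>

int main() {
    InferenceEngine::Core core;

    // The metric is reported as std::tuple<unsigned int, unsigned int, unsigned int>.
    auto range = core.GetMetric("CPU", METRIC_KEY(RANGE_FOR_ASYNC_INFER_REQUESTS))
                     .as<std::tuple<unsigned int, unsigned int, unsigned int>>();

    std::cout << "min: "   << std::get<0>(range)
              << ", max: "  << std::get<1>(range)
              << ", step: " << std::get<2>(range) << std::endl;
    return 0;
}
```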

§ METRIC_RANGE_FOR_STREAMS

constexpr auto InferenceEngine::Metrics::METRIC_RANGE_FOR_STREAMS = "RANGE_FOR_STREAMS"
static

Metric to provide information about a range for streams on platforms where streams are supported.

Metric returns a value of std::tuple<unsigned int, unsigned int> type, where:

  • First value is the lower bound.
  • Second value is the upper bound.

String value for the metric name is "RANGE_FOR_STREAMS".
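
The value is read the same way as RANGE_FOR_ASYNC_INFER_REQUESTS, only as a two-element tuple. A minimal sketch, assuming the CPU device:

```cpp
#include <inference_engine.hpp>

#include <iostream>
#include <tuple>

int main() {
    InferenceEngine::Core core;

    // The metric is reported as std::tuple<unsigned int, unsigned int>.
    auto range = core.GetMetric("CPU", METRIC_KEY(RANGE_FOR_STREAMS))
                     .as<std::tuple<unsigned int, unsigned int>>();

    std::cout << "min streams: " << std::get<0>(range)
              << ", max streams: " << std::get<1>(range) << std::endl;
    return 0;
}
```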

§ METRIC_SUPPORTED_CONFIG_KEYS

constexpr auto InferenceEngine::Metrics::METRIC_SUPPORTED_CONFIG_KEYS = "SUPPORTED_CONFIG_KEYS"
static

Metric to get a std::vector<std::string> of supported config keys. String value is "SUPPORTED_CONFIG_KEYS".

This can be used as an executable network metric as well.

Each of the returned device configuration keys can be passed to Core::SetConfig, Core::GetConfig, and Core::LoadNetwork; configuration keys for executable networks can be passed to ExecutableNetwork::SetConfig and ExecutableNetwork::GetConfig.
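
A minimal sketch of enumerating the supported configuration keys and setting one of them, assuming the CPU device; the PERF_COUNT/YES pair is used only as an example value:

```cpp
#include <inference_engine.hpp>

#include <iostream>
#include <map>
#include <string>
#include <vector>

int main() {
    InferenceEngine::Core core;

    // List every configuration key the device accepts.
    auto config_keys = core.GetMetric("CPU", METRIC_KEY(SUPPORTED_CONFIG_KEYS))
                           .as<std::vector<std::string>>();
    for (const auto& key : config_keys) {
        std::cout << key << std::endl;
    }

    // Any listed key can be passed back through Core::SetConfig. PERF_COUNT is
    // used here only as an example; a robust program would first check that it
    // is present in config_keys.
    core.SetConfig({{CONFIG_KEY(PERF_COUNT), CONFIG_VALUE(YES)}}, "CPU");
    return 0;
}
```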

§ METRIC_SUPPORTED_METRICS

constexpr auto InferenceEngine::Metrics::METRIC_SUPPORTED_METRICS = "SUPPORTED_METRICS"
static

Metric to get a std::vector<std::string> of supported metrics. String value is "SUPPORTED_METRICS".

This can be used as an executable network metric as well.

Each of the returned device metrics can be passed to Core::GetMetric; executable network metrics can be passed to ExecutableNetwork::GetMetric.
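
A minimal sketch of listing the supported metrics and then querying one of them, assuming the CPU device:

```cpp
#include <inference_engine.hpp>

#include <iostream>
#include <string>
#include <vector>

int main() {
    InferenceEngine::Core core;

    // Enumerate the metric names the device reports.
    auto metrics = core.GetMetric("CPU", METRIC_KEY(SUPPORTED_METRICS))
                       .as<std::vector<std::string>>();
    for (const auto& name : metrics) {
        std::cout << name << std::endl;
    }

    // Any listed metric can be queried in turn, e.g. the full device name.
    std::cout << core.GetMetric("CPU", METRIC_KEY(FULL_DEVICE_NAME)).as<std::string>()
              << std::endl;
    return 0;
}
```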