View Inference Results

Inference Results

Once an initial inference has been run with a model, sample dataset, and target, you can view performance results on the project dashboard.

Figure: Model performance summary on the project dashboard

The components described below provide a visual representation of model performance on a selected dataset and help you find potential bottlenecks and areas for improvement:

Model Analyzer

The Model Analyzer generates estimated performance information for a neural network. The tool analyzes the following characteristics:

Characteristic | Unit of Measurement | Explanation
--- | --- | ---
Computational Complexity | GFLOPs | The number of floating-point operations required to infer a model.
Number of Parameters | Millions | The total number of weights in a model.
Minimum Memory Consumption, Maximum Memory Consumption | Millions of units | A unit depends on the precision of the model weights. For example, for an FP32 model, multiply these values by 4 bytes.

Model analysis data is collected when the model is imported. All parameters depend on the batch size. Currently, the information is gathered for the default model batch.
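
As a quick illustration of the memory figures above, the sketch below converts the "millions of units" reported by the Model Analyzer into megabytes. This is not part of the tool itself: the `units_to_mb` helper, the precision-to-byte mapping beyond the FP32 case stated above, and the sample value of 25.5 million units are assumptions made for this example.

```python
# Minimal sketch: convert Model Analyzer memory/parameter figures,
# reported in millions of units, into megabytes.
# Assumed bytes per weight: FP32 = 4 (from the table above), FP16 = 2, INT8 = 1.
BYTES_PER_UNIT = {"FP32": 4, "FP16": 2, "INT8": 1}

def units_to_mb(millions_of_units: float, precision: str = "FP32") -> float:
    """Convert 'millions of units' into megabytes for a given weight precision."""
    return millions_of_units * 1e6 * BYTES_PER_UNIT[precision] / 2**20

# Example with a hypothetical model reporting 25.5 million units in FP32
print(f"{units_to_mb(25.5, 'FP32'):.1f} MB")  # ~97.3 MB
```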

To view model analysis data, click the plus button next to a model name on the Configurations page.

Figure: Expand button next to the model name on the Configurations page

A table with the model characteristics appears:

Figure: Model Analyzer characteristics table