View Inference Results

Inference Results

Once an initial inference has been run with a model, a sample dataset, and a target, you can view the performance results on the project Dashboard.

model_performance_summary.png

All of these components provide a visual representation of the model performance on the selected dataset and help you find potential bottlenecks and areas for improvement.

Model Analyzer

The Model Analyzer generates estimated performance information for neural networks. The tool analyzes the following characteristics:

| Characteristic | Explanation |
|---|---|
| Computational Complexity | Measured in GFLOPs. Represents the number of floating-point operations required to infer the model. |
| Number of Parameters | Measured in millions. Represents the total number of weights in the model. |
| Minimum Memory Consumption, Maximum Memory Consumption | Measured in millions of units, where the size of a unit depends on the precision of the model weights. For example, for an FP32 model, multiply these values by 4 bytes to get the size in bytes (see the sketch after the table). |
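To make the unit conversion concrete, here is a minimal Python sketch. It is not part of the tool; the precision-to-byte mapping and the example figure of 25 million units are illustrative assumptions.

```python
# Illustrative mapping from weight precision to bytes per unit.
BYTES_PER_UNIT = {
    "FP32": 4,  # single-precision float
    "FP16": 2,  # half-precision float
    "INT8": 1,  # 8-bit integer
}

def memory_in_megabytes(millions_of_units: float, precision: str) -> float:
    """Convert a reported memory figure (millions of units) to megabytes."""
    bytes_total = millions_of_units * 1e6 * BYTES_PER_UNIT[precision]
    return bytes_total / 2**20

# Hypothetical example: a reported value of 25.0 (millions of units) for an FP32 model.
print(f"{memory_in_megabytes(25.0, 'FP32'):.1f} MB")  # ~95.4 MB
```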

Model analysis data is collected when the model is imported. All parameters depend on the batch size; currently, the data is collected for the default batch size of the model.
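As an illustration of the batch dependency, the sketch below assumes that the computational complexity figure scales linearly with the number of samples in a batch; the function name and figures are hypothetical, not part of the tool.

```python
def estimate_gflops(gflops_default_batch: float, default_batch: int, target_batch: int) -> float:
    """Scale a GFLOPs figure reported for the default batch size to another
    batch size, assuming complexity grows linearly with the number of samples."""
    return gflops_default_batch * target_batch / default_batch

# Hypothetical example: 4.1 GFLOPs reported at batch 1, estimated for batch 8.
print(estimate_gflops(4.1, 1, 8))  # 32.8
```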

To view the model analysis data, click the plus button next to the model name on the Projects page.

model_analyzer_01.png

A table with the characteristics of the model appears:

model_analyzer_02.png