Model Creation C++ Sample
This sample demonstrates how to run synchronous inference on a model built on the fly from the weights of a LeNet classification model, which is known to work well on digit classification tasks.
You do not need an XML file to create the model: the ov::Model API allows you to build it on the fly from source code.
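For illustration, here is a minimal sketch of building a small model purely in code with ov::Model and opset8 operations. The topology, shapes, and zero-initialized weights below are placeholders, not the LeNet graph that the sample itself constructs from lenet.bin:

```cpp
#include <memory>
#include <vector>

#include <openvino/openvino.hpp>
#include <openvino/opsets/opset8.hpp>

// Build a tiny fully connected classifier entirely from source code (no XML file).
// The shapes and zero-initialized weights are illustrative only.
std::shared_ptr<ov::Model> build_toy_model() {
    auto input = std::make_shared<ov::opset8::Parameter>(ov::element::f32, ov::Shape{1, 784});

    auto weights = ov::opset8::Constant::create(ov::element::f32,
                                                ov::Shape{784, 10},
                                                std::vector<float>(784 * 10, 0.0f));
    auto matmul  = std::make_shared<ov::opset8::MatMul>(input, weights);
    auto softmax = std::make_shared<ov::opset8::Softmax>(matmul, 1);
    auto result  = std::make_shared<ov::opset8::Result>(softmax);

    return std::make_shared<ov::Model>(ov::ResultVector{result},
                                       ov::ParameterVector{input},
                                       "toy_model");
}
```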
The following C++ API is used in the application:
Feature | API | Description
---|---|---
OpenVINO Runtime Info | ov::Core::get_versions | Get device plugin versions
Shape Operations | ov::Shape, ov::shape_size | Operate with shape
Tensor Operations | ov::Tensor::get_byte_size, ov::Tensor::data | Get tensor byte size and its data
Model Operations | ov::set_batch | Operate with model batch size
Infer Request Operations | ov::InferRequest::get_input_tensor | Get an input tensor
Model creation objects | ov::Model, ov::ParameterVector, opset operations such as ov::opset8::Parameter, ov::opset8::Constant, ov::opset8::Convolution, ov::opset8::MatMul, ov::opset8::Relu, ov::opset8::Softmax, ov::opset8::Result | Used to construct an OpenVINO model
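As a quick orientation, the hypothetical snippet below strings a few of these calls together; the device name, and the assumption that the model input is f32 with a batch dimension in its layout, are placeholders rather than part of the sample:

```cpp
#include <iostream>
#include <memory>

#include <openvino/openvino.hpp>

// Hypothetical usage of several of the APIs listed above.
void show_api_usage(const std::shared_ptr<ov::Model>& model) {
    ov::Core core;

    // OpenVINO Runtime Info: print the versions of the plugins for a device.
    for (const auto& item : core.get_versions("CPU"))
        std::cout << item.first << " build: " << item.second.buildNumber << std::endl;

    // Model Operations: change the model batch size
    // (assumes the input layout defines a batch dimension).
    ov::set_batch(model, 10);

    ov::CompiledModel compiled = core.compile_model(model, "CPU");
    ov::InferRequest request = compiled.create_infer_request();

    // Infer Request and Tensor Operations: get the input tensor,
    // query its byte size, and access its data.
    ov::Tensor input = request.get_input_tensor();
    std::cout << "input byte size: " << input.get_byte_size() << std::endl;
    float* data = input.data<float>();  // assumes an f32 input element type
    (void)data;
}
```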
The basic OpenVINO™ Runtime API is covered by the Hello Classification C++ sample.
Options | Values
---|---
Validated Models | LeNet
Model Format | Model weights file (*.bin)
Validated images | Single-channel images
Supported devices | All
Other language realization | Python
How It Works
At startup, the sample application does the following:
Reads command-line parameters
Builds a model from the weights file passed as a command-line argument
Loads the model and input data to the OpenVINO™ Runtime plugin
Performs synchronous inference and processes output data, logging each step in a standard output stream
You can find a detailed description of each sample step in the Integration Steps section of the “Integrate OpenVINO™ Runtime with Your Application” guide.
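A condensed sketch of that flow is shown below, assuming a hypothetical create_model() helper that stands in for the sample's own model-building code (the stub here ignores the weights file to stay short):

```cpp
#include <iostream>
#include <memory>
#include <string>

#include <openvino/openvino.hpp>
#include <openvino/opsets/opset8.hpp>

// Hypothetical stand-in for the sample's model-building code: the real sample
// reconstructs the LeNet topology and attaches weights read from the .bin file.
std::shared_ptr<ov::Model> create_model(const std::string& /*weights_path*/) {
    auto input = std::make_shared<ov::opset8::Parameter>(ov::element::f32, ov::Shape{64, 1, 28, 28});
    auto softmax = std::make_shared<ov::opset8::Softmax>(input, 1);
    return std::make_shared<ov::Model>(ov::OutputVector{softmax}, ov::ParameterVector{input}, "lenet_stub");
}

int main(int argc, char* argv[]) {
    if (argc != 3) {
        std::cerr << "Usage: " << argv[0] << " <path_to_lenet_weights> <device>" << std::endl;
        return 1;
    }

    // 1. Read command-line parameters and build the model from the weights file.
    std::shared_ptr<ov::Model> model = create_model(argv[1]);

    // 2. Load the model to the OpenVINO Runtime plugin for the selected device.
    ov::Core core;
    ov::CompiledModel compiled = core.compile_model(model, argv[2]);

    // 3. Fill the input tensor and run synchronous inference.
    ov::InferRequest request = compiled.create_infer_request();
    ov::Tensor input = request.get_input_tensor();
    // ... copy image data into input.data<float>() here ...
    request.infer();

    // 4. Process the output tensor, e.g. pick the top class per image.
    ov::Tensor output = request.get_output_tensor();
    std::cout << "output byte size: " << output.get_byte_size() << std::endl;
    return 0;
}
```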
Building
To build the sample, use the instructions available in the Build the Sample Applications section of the OpenVINO™ Toolkit Samples guide.
Running
model_creation_sample <path_to_lenet_weights> <device>
NOTES:
You can use the LeNet model weights in the sample folder: lenet.bin with an FP32 weights file.
The lenet.bin file with FP32 weights was generated by the Model Optimizer tool from the public LeNet model with the --input_shape [64,1,28,28] parameter specified.
The original model is available in the Caffe* repository on GitHub*.
You can perform inference on an image using the pre-trained model on a GPU with the following command:
model_creation_sample lenet.bin GPU
Sample Output
The sample application logs each step in a standard output stream and outputs the top-1 inference result for each of the 10 images in the batch.
[ INFO ] OpenVINO Runtime version ......... <version>
[ INFO ] Build ........... <build>
[ INFO ]
[ INFO ] Device info:
[ INFO ] GPU
[ INFO ] Intel GPU plugin version ......... <version>
[ INFO ] Build ........... <build>
[ INFO ]
[ INFO ]
[ INFO ] Create model from weights: lenet.bin
[ INFO ] model name: lenet
[ INFO ] inputs
[ INFO ] input name: NONE
[ INFO ] input type: f32
[ INFO ] input shape: {64, 1, 28, 28}
[ INFO ] outputs
[ INFO ] output name: output_tensor
[ INFO ] output type: f32
[ INFO ] output shape: {64, 10}
[ INFO ] Batch size is 10
[ INFO ] model name: lenet
[ INFO ] inputs
[ INFO ] input name: NONE
[ INFO ] input type: u8
[ INFO ] input shape: {10, 28, 28, 1}
[ INFO ] outputs
[ INFO ] output name: output_tensor
[ INFO ] output type: f32
[ INFO ] output shape: {10, 10}
[ INFO ] Compiling a model for the GPU device
[ INFO ] Create infer request
[ INFO ] Combine images in batch and set to input tensor
[ INFO ] Start sync inference
[ INFO ] Processing output tensor
Top 1 results:
Image 0
classid probability label
------- ----------- -----
0 1.0000000 0
Image 1
classid probability label
------- ----------- -----
1 1.0000000 1
Image 2
classid probability label
------- ----------- -----
2 1.0000000 2
Image 3
classid probability label
------- ----------- -----
3 1.0000000 3
Image 4
classid probability label
------- ----------- -----
4 1.0000000 4
Image 5
classid probability label
------- ----------- -----
5 1.0000000 5
Image 6
classid probability label
------- ----------- -----
6 1.0000000 6
Image 7
classid probability label
------- ----------- -----
7 1.0000000 7
Image 8
classid probability label
------- ----------- -----
8 1.0000000 8
Image 9
classid probability label
------- ----------- -----
9 1.0000000 9
Deprecation Notice
Deprecation Begins | June 1, 2020
---|---
Removal Date | December 1, 2020