NOTE: Intel® Arria® 10 FPGA (Mustang-F100-A10) Speed Grade 1 is not available in the OpenVINO 2020.3 package.
Install to the C:\intelFPGA\18.1 directory, and add the <install location>\bin path to your system PATH variable.

Copy Intel_vision_accel_win_driver_1.2_SG2.zip from C:\Program Files (x86)\IntelSWTools\openvino\a10_vision_design_sg2_bitstreams\BSP to C:\intelFPGA\19.2\aclrte-windows64\board.

To verify the setup, run:

aocl diagnose

The diagnostic should report: DIAGNOSTIC_PASSED
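Taken together, the copy-and-verify steps above might look like the following from a command prompt (a sketch; the paths are the ones given in this guide and assume default install locations):

```shell
:: Copy the SG2 driver package to the RTE board directory (paths from the steps above)
copy "C:\Program Files (x86)\IntelSWTools\openvino\a10_vision_design_sg2_bitstreams\BSP\Intel_vision_accel_win_driver_1.2_SG2.zip" "C:\intelFPGA\19.2\aclrte-windows64\board"

:: Verify that the board environment is healthy
aocl diagnose
```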
The bitstream you program should correspond to the topology you want to deploy. In this section, you program a SqueezeNet bitstream and deploy the classification sample with the SqueezeNet model that you converted with the Model Optimizer in the previous steps.
IMPORTANT: Use only bitstreams from the installed version of the Intel® Distribution of OpenVINO™ toolkit. Bitstreams from older versions are incompatible with later versions of the toolkit. For example, you cannot use the 2019R4_PL2_FP11_AlexNet_GoogleNet_Generic bitstream when the Intel® Distribution of OpenVINO™ toolkit supports the 2020-3_PL2_FP11_AlexNet_GoogleNet_Generic bitstream.
Depending on how many bitstreams you selected, the Intel® Distribution of OpenVINO™ toolkit package downloads a separate folder for each FPGA card type. For the Intel® Vision Accelerator Design with an Intel® Arria® 10 FPGA (SG2), the bitstreams are in:

C:\Program Files (x86)\IntelSWTools\openvino\a10_vision_design_sg2_bitstreams
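Once you have picked a bitstream from that folder, you program it with the aocl utility. A sketch, where the .aocx filename is illustrative and acl0 is the default device name:

```shell
:: Program the FPGA with the chosen bitstream (the filename here is illustrative)
aocl program acl0 "C:\Program Files (x86)\IntelSWTools\openvino\a10_vision_design_sg2_bitstreams\<bitstream_name>.aocx"
```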
This example uses a SqueezeNet bitstream with low precision for the classification sample.

NOTE: The SqueezeNet Caffe* model was already downloaded and converted to an FP16 IR when you ran the Image Classification verification script while installing the Intel® Distribution of OpenVINO™ toolkit for Windows* with FPGA support. Read this section only if you want to convert the model manually; otherwise, skip to the next section to run the Image Classification sample application.
In this section, you will prepare a sample FP16 model suitable for hardware accelerators. For more information, see the FPGA plugin section in the Inference Engine Developer Guide.
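If you do convert the model manually, the Model Optimizer invocation might look like the following sketch. It assumes the SqueezeNet Caffe* model file is in the current directory, and uses the --data_type FP16 flag to produce an FP16 IR suitable for the FPGA:

```shell
:: Convert the SqueezeNet Caffe* model to an FP16 IR (paths are illustrative)
python "C:\Program Files (x86)\IntelSWTools\openvino\deployment_tools\model_optimizer\mo.py" --input_model squeezenet1.1.caffemodel --data_type FP16 --output_dir %HOMEPATH%\squeezenet1.1_FP16
```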
%HOMEPATH%\squeezenet1.1_FP16
The squeezenet1.1.labels file contains the classes that ImageNet uses. This file is included so that the inference results show text instead of classification numbers. Copy squeezenet1.1.labels to your optimized model location (%HOMEPATH%\squeezenet1.1_FP16).

In this section, you will run the Image Classification sample application with the Caffe* SqueezeNet1.1 model on your Intel® Vision Accelerator Design with an Intel® Arria® 10 FPGA.
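Assuming the build and model paths described in this guide, the invocation might look like the following sketch. The -d HETERO:FPGA,CPU device asks the Inference Engine to run supported layers on the FPGA and fall back to the CPU for the rest; the input image name is illustrative:

```shell
:: Run the classification sample on the FPGA with CPU fallback (image name is illustrative)
cd %HOMEPATH%\Documents\Intel\OpenVINO\inference_engine_samples_build\intel64\Release
classification_sample_async -i car.png -m %HOMEPATH%\Documents\Intel\OpenVINO\openvino_models\ir\public\squeezenet1.1\FP16\squeezenet1.1.xml -d HETERO:FPGA,CPU
```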
The Image Classification sample application binary was automatically built, and the FP16 model IR files were created, when you ran the Image Classification verification script while installing the Intel® Distribution of OpenVINO™ toolkit for Windows* with FPGA support:

The sample binary is in the %HOMEPATH%\Documents\Intel\OpenVINO\inference_engine_samples_build\intel64\Release folder.
The model IR files are in the %HOMEPATH%\Documents\Intel\OpenVINO\openvino_models\ir\public\squeezenet1.1\FP16 folder.

Use the -d option to target the FPGA.

Congratulations, you are done with the Intel® Distribution of OpenVINO™ toolkit installation for FPGA. To learn more about how the Intel® Distribution of OpenVINO™ toolkit works, try the other resources that are provided below.
Intel® Distribution of OpenVINO™ toolkit home page: https://software.intel.com/en-us/openvino-toolkit
Intel® Distribution of OpenVINO™ toolkit documentation: https://docs.openvinotoolkit.org/
Inference Engine FPGA plugin documentation: https://docs.openvinotoolkit.org/latest/_docs_IE_DG_supported_plugins_FPGA.html