“Hot Topic” How-To Links
Blogs & Articles
Streamline your Intel® Distribution of OpenVINO™ Toolkit development with Deep Learning Workbench
Enhanced Low-Precision Pipeline to Accelerate Inference with OpenVINO Toolkit
Improving DL Performance Using Binary Convolution Support in OpenVINO Toolkit
Automatic Multi-Device Inference with the Intel® Distribution of OpenVINO™ toolkit
Introducing int8 quantization for fast CPU inference using OpenVINO
Accelerate Vision-based AI with Intel® Distribution of OpenVINO™ Toolkit
Custom Operations Guide
To learn what custom operations are and how to work with them in the Deep Learning Deployment Toolkit, see the Custom Operations Guide.
Introducing OpenVINO™ and Computer Vision | IoT Developer Show Season 2 | Intel Software
OpenVINO™ Toolkit and Two Hardware Development Kits | IoT Developer Show Season 2 | Intel Software
Intel Demonstration of High Performance Vision Deployment - The OpenVINO Toolkit in Action
Computer Vision at the Edge with OpenVINO by Krishnakumar Shetti at ODSC_India
.. raw:: html

   <iframe allowfullscreen mozallowfullscreen msallowfullscreen oallowfullscreen webkitallowfullscreen width="560" height="315" src="https://www.youtube.com/embed/RfRCrq35LXg" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture"></iframe>