Tutorials that explain how to optimize and quantize models with OpenVINO tools.
Quantize a BiT image classification OpenVINO IR model with NNCF.
Convert TensorFlow Hub models to OpenVINO Intermediate Representation (IR).
Semantic segmentation with LRASPP MobileNet v3 and OpenVINO.
Classification with ConvNeXt and OpenVINO.
Hugging Face Model Hub with OpenVINO™.
Convert Detectron2 Models to OpenVINO™.
Convert and Optimize YOLOv8 with OpenVINO™.
Quantize Speech Recognition Models with accuracy control using NNCF PTQ API.
Learn about OpenVINO™ model conversion API.
Learn about model conversion in OpenVINO™.
Convert TensorFlow Object Detection models to OpenVINO IR.
Convert TensorFlow Lite models to OpenVINO IR.
Improve the performance of the image preprocessing step.
Improve the performance of sparse Transformer models.
Use asynchronous execution to improve data pipelining.
Quantize a MobileNet image classification model.
Use Neural Network Compression Framework (NNCF) to quantize PyTorch model in post-training mode (without model fine-tuning).
Quantize a kidney segmentation model and show live inference.
Run live inference with a kidney segmentation model and benchmark CT-scan data with OpenVINO.
Performance tricks for throughput mode in OpenVINO™.
Performance tricks for latency mode in OpenVINO™.
Working with GPUs in OpenVINO™.
Optimize and quantize a pre-trained Data2Vec speech model.
Optimize and quantize a pre-trained Wav2Vec2 speech model.
Use the Automatic Device Selection (AUTO) plugin.
Optimize and quantize a pre-trained BERT model.
Download, convert and benchmark models from Open Model Zoo.
Convert PaddlePaddle models to OpenVINO IR.
Convert PyTorch models to OpenVINO IR.
Convert TensorFlow models to OpenVINO IR.