API Reference Guide¶
OpenVINO Model Server exposes two sets of network APIs for inference: one compatible with TensorFlow Serving and another compatible with the KServe API. Both APIs work over gRPC and REST interfaces. Supporting two sets of APIs makes OpenVINO Model Server easier to plug into existing systems that already leverage one of those APIs for inference. Learn more about supported APIs:
If you already use one of these APIs, integration of OpenVINO Model Server should be smooth and transparent.
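As a quick illustration, the sketch below builds a KServe v2 REST inference request body such as a client would send to the server's `/v2/models/{model_name}/infer` endpoint. The model name `my_model`, input name `input0`, and endpoint address are hypothetical placeholders; adjust them to match your deployment.

```python
import json

def build_infer_request(input_name, data, shape, datatype="FP32"):
    """Build a KServe v2 inference request payload for one input tensor."""
    return {
        "inputs": [
            {
                "name": input_name,       # must match the model's input name
                "shape": shape,           # tensor shape, e.g. [batch, features]
                "datatype": datatype,     # KServe datatype string, e.g. FP32
                "data": data,             # flattened tensor values
            }
        ]
    }

payload = build_infer_request("input0", [0.1, 0.2, 0.3, 0.4], [1, 4])
body = json.dumps(payload)
# Assuming OVMS serves REST on localhost:8000, the request would be:
#   POST http://localhost:8000/v2/models/my_model/infer
print(body)
```

The TensorFlow Serving REST API uses a different JSON schema (`{"instances": [...]}` posted to `/v1/models/{model_name}:predict`), so the payload shape depends on which of the two APIs your existing clients speak.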
Additionally, OVMS provides in-process inference through its C API: