API Reference Guide¶
OpenVINO Model Server exposes two sets of network APIs for inference: one compatible with TensorFlow Serving and one compatible with KServe. Both APIs are available over gRPC and REST interfaces. Supporting both sets of APIs makes OpenVINO Model Server easier to plug into existing systems that already leverage one of them for inference. Learn more about the supported APIs:
If you already use one of these APIs, integrating OpenVINO Model Server should be smooth and transparent.
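To illustrate the difference between the two REST protocols, the following sketch builds the JSON request bodies each one expects. The model name, input tensor name, shape, and values are placeholder assumptions for illustration; consult the API documents linked above for the authoritative formats.

```python
import json

# Hypothetical model name used only for this example.
MODEL_NAME = "my_model"

# TensorFlow Serving REST API:
#   POST http://<host>:<port>/v1/models/my_model:predict
# The body wraps input rows in an "instances" list.
tfs_payload = {"instances": [[1.0, 2.0, 3.0, 4.0]]}

# KServe REST API (v2 inference protocol):
#   POST http://<host>:<port>/v2/models/my_model/infer
# Each input carries an explicit name, shape, and datatype.
kserve_payload = {
    "inputs": [
        {
            "name": "input_0",        # assumed input tensor name
            "shape": [1, 4],
            "datatype": "FP32",
            "data": [1.0, 2.0, 3.0, 4.0],
        }
    ]
}

print(json.dumps(tfs_payload))
print(json.dumps(kserve_payload))
```

The same logical request is expressed twice: TensorFlow Serving infers tensor metadata from the model signature, while KServe requires it inline, which is why the v2 payload is more verbose but also more self-describing.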
Additionally, OVMS provides a preview of in-process inference through its C API: