Write a Client Application

OpenVINO™ Model Server exposes three sets of APIs: one compatible with TensorFlow Serving, one compatible with the KServe API for inference, and an OpenAI-compatible API for text generation. Both the TFS and KServe APIs work over gRPC and REST interfaces. The OpenAI API chat/completions endpoint supports REST calls with and without streamed responses. Supporting multiple APIs makes OpenVINO Model Server easier to plug into existing systems that already leverage one of these APIs for inference. Learn more about supported APIs:
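As a minimal sketch of what a KServe-style REST inference call looks like, the snippet below builds a KServe v2 JSON request body. The model name, tensor name, shape, and data are placeholder assumptions; substitute the values for your deployed model.

```python
import json

# Hypothetical model and input tensor; adjust to your deployment.
MODEL_NAME = "my_model"      # assumed model name
INPUT_NAME = "input"         # assumed input tensor name

# KServe v2 REST inference request body
payload = {
    "inputs": [
        {
            "name": INPUT_NAME,
            "shape": [1, 3],             # assumed input shape
            "datatype": "FP32",
            "data": [0.1, 0.2, 0.3],     # example input values
        }
    ]
}
body = json.dumps(payload)

# This body would be POSTed to the KServe v2 inference endpoint, e.g.:
#   http://<host>:<port>/v2/models/my_model/infer
print(body)
```

The same request can be sent with any HTTP client; the gRPC interface accepts an equivalent message defined by the KServe protocol.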

This section provides short code samples for interacting with OpenVINO Model Server endpoints via: