# Write a Client Application
OpenVINO™ Model Server supports multiple APIs, so systems already using one of them can integrate easily for inference. The APIs are:

- an API compatible with TensorFlow Serving,
- the KServe API for inference,
- the OpenAI API for text generation.
Both the TFS and KServe APIs are available over gRPC and REST interfaces.
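As a quick illustration of the REST interface, the sketch below builds a KServe v2 inference request body in Python. The model name (`my_model`), input tensor name (`input`), and endpoint path shown in the comment are assumptions for illustration; substitute the values your server actually exposes.

```python
import json

# Build a KServe REST v2 inference request body.
# The input name, shape, and datatype must match what the served
# model expects -- the values below are placeholders.
def build_infer_request(input_name, shape, data):
    return {
        "inputs": [
            {
                "name": input_name,
                "shape": shape,
                "datatype": "FP32",
                "data": data,
            }
        ]
    }

payload = build_infer_request("input", [1, 4], [0.1, 0.2, 0.3, 0.4])
body = json.dumps(payload)
print(body)
# A real client would POST this body to
# http://<host>:<port>/v2/models/my_model/infer
```

The same request can also be sent over gRPC using the KServe gRPC protobuf definitions; the REST form is shown here only because it needs no generated client stubs.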
The OpenAI API chat/completions endpoint supports REST calls with and without streamed responses.
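To show the shape of such a call, the sketch below assembles a request body for an OpenAI-compatible chat/completions endpoint. The model name (`llama`) is a placeholder, and the endpoint path in the comment is an assumption; setting `"stream"` to `True` requests a streamed (server-sent events) response instead of a single JSON reply.

```python
import json

# Request body for an OpenAI-compatible chat/completions endpoint.
# "model" must name a model the server actually serves; "stream"
# toggles between a streamed and a single-shot response.
payload = {
    "model": "llama",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,
}
body = json.dumps(payload)
print(body)
# A real client would POST this body to the server's
# chat/completions endpoint with Content-Type: application/json.
```

Because the request format follows the OpenAI API, existing OpenAI client libraries can also be pointed at the server by overriding their base URL.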
Check the following articles to learn more about the supported APIs:
In this section you can find short code samples to interact with OpenVINO Model Server endpoints via: