Converting an ONNX GPT-2 Model¶
Danger
The code described here is deprecated! It will be kept for some time to ensure backwards compatibility, but you should not use it in contemporary applications, as that would tie you to a legacy solution.
This guide describes a deprecated conversion method. The guide on the new and recommended method can be found in the Python tutorials.
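For reference, below is a minimal sketch of what the newer Python conversion flow looks like, assuming OpenVINO 2023.1 or later, where openvino.convert_model and openvino.save_model are available; refer to the Python tutorials for the authoritative guide.

import openvino as ov

# Convert the ONNX model directly in Python (replaces the mo CLI flow).
ov_model = ov.convert_model("gpt2-10.onnx")

# Serialize the converted model to OpenVINO IR (an .xml/.bin pair).
ov.save_model(ov_model, "gpt2-10.xml")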
The public pre-trained GPT-2 model is a large transformer-based language model with a simple objective: predict the next word, given all of the previous words within some text.
Downloading the Pre-Trained Base GPT-2 Model¶
To download the model and sample test data, go to this model, and press Download.
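If you prefer to script the download, a minimal sketch follows; the ONNX Model Zoo URL below is an assumption based on the zoo's historical layout and may need adjusting if the repository has been restructured.

import urllib.request

# Assumed ONNX Model Zoo location of the opset-10 GPT-2 model.
MODEL_URL = ("https://github.com/onnx/models/raw/main/"
             "text/machine_comprehension/gpt-2/model/gpt2-10.onnx")

urllib.request.urlretrieve(MODEL_URL, "gpt2-10.onnx")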
Converting an ONNX GPT-2 Model to IR¶
Generate the Intermediate Representation of the GPT-2 model by running model conversion with the following parameters:
mo --input_model gpt2-10.onnx --input_shape [X,Y,Z] --output_dir <OUTPUT_MODEL_DIR>
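After conversion, you can sanity-check the generated IR with the OpenVINO Runtime Python API. The sketch below assumes the IR files gpt2-10.xml and gpt2-10.bin were written to your output directory and that a CPU device is available; the file names are illustrative.

from openvino.runtime import Core

core = Core()

# Read the IR produced by model conversion and compile it for CPU.
model = core.read_model("<OUTPUT_MODEL_DIR>/gpt2-10.xml")
compiled = core.compile_model(model, "CPU")

# List the input and output ports to confirm the expected shapes
# (static here, because --input_shape was set during conversion).
for port in compiled.inputs:
    print("input:", port.any_name, port.shape)
for port in compiled.outputs:
    print("output:", port.any_name, port.shape)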