The public pre-trained GPT-2 model is a large transformer-based language model with a simple objective: predict the next word, given all of the previous words within some text.
To download the model only, click Download on https://github.com/onnx/models/blob/master/text/machine_comprehension/gpt-2/model/gpt2-10.onnx.
To download the model and sample test data, click Download on https://github.com/onnx/models/blob/master/text/machine_comprehension/gpt-2/model/gpt2-10.tar.gz.
To generate the Intermediate Representation (IR) of the GPT-2 model, run the Model Optimizer with the following parameters:
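A minimal sketch of such an invocation is shown below. The input shape values and the output directory placeholder are assumptions, not values from this document; adjust the shape to match your batch size and sequence length, and the paths to your environment:

```shell
# Convert the downloaded ONNX model to OpenVINO IR.
# --input_shape [1,1,8] is an assumed example (batch 1, sequence length 8);
# replace it with the shape your application actually feeds the model.
mo --input_model gpt2-10.onnx \
   --input_shape [1,1,8] \
   --output_dir <OUTPUT_MODEL_DIR>
```

The conversion produces an .xml file describing the network topology and a .bin file with the weights in the specified output directory.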