Pre-trained models for BERT (Bidirectional Encoder Representations from Transformers) are publicly available.
Currently, the following models from the pre-trained BERT model list are supported:
BERT-Base, Cased
BERT-Base, Uncased
BERT-Base, Multilingual Cased
BERT-Base, Multilingual Uncased
BERT-Base, Chinese
BERT-Large, Cased
BERT-Large, Uncased
Download and unzip the archive with the BERT-Base, Multilingual Uncased model.
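For example, on Linux the archive can be fetched and unpacked from the command line. This is a minimal sketch; the URL below is a placeholder, so copy the actual download link for the model from the google-research/bert repository README:

    # Placeholder URL: take the real link for BERT-Base, Multilingual Uncased
    # from the google-research/bert README before running.
    wget https://storage.googleapis.com/bert_models/<release>/<model>.zip
    unzip <model>.zip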
After the archive is unzipped, the directory uncased_L-12_H-768_A-12 is created and contains the following files:
bert_config.json
bert_model.ckpt.data-00000-of-00001
bert_model.ckpt.index
bert_model.ckpt.meta
vocab.txt
The pre-trained model meta-graph files are bert_model.ckpt.*.
To generate the BERT Intermediate Representation (IR) of the model, run the Model Optimizer with the following parameters:
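A minimal sketch of such an invocation is shown below. It assumes the mo.py script from the Model Optimizer installation, the uncased_L-12_H-768_A-12 directory created above, and placeholder values for the output directory and model name; adjust them to your setup:

    # Sketch of a Model Optimizer run; <OUTPUT_DIR> and the model name
    # are placeholders, not values prescribed by this guide.
    python3 mo.py \
        --input_meta_graph uncased_L-12_H-768_A-12/bert_model.ckpt.meta \
        --output_dir <OUTPUT_DIR> \
        --model_name bert_base_multilingual_uncased

On success, the Model Optimizer writes the IR into the output directory as a pair of files: an .xml file describing the network topology and a .bin file holding the weights.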