Converting a TensorFlow CRNN Model
This tutorial explains how to convert a CRNN model to OpenVINO™ Intermediate Representation (IR).
There are several public versions of TensorFlow CRNN model implementations available on GitHub. This tutorial explains how to convert the model from the CRNN_Tensorflow repository to IR, and is validated with Python 3.7, TensorFlow 1.15.0, and protobuf 3.19.0. If you have another implementation of the CRNN model, it can be converted to OpenVINO IR in a similar way: obtain the inference graph and run Model Optimizer on it.
To convert the model to IR:
Step 1. Clone this GitHub repository and check out the commit:
Clone the repository:
git clone https://github.com/MaybeShewill-CV/CRNN_Tensorflow.git
Go to the CRNN_Tensorflow directory of the cloned repository:
cd path/to/CRNN_Tensorflow
Check out the necessary commit:
git checkout 64f1f1867bffaacfeacc7a80eebf5834a5726122
Step 2. Train the model using the framework or the pretrained checkpoint provided in this repository.
Step 3. Create an inference graph:
Add the CRNN_Tensorflow folder to PYTHONPATH.
For Linux:
export PYTHONPATH="${PYTHONPATH}:/path/to/CRNN_Tensorflow/"
For Windows, add /path/to/CRNN_Tensorflow/ to the PYTHONPATH environment variable in settings.
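If you prefer not to modify the environment, the same effect can be achieved from inside a Python script by extending sys.path before importing anything from the repository. This is a minimal sketch; the repository path shown is a placeholder you must adjust to your setup:

```python
import sys

# Placeholder location of the cloned repository; adjust to your setup.
crnn_repo = "/path/to/CRNN_Tensorflow/"

# Prepending to sys.path has the same effect as extending PYTHONPATH,
# but only for the current Python process.
if crnn_repo not in sys.path:
    sys.path.insert(0, crnn_repo)
```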
Edit the tools/demo_shadownet.py script. After the saver.restore(sess=sess, save_path=weights_path) line, add the following code:
from tensorflow.python.framework import graph_io
frozen = tf.graph_util.convert_variables_to_constants(sess, sess.graph_def, ['shadow/LSTMLayers/transpose_time_major'])
graph_io.write_graph(frozen, '.', 'frozen_graph.pb', as_text=False)
Run the demo with the following command:
python tools/demo_shadownet.py --image_path data/test_images/test_01.jpg --weights_path model/shadownet/shadownet_2017-10-17-11-47-46.ckpt-199999
If you want to use your own checkpoint, replace the path in the --weights_path parameter with a path to your checkpoint.
In the CRNN_Tensorflow directory, you will find the inference CRNN graph frozen_graph.pb. You can use this graph with OpenVINO to convert the model to IR and then run inference.
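Before moving on to conversion, it can help to confirm that the freeze step actually wrote a usable file. A frozen GraphDef is a binary protobuf, so a missing or zero-byte file means the freeze failed. The following standard-library sketch performs that check (the default file name matches what the snippet above writes; the helper name is ours):

```python
import os

def check_frozen_graph(path: str = "frozen_graph.pb") -> bool:
    # A frozen GraphDef is a binary protobuf; a missing or zero-byte
    # file indicates the freeze step did not produce valid output.
    return os.path.isfile(path) and os.path.getsize(path) > 0
```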
Step 4. Convert the model to IR:
mo --input_model path/to/your/CRNN_Tensorflow/frozen_graph.pb
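If the conversion is part of a larger pipeline, the same mo invocation can be scripted from Python via subprocess. This is a sketch assuming mo is installed and on PATH; the helper names are ours, and --output_dir is the standard Model Optimizer flag for choosing where the .xml/.bin IR pair is written:

```python
import subprocess

def build_mo_command(frozen_pb: str, output_dir: str = ".") -> list:
    # Assembles the Model Optimizer command line; --output_dir controls
    # where the .xml/.bin IR pair is written.
    return ["mo", "--input_model", frozen_pb, "--output_dir", output_dir]

def convert_to_ir(frozen_pb: str, output_dir: str = ".") -> None:
    # Assumes Model Optimizer ("mo") is installed and on PATH;
    # check=True raises if the conversion fails.
    subprocess.run(build_mo_command(frozen_pb, output_dir), check=True)
```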