densenet-161

Use Case and High-Level Description

The densenet-161 model is one of the DenseNet group of models designed to perform image classification. The main difference from the densenet-121 model is the size and accuracy of the model: densenet-161 is much larger, at roughly 100 MB, compared to the densenet-121 model's roughly 31 MB. The models were originally trained with Torch* and later converted by the authors into Caffe* format. All of the DenseNet models have been pre-trained on the ImageNet image database. For details about this family of models, check out the repository.

The model input is a blob that consists of a single image of shape 1, 3, 224, 224 in BGR order. Before passing the image blob to the network, subtract the BGR mean values [103.94, 116.78, 123.68] and then multiply the values by the scale factor 0.017 (equivalently, divide by 58.8235294117647).
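As an illustration only, here is a minimal preprocessing sketch in Python (assuming OpenCV and NumPy are available; the file name sample.jpg is a placeholder):

import cv2
import numpy as np

# OpenCV loads images in BGR channel order, matching the network's expectation.
image = cv2.imread("sample.jpg")                       # placeholder input file
image = cv2.resize(image, (224, 224)).astype(np.float32)

# Subtract the BGR mean values and apply the 0.017 scale factor.
mean = np.array([103.94, 116.78, 123.68], dtype=np.float32)
image = (image - mean) * 0.017

# Reorder from H, W, C to B, C, H, W before passing the blob to the network.
blob = np.transpose(image, (2, 0, 1))[np.newaxis, ...]  # shape: 1, 3, 224, 224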

The model output for densenet-161 is the typical object classifier output over the 1000 classes of the ImageNet database.

Specification

Metric              Value
Type                Classification
GFLOPs              15.561
MParams             28.666
Source framework    Caffe*

Accuracy

Metric    Value
Top 1     77.55%
Top 5     93.92%

See the original repository.

Input

Original model

Image, name - data, shape - 1, 3, 224, 224, format is B, C, H, W, where:

  • B - batch size

  • C - channel

  • H - height

  • W - width

Channel order is BGR. Mean values - [103.94, 116.78, 123.68], scale value - 58.8235294117647.

Converted model

Image, name - data, shape - 1, 3, 224, 224, format is B, C, H, W, where:

  • B - batch size

  • C - channel

  • H - height

  • W - width

Channel order is BGR.

Output

Original model

Object classifier according to ImageNet classes, name - fc6, shape - 1, 1000, 1, 1, contains unnormalized scores (logits) for each class; apply softmax to obtain class probabilities.

Converted model

Object classifier according to ImageNet classes, name - fc6, shape - 1, 1000, 1, 1, contains unnormalized scores (logits) for each class; apply softmax to obtain class probabilities.
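A minimal decoding sketch, assuming the raw 1, 1000, 1, 1 output blob is available as a NumPy array named output (a hypothetical variable name):

import numpy as np

# Flatten the 1, 1000, 1, 1 blob into a vector of 1000 logits.
logits = output.reshape(-1)

# Softmax converts the logits into probabilities.
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# Indices of the five highest-scoring ImageNet classes.
top5 = probs.argsort()[-5:][::-1]
print(top5, probs[top5])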

Download a Model and Convert it into Inference Engine Format

You can download models and, if necessary, convert them into Inference Engine format using the Model Downloader and other automation tools, as shown in the examples below.

An example of using the Model Downloader:

python3 <omz_dir>/tools/downloader/downloader.py --name <model_name>
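For this model, <model_name> is densenet-161:

python3 <omz_dir>/tools/downloader/downloader.py --name densenet-161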

An example of using the Model Converter:

python3 <omz_dir>/tools/downloader/converter.py --name <model_name>
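For this model, <model_name> is again densenet-161. After conversion, the resulting IR files can be loaded with the Inference Engine Python API. The following is a minimal sketch, not official demo code; it assumes the converted densenet-161.xml and densenet-161.bin files are in the working directory and that a preprocessed blob (see the Input section) is supplied in place of the zero placeholder:

import numpy as np
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="densenet-161.xml", weights="densenet-161.bin")  # assumed file names
exec_net = ie.load_network(network=net, device_name="CPU")

blob = np.zeros((1, 3, 224, 224), dtype=np.float32)  # placeholder for a preprocessed image blob
result = exec_net.infer(inputs={"data": blob})
logits = result["fc6"].reshape(-1)  # 1000 raw class scores, see the Output section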