Post-Training Quantization with TensorFlow Classification Model

This Jupyter notebook can be launched only after a local installation.


This example demonstrates how to quantize the OpenVINO model that was created in the 301-tensorflow-training-openvino notebook in order to improve inference speed. Quantization is performed with NNCF's post-training quantization. A custom dataloader and metric will be defined, and accuracy and performance will be computed for the original IR model and the quantized model.
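NNCF's post-training quantization consumes a calibration dataset built from a data source plus a transform function that extracts only the model inputs. Below is a minimal sketch of that dataloader pattern, with plain Python lists standing in for the notebook's `tf.data` batches (the names `transform_fn` and `batches` are illustrative):

```python
# Sketch of the dataloader pattern used for NNCF calibration: the data
# source yields (images, labels) pairs, and a transform function strips
# the labels, since calibration only needs model inputs.
def transform_fn(data_item):
    images, _labels = data_item
    return images

# Stand-in for a tf.data.Dataset of (batch_images, batch_labels) pairs:
batches = [([0.1, 0.2], [0]), ([0.3, 0.4], [1])]
calibration_inputs = [transform_fn(b) for b in batches]
print(calibration_inputs)  # [[0.1, 0.2], [0.3, 0.4]]
```

In the notebook, this pattern is what gets wrapped in `nncf.Dataset(...)` and passed to `nncf.quantize(...)`.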


Preparation

The notebook requires that the training notebook has been run and that the Intermediate Representation (IR) models have been created. If the IR models do not exist, running the next cell will run the training notebook first. This will take a while.

%pip install -q tensorflow Pillow matplotlib numpy tqdm nncf
DEPRECATION: pytorch-lightning 1.6.5 has a non-standard dependency specifier torch>=1.8.*. pip 24.1 will enforce this behaviour change. A possible replacement is to upgrade to a newer version of pytorch-lightning or contact the author to suggest that they release a version with a conforming dependency specifiers. Discussion can be found at https://github.com/pypa/pip/issues/12063
Note: you may need to restart the kernel to use updated packages.
from pathlib import Path

import tensorflow as tf

model_xml = Path("model/flower/flower_ir.xml")
dataset_url = (
    "https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz"
)
data_dir = Path(tf.keras.utils.get_file("flower_photos", origin=dataset_url, untar=True))

if not model_xml.exists():
    print("Executing training notebook. This will take a while...")
    %run 301-tensorflow-training-openvino.ipynb
2024-02-10 01:09:00.730910: I tensorflow/core/util/port.cc:110] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable TF_ENABLE_ONEDNN_OPTS=0.
2024-02-10 01:09:00.766002: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2024-02-10 01:09:01.406366: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Executing training notebook. This will take a while...
3670
Found 3670 files belonging to 5 classes.
Using 2936 files for training.
2024-02-10 01:09:08.525687: E tensorflow/compiler/xla/stream_executor/cuda/cuda_driver.cc:266] failed call to cuInit: CUDA_ERROR_COMPAT_NOT_SUPPORTED_ON_DEVICE: forward compatibility was attempted on non supported HW
2024-02-10 01:09:08.525725: I tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:168] retrieving CUDA diagnostic information for host: iotg-dev-workstation-07
2024-02-10 01:09:08.525729: I tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:175] hostname: iotg-dev-workstation-07
2024-02-10 01:09:08.525856: I tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:199] libcuda reported version is: 470.223.2
2024-02-10 01:09:08.525872: I tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:203] kernel reported version is: 470.182.3
2024-02-10 01:09:08.525876: E tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:312] kernel version 470.182.3 does not match DSO version 470.223.2 -- cannot find working devices in this configuration
Found 3670 files belonging to 5 classes.
Using 734 files for validation.
['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']
2024-02-10 01:09:08.855253: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'Placeholder/_0' with dtype string and shape [2936]
     [[{{node Placeholder/_0}}]]
2024-02-10 01:09:08.855534: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'Placeholder/_4' with dtype int32 and shape [2936]
     [[{{node Placeholder/_4}}]]
../_images/301-tensorflow-training-openvino-nncf-with-output_3_11.png
(32, 180, 180, 3)
(32,)
0.0 0.9970461
../_images/301-tensorflow-training-openvino-nncf-with-output_3_17.png
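The 2936/734 split reported in the logs above is consistent with a `validation_split` of 0.2 passed to `tf.keras.utils.image_dataset_from_directory` — an assumption based on the training notebook's typical settings, shown here as a quick sanity check:

```python
# Sanity check of the train/validation split seen in the logs.
# Assumption: the training notebook uses validation_split=0.2.
total_files = 3670
validation_split = 0.2
num_val = int(total_files * validation_split)   # validation subset size
num_train = total_files - num_val               # remaining training files
print(num_train, num_val)  # 2936 734
```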
Model: "sequential_2"
_________________________________________________________________
Layer (type)                   Output Shape              Param #
=================================================================
sequential_1 (Sequential)      (None, 180, 180, 3)       0
rescaling_2 (Rescaling)        (None, 180, 180, 3)       0
conv2d_3 (Conv2D)              (None, 180, 180, 16)      448
max_pooling2d_3 (MaxPooling2D) (None, 90, 90, 16)        0
conv2d_4 (Conv2D)              (None, 90, 90, 32)        4640
max_pooling2d_4 (MaxPooling2D) (None, 45, 45, 32)        0
conv2d_5 (Conv2D)              (None, 45, 45, 64)        18496
max_pooling2d_5 (MaxPooling2D) (None, 22, 22, 64)        0
dropout (Dropout)              (None, 22, 22, 64)        0
flatten_1 (Flatten)            (None, 30976)             0
dense_2 (Dense)                (None, 128)               3965056
outputs (Dense)                (None, 5)                 645
=================================================================
Total params: 3,989,285
Trainable params: 3,989,285
Non-trainable params: 0
_________________________________________________________________
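The parameter counts in the summary can be re-derived from the layer shapes. A short sanity check (the helper names are illustrative, not part of the notebook):

```python
# Re-derive the Keras summary's parameter counts from layer shapes.
def conv2d_params(kernel_h, kernel_w, in_ch, out_ch):
    # kernel weights plus one bias per output channel
    return (kernel_h * kernel_w * in_ch + 1) * out_ch

def dense_params(in_features, out_features):
    # weight matrix plus one bias per output unit
    return (in_features + 1) * out_features

counts = {
    "conv2d_3": conv2d_params(3, 3, 3, 16),       # 448
    "conv2d_4": conv2d_params(3, 3, 16, 32),      # 4640
    "conv2d_5": conv2d_params(3, 3, 32, 64),      # 18496
    "dense_2": dense_params(22 * 22 * 64, 128),   # flatten: 30976 -> 128
    "outputs": dense_params(128, 5),              # 645
}
print(sum(counts.values()))  # 3989285
```

The total matches the summary's 3,989,285 trainable parameters.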
Epoch 1/15
92/92 [==============================] - 7s 66ms/step - loss: 1.2400 - accuracy: 0.4741 - val_loss: 1.3762 - val_accuracy: 0.5014

Epoch 2/15
92/92 [==============================] - 6s 64ms/step - loss: 0.9956 - accuracy: 0.5974 - val_loss: 0.9920 - val_accuracy: 0.6090

Epoch 3/15
92/92 [==============================] - 6s 64ms/step - loss: 0.9155 - accuracy: 0.6298 - val_loss: 0.8959 - val_accuracy: 0.6621

Epoch 4/15
92/92 [==============================] - 6s 64ms/step - loss: 0.8158 - accuracy: 0.6945 - val_loss: 0.8530 - val_accuracy: 0.6757

Epoch 5/15
92/92 [==============================] - 6s 63ms/step - loss: 0.7896 - accuracy: 0.6931 - val_loss: 0.8867 - val_accuracy: 0.6798

Epoch 6/15
92/92 [==============================] - 6s 64ms/step - loss: 0.7647 - accuracy: 0.7115 - val_loss: 0.7599 - val_accuracy: 0.7016

Epoch 7/15
92/92 [==============================] - 6s 64ms/step - loss: 0.6932 - accuracy: 0.7360 - val_loss: 0.7731 - val_accuracy: 0.6853

Epoch 8/15
74/92 [=======================>……] - ETA: 1s - loss: 0.6720 - accuracy: 0.7445



75/92 [=======================>……] - ETA: 0s - loss: 0.6715 - accuracy: 0.7458



76/92 [=======================>……] - ETA: 0s - loss: 0.6703 - accuracy: 0.7459



77/92 [========================>…..] - ETA: 0s - loss: 0.6712 - accuracy: 0.7455



78/92 [========================>…..] - ETA: 0s - loss: 0.6727 - accuracy: 0.7456



79/92 [========================>…..] - ETA: 0s - loss: 0.6713 - accuracy: 0.7460



80/92 [=========================>….] - ETA: 0s - loss: 0.6690 - accuracy: 0.7473



81/92 [=========================>….] - ETA: 0s - loss: 0.6698 - accuracy: 0.7457



82/92 [=========================>….] - ETA: 0s - loss: 0.6702 - accuracy: 0.7450



83/92 [==========================>…] - ETA: 0s - loss: 0.6702 - accuracy: 0.7455



84/92 [==========================>…] - ETA: 0s - loss: 0.6699 - accuracy: 0.7451



85/92 [==========================>…] - ETA: 0s - loss: 0.6690 - accuracy: 0.7463



86/92 [===========================>..] - ETA: 0s - loss: 0.6715 - accuracy: 0.7449



87/92 [===========================>..] - ETA: 0s - loss: 0.6732 - accuracy: 0.7439



88/92 [===========================>..] - ETA: 0s - loss: 0.6717 - accuracy: 0.7439



89/92 [============================>.] - ETA: 0s - loss: 0.6743 - accuracy: 0.7433



90/92 [============================>.] - ETA: 0s - loss: 0.6769 - accuracy: 0.7427



91/92 [============================>.] - ETA: 0s - loss: 0.6763 - accuracy: 0.7428



92/92 [==============================] - ETA: 0s - loss: 0.6821 - accuracy: 0.7398



92/92 [==============================] - 6s 64ms/step - loss: 0.6821 - accuracy: 0.7398 - val_loss: 0.7942 - val_accuracy: 0.6812

Epoch 9/15
92/92 [==============================] - 6s 63ms/step - loss: 0.6469 - accuracy: 0.7510 - val_loss: 0.7705 - val_accuracy: 0.6921

Epoch 10/15
92/92 [==============================] - 6s 63ms/step - loss: 0.6230 - accuracy: 0.7646 - val_loss: 0.7725 - val_accuracy: 0.7153

Epoch 11/15
92/92 [==============================] - 6s 64ms/step - loss: 0.5883 - accuracy: 0.7725 - val_loss: 0.7175 - val_accuracy: 0.7234

Epoch 12/15
92/92 [==============================] - 6s 64ms/step - loss: 0.5609 - accuracy: 0.7827 - val_loss: 0.6652 - val_accuracy: 0.7357

Epoch 13/15
92/92 [==============================] - 6s 64ms/step - loss: 0.5255 - accuracy: 0.8072 - val_loss: 0.7346 - val_accuracy: 0.7384

Epoch 14/15
92/92 [==============================] - 6s 64ms/step - loss: 0.5438 - accuracy: 0.7895 - val_loss: 0.7761 - val_accuracy: 0.7275

Epoch 15/15
92/92 [==============================] - 6s 64ms/step - loss: 0.5165 - accuracy: 0.8004 - val_loss: 0.7822 - val_accuracy: 0.7289

../_images/301-tensorflow-training-openvino-nncf-with-output_3_1452.png
1/1 [==============================] - 0s 74ms/step
This image most likely belongs to sunflowers with a 99.24 percent confidence.
2024-02-10 01:10:41.607321: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'random_flip_input' with dtype float and shape [?,180,180,3]
     [[{{node random_flip_input}}]]
WARNING:absl:Found untraced functions such as _jit_compiled_convolution_op, _jit_compiled_convolution_op, _jit_compiled_convolution_op, _update_step_xla while saving (showing 4 of 4). These functions will not be directly callable after loading.
INFO:tensorflow:Assets written to: model/flower/saved_model/assets
INFO:tensorflow:Assets written to: model/flower/saved_model/assets
output/A_Close_Up_Photo_of_a_Dandelion.jpg:   0%|          | 0.00/21.7k [00:00<?, ?B/s]
(1, 180, 180, 3)
[1,180,180,3]
This image most likely belongs to dandelion with a 97.96 percent confidence.
../_images/301-tensorflow-training-openvino-nncf-with-output_3_1464.png

Imports

The Post-Training Quantization API is implemented in the nncf library.

import sys

import matplotlib.pyplot as plt
import numpy as np
import nncf
from openvino.runtime import Core
from openvino.runtime import serialize
from PIL import Image
from sklearn.metrics import accuracy_score

sys.path.append("../utils")
from notebook_utils import download_file
INFO:nncf:NNCF initialized successfully. Supported frameworks detected: torch, tensorflow, onnx, openvino

Post-training Quantization with NNCF

NNCF provides a suite of advanced algorithms for optimizing neural network inference in OpenVINO, with minimal accuracy drop.

Create a quantized model from the pre-trained FP32 model and the calibration dataset. The optimization process contains the following steps:

  1. Create a Dataset for quantization.

  2. Run nncf.quantize for getting an optimized model.

The validation dataset was already defined in the training notebook.

img_height = 180
img_width = 180
val_dataset = tf.keras.preprocessing.image_dataset_from_directory(
  data_dir,
  validation_split=0.2,
  subset="validation",
  seed=123,
  image_size=(img_height, img_width),
  batch_size=1
)

for a, b in val_dataset:
    print(type(a), type(b))
    break
Found 3670 files belonging to 5 classes.
Using 734 files for validation.
<class 'tensorflow.python.framework.ops.EagerTensor'> <class 'tensorflow.python.framework.ops.EagerTensor'>
2024-02-10 01:10:45.668839: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'Placeholder/_0' with dtype string and shape [734]
     [[{{node Placeholder/_0}}]]

The validation dataset can be reused in the quantization process. However, it returns a tuple of (images, labels), whereas the calibration dataset should return only images. The transformation function below converts the validation dataset into a calibration dataset.

import nncf


def transform_fn(data_item):
    """
    Transform a data item into model input data.
    This function should be passed when a data item cannot be used as the model's input directly.
    """
    images, _ = data_item
    return images.numpy()


calibration_dataset = nncf.Dataset(val_dataset, transform_fn)
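As a quick sanity check of what the transformation does, the same extraction can be illustrated on a plain NumPy batch (the shapes below are chosen to match this notebook's validation dataset; no TensorFlow is needed for the illustration):

```python
import numpy as np

# A (images, labels) pair shaped like one batch from the validation dataset
images = np.zeros((1, 180, 180, 3), dtype=np.float32)
labels = np.array([2])
data_item = (images, labels)

# The calibration dataset needs only the images part of each item
model_input, _ = data_item
print(model_input.shape)  # (1, 180, 180, 3)
```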

Read the Intermediate Representation (IR) model.

from openvino.runtime import Core

core = Core()
ir_model = core.read_model(model_xml)

Use the Basic Quantization Flow. For the most advanced quantization flow, which applies 8-bit quantization to the model with accuracy control, see Quantizing with accuracy control.

quantized_model = nncf.quantize(
    ir_model,
    calibration_dataset,
    subset_size=1000
)

Save the quantized model for benchmarking.

from openvino.runtime import serialize

compressed_model_dir = Path("model/optimized")
compressed_model_dir.mkdir(parents=True, exist_ok=True)
compressed_model_xml = compressed_model_dir / "flower_ir.xml"
serialize(quantized_model, str(compressed_model_xml))

Select inference device

Select a device from the dropdown list to run inference with OpenVINO.

import ipywidgets as widgets

device = widgets.Dropdown(
    options=core.available_devices + ["AUTO"],
    value='AUTO',
    description='Device:',
    disabled=False,
)

device
Dropdown(description='Device:', index=1, options=('CPU', 'AUTO'), value='AUTO')

Compare Metrics

Define a metric to determine the performance of the model.

For this demo, we define a validate function to compute the accuracy metric.

import numpy as np
from sklearn.metrics import accuracy_score


def validate(model, validation_loader):
    """
    Evaluate the model and compute the accuracy metric.

    :param model: Model to validate
    :param validation_loader: Validation dataset
    :returns: Accuracy score
    """
    predictions = []
    references = []

    output = model.outputs[0]

    for images, target in validation_loader:
        pred = model(images.numpy())[output]

        predictions.append(np.argmax(pred, axis=1))
        references.append(target)

    predictions = np.concatenate(predictions, axis=0)
    references = np.concatenate(references, axis=0)

    scores = accuracy_score(references, predictions)

    return scores
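The accumulation logic in validate can be sanity-checked with synthetic data (hypothetical logits and labels, pure NumPy, no model required):

```python
import numpy as np

# Synthetic per-batch logits for 4 samples over 3 classes (hypothetical values)
pred_batches = [
    np.array([[0.1, 0.8, 0.1], [0.7, 0.2, 0.1]]),
    np.array([[0.2, 0.3, 0.5], [0.9, 0.05, 0.05]]),
]
target_batches = [np.array([1, 0]), np.array([2, 1])]

# Same accumulation as validate(): argmax per batch, then concatenate
predictions = np.concatenate([np.argmax(p, axis=1) for p in pred_batches], axis=0)
references = np.concatenate(target_batches, axis=0)

accuracy = float((predictions == references).mean())
print(accuracy)  # 0.75
```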

Calculate accuracy for the original model and the quantized model.

original_compiled_model = core.compile_model(model=ir_model, device_name=device.value)
quantized_compiled_model = core.compile_model(model=quantized_model, device_name=device.value)

original_accuracy = validate(original_compiled_model, val_dataset)
quantized_accuracy = validate(quantized_compiled_model, val_dataset)

print(f"Accuracy of the original model: {original_accuracy:.3f}")
print(f"Accuracy of the quantized model: {quantized_accuracy:.3f}")
Accuracy of the original model: 0.729
Accuracy of the quantized model: 0.729

Compare file size of the models.

original_model_size = model_xml.with_suffix(".bin").stat().st_size / 1024
quantized_model_size = compressed_model_xml.with_suffix(".bin").stat().st_size / 1024

print(f"Original model size: {original_model_size:.2f} KB")
print(f"Quantized model size: {quantized_model_size:.2f} KB")
Original model size: 7791.65 KB
Quantized model size: 3897.08 KB

So, we can see that the original and quantized models have similar accuracy, while the quantized model is roughly half the size.
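The compression ratio can be checked with quick arithmetic on the sizes printed above (8-bit quantization shrinks the stored weights; the exact ratio depends on which layers are quantized):

```python
# Model file sizes printed above, in KB
original_kb = 7791.65
quantized_kb = 3897.08

ratio = original_kb / quantized_kb
print(f"Compression ratio: {ratio:.2f}x")  # Compression ratio: 2.00x
```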

Run Inference on Quantized Model

Copy the preprocess function from the training notebook and run inference on the quantized model with OpenVINO. See the OpenVINO API tutorial for more information about running inference with OpenVINO Python API.

def pre_process_image(imagePath, img_height=180):
    # Model input format: NHWC
    n, h, w, c = [1, img_height, img_height, 3]
    image = Image.open(imagePath)
    image = image.resize((h, w), resample=Image.BILINEAR)

    # Convert to array and add a batch dimension (layout stays HWC)
    image = np.array(image)
    input_image = image.reshape((n, h, w, c))

    return input_image
# Get the names of the input and output layer
input_layer = quantized_compiled_model.input(0)
output_layer = quantized_compiled_model.output(0)

# Get the class names: a list of directory names in alphabetical order
class_names = sorted([item.name for item in Path(data_dir).iterdir() if item.is_dir()])

# Run inference on an input image...
inp_img_url = (
    "https://upload.wikimedia.org/wikipedia/commons/4/48/A_Close_Up_Photo_of_a_Dandelion.jpg"
)
directory = "output"
inp_file_name = "A_Close_Up_Photo_of_a_Dandelion.jpg"
file_path = Path(directory)/Path(inp_file_name)
# Download the image if it does not exist yet
if not file_path.exists():
    download_file(inp_img_url, inp_file_name, directory=directory)

# Pre-process the image and get it ready for inference.
input_image = pre_process_image(imagePath=file_path)
print(f'input image shape: {input_image.shape}')
print(f'input layer shape: {input_layer.shape}')

res = quantized_compiled_model([input_image])[output_layer]

score = tf.nn.softmax(res[0])

# Show the results
image = Image.open(file_path)
plt.imshow(image)
print(
    "This image most likely belongs to {} with a {:.2f} percent confidence.".format(
        class_names[np.argmax(score)], 100 * np.max(score)
    )
)
'output/A_Close_Up_Photo_of_a_Dandelion.jpg' already exists.
input image shape: (1, 180, 180, 3)
input layer shape: [1,180,180,3]
This image most likely belongs to dandelion with a 98.03 percent confidence.
../_images/301-tensorflow-training-openvino-nncf-with-output_27_1.png
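The tf.nn.softmax call above converts the raw logits into class probabilities. A NumPy equivalent, for illustration only (the logits below are hypothetical, not the model's actual output):

```python
import numpy as np

def softmax(logits):
    # Subtract the max first for numerical stability; the result is unchanged
    shifted = logits - np.max(logits)
    exp = np.exp(shifted)
    return exp / exp.sum()

# Hypothetical logits for the 5 flower classes
logits = np.array([1.0, 9.0, 2.0, 0.5, 1.5])
probs = softmax(logits)
print(int(np.argmax(probs)))  # 1
```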

Compare Inference Speed

Measure inference speed with the OpenVINO Benchmark App.

Benchmark App is a command-line tool that measures raw inference performance for a specified OpenVINO IR model. Run benchmark_app --help to see a list of available parameters. By default, Benchmark App tests the performance of the model specified with the -m parameter, with asynchronous inference on CPU, for one minute. Use the -d parameter to test performance on a different device, for example Intel integrated graphics (iGPU), and -t to set the number of seconds to run inference. See the documentation for more information.

This tutorial uses a wrapper function from Notebook Utils. It prints the benchmark_app command with the chosen parameters.

In the next cells, inference speed will be measured for the original and quantized models on CPU. If an iGPU is available, inference speed will be measured for CPU+GPU as well. The benchmark duration is set to 15 seconds.

NOTE: For the most accurate performance estimation, it is recommended to run benchmark_app in a terminal/command prompt after closing other applications.
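When benchmark_app output is captured programmatically, the headline figure can be pulled out of the log lines. A minimal, hypothetical helper (not part of Notebook Utils) might look like this:

```python
import re

def extract_throughput(benchmark_output):
    """Return the FPS value from benchmark_app output lines, or None if absent."""
    for line in reversed(benchmark_output):
        match = re.search(r"Throughput:\s+([\d.]+)\s+FPS", line)
        if match:
            return float(match.group(1))
    return None

# Example with log lines copied from a run of benchmark_app
sample_output = [
    "[ INFO ] Duration:         15004.59 ms",
    "[ INFO ] Throughput:   3842.82 FPS",
]
print(extract_throughput(sample_output))  # 3842.82
```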

# print the available devices on this system
print("Device information:")
print(core.get_property("CPU", "FULL_DEVICE_NAME"))
if "GPU" in core.available_devices:
    print(core.get_property("GPU", "FULL_DEVICE_NAME"))
Device information:
Intel(R) Core(TM) i9-10920X CPU @ 3.50GHz
# Original model - CPU
! benchmark_app -m $model_xml -d CPU -t 15 -api async
[Step 1/11] Parsing and validating input arguments
[ INFO ] Parsing input parameters
[Step 2/11] Loading OpenVINO Runtime
[ INFO ] OpenVINO:
[ INFO ] Build ................................. 2023.3.0-13775-ceeafaf64f3-releases/2023/3
[ INFO ]
[ INFO ] Device info:
[ INFO ] CPU
[ INFO ] Build ................................. 2023.3.0-13775-ceeafaf64f3-releases/2023/3
[ INFO ]
[ INFO ]
[Step 3/11] Setting device configuration
[ WARNING ] Performance hint was not explicitly specified in command line. Device(CPU) performance hint will be set to PerformanceMode.THROUGHPUT.
[Step 4/11] Reading model files
[ INFO ] Loading model files
[ INFO ] Read model took 12.98 ms
[ INFO ] Original model I/O parameters:
[ INFO ] Model inputs:
[ INFO ]     sequential_1_input (node: sequential_1_input) : f32 / [...] / [1,180,180,3]
[ INFO ] Model outputs:
[ INFO ]     outputs (node: sequential_2/outputs/BiasAdd) : f32 / [...] / [1,5]
[Step 5/11] Resizing model to match image sizes and given batch
[ INFO ] Model batch size: 1
[Step 6/11] Configuring input of the model
[ INFO ] Model inputs:
[ INFO ]     sequential_1_input (node: sequential_1_input) : u8 / [N,H,W,C] / [1,180,180,3]
[ INFO ] Model outputs:
[ INFO ]     outputs (node: sequential_2/outputs/BiasAdd) : f32 / [...] / [1,5]
[Step 7/11] Loading the model to the device
[ INFO ] Compile model took 71.95 ms
[Step 8/11] Querying optimal runtime parameters
[ INFO ] Model:
[ INFO ]   NETWORK_NAME: TensorFlow_Frontend_IR
[ INFO ]   OPTIMAL_NUMBER_OF_INFER_REQUESTS: 12
[ INFO ]   NUM_STREAMS: 12
[ INFO ]   AFFINITY: Affinity.CORE
[ INFO ]   INFERENCE_NUM_THREADS: 24
[ INFO ]   PERF_COUNT: NO
[ INFO ]   INFERENCE_PRECISION_HINT: <Type: 'float32'>
[ INFO ]   PERFORMANCE_HINT: THROUGHPUT
[ INFO ]   EXECUTION_MODE_HINT: ExecutionMode.PERFORMANCE
[ INFO ]   PERFORMANCE_HINT_NUM_REQUESTS: 0
[ INFO ]   ENABLE_CPU_PINNING: True
[ INFO ]   SCHEDULING_CORE_TYPE: SchedulingCoreType.ANY_CORE
[ INFO ]   ENABLE_HYPER_THREADING: True
[ INFO ]   EXECUTION_DEVICES: ['CPU']
[ INFO ]   CPU_DENORMALS_OPTIMIZATION: False
[ INFO ]   CPU_SPARSE_WEIGHTS_DECOMPRESSION_RATE: 1.0
[Step 9/11] Creating infer requests and preparing input tensors
[ WARNING ] No input files were given for input 'sequential_1_input'!. This input will be filled with random values!
[ INFO ] Fill input 'sequential_1_input' with random values
[Step 10/11] Measuring performance (Start inference asynchronously, 12 inference requests, limits: 15000 ms duration)
[ INFO ] Benchmarking in inference only mode (inputs filling are not included in measurement loop).
[ INFO ] First inference took 7.48 ms
[Step 11/11] Dumping statistics report
[ INFO ] Execution Devices:['CPU']
[ INFO ] Count:            57660 iterations
[ INFO ] Duration:         15004.59 ms
[ INFO ] Latency:
[ INFO ]    Median:        2.95 ms
[ INFO ]    Average:       2.95 ms
[ INFO ]    Min:           1.69 ms
[ INFO ]    Max:           12.80 ms
[ INFO ] Throughput:   3842.82 FPS
# Quantized model - CPU
! benchmark_app -m $compressed_model_xml -d CPU -t 15 -api async
[Step 1/11] Parsing and validating input arguments
[ INFO ] Parsing input parameters
[Step 2/11] Loading OpenVINO Runtime
[ INFO ] OpenVINO:
[ INFO ] Build ................................. 2023.3.0-13775-ceeafaf64f3-releases/2023/3
[ INFO ]
[ INFO ] Device info:
[ INFO ] CPU
[ INFO ] Build ................................. 2023.3.0-13775-ceeafaf64f3-releases/2023/3
[ INFO ]
[ INFO ]
[Step 3/11] Setting device configuration
[ WARNING ] Performance hint was not explicitly specified in command line. Device(CPU) performance hint will be set to PerformanceMode.THROUGHPUT.
[Step 4/11] Reading model files
[ INFO ] Loading model files
[ INFO ] Read model took 15.15 ms
[ INFO ] Original model I/O parameters:
[ INFO ] Model inputs:
[ INFO ]     sequential_1_input (node: sequential_1_input) : f32 / [...] / [1,180,180,3]
[ INFO ] Model outputs:
[ INFO ]     outputs (node: sequential_2/outputs/BiasAdd) : f32 / [...] / [1,5]
[Step 5/11] Resizing model to match image sizes and given batch
[ INFO ] Model batch size: 1
[Step 6/11] Configuring input of the model
[ INFO ] Model inputs:
[ INFO ]     sequential_1_input (node: sequential_1_input) : u8 / [N,H,W,C] / [1,180,180,3]
[ INFO ] Model outputs:
[ INFO ]     outputs (node: sequential_2/outputs/BiasAdd) : f32 / [...] / [1,5]
[Step 7/11] Loading the model to the device
[ INFO ] Compile model took 67.57 ms
[Step 8/11] Querying optimal runtime parameters
[ INFO ] Model:
[ INFO ]   NETWORK_NAME: TensorFlow_Frontend_IR
[ INFO ]   OPTIMAL_NUMBER_OF_INFER_REQUESTS: 12
[ INFO ]   NUM_STREAMS: 12
[ INFO ]   AFFINITY: Affinity.CORE
[ INFO ]   INFERENCE_NUM_THREADS: 24
[ INFO ]   PERF_COUNT: NO
[ INFO ]   INFERENCE_PRECISION_HINT: <Type: 'float32'>
[ INFO ]   PERFORMANCE_HINT: THROUGHPUT
[ INFO ]   EXECUTION_MODE_HINT: ExecutionMode.PERFORMANCE
[ INFO ]   PERFORMANCE_HINT_NUM_REQUESTS: 0
[ INFO ]   ENABLE_CPU_PINNING: True
[ INFO ]   SCHEDULING_CORE_TYPE: SchedulingCoreType.ANY_CORE
[ INFO ]   ENABLE_HYPER_THREADING: True
[ INFO ]   EXECUTION_DEVICES: ['CPU']
[ INFO ]   CPU_DENORMALS_OPTIMIZATION: False
[ INFO ]   CPU_SPARSE_WEIGHTS_DECOMPRESSION_RATE: 1.0
[Step 9/11] Creating infer requests and preparing input tensors
[ WARNING ] No input files were given for input 'sequential_1_input'!. This input will be filled with random values!
[ INFO ] Fill input 'sequential_1_input' with random values
[Step 10/11] Measuring performance (Start inference asynchronously, 12 inference requests, limits: 15000 ms duration)
[ INFO ] Benchmarking in inference only mode (inputs filling are not included in measurement loop).
[ INFO ] First inference took 1.99 ms
[Step 11/11] Dumping statistics report
[ INFO ] Execution Devices:['CPU']
[ INFO ] Count:            178152 iterations
[ INFO ] Duration:         15001.85 ms
[ INFO ] Latency:
[ INFO ]    Median:        0.94 ms
[ INFO ]    Average:       0.98 ms
[ INFO ]    Min:           0.55 ms
[ INFO ]    Max:           11.77 ms
[ INFO ] Throughput:   11875.34 FPS

Benchmark on MULTI:CPU,GPU

With a recent Intel CPU, the best performance can often be achieved by running inference on both the CPU and the iGPU with OpenVINO’s Multi Device Plugin. Loading a model on GPU takes longer than on CPU, so this benchmark will take a bit longer to complete than the CPU benchmark when run for the first time. Benchmark App supports caching via the --cdir parameter. In the cells below, the model will be cached to the model_cache directory.

# Original model - MULTI:CPU,GPU
if "GPU" in core.available_devices:
    ! benchmark_app -m $model_xml -d MULTI:CPU,GPU -t 15 -api async
else:
    print("A supported integrated GPU is not available on this system.")
A supported integrated GPU is not available on this system.
# Quantized model - MULTI:CPU,GPU
if "GPU" in core.available_devices:
    ! benchmark_app -m $compressed_model_xml -d MULTI:CPU,GPU -t 15 -api async
else:
    print("A supported integrated GPU is not available on this system.")
A supported integrated GPU is not available on this system.
# print the available devices on this system
print("Device information:")
print(core.get_property("CPU", "FULL_DEVICE_NAME"))
if "GPU" in core.available_devices:
    print(core.get_property("GPU", "FULL_DEVICE_NAME"))
Device information:
Intel(R) Core(TM) i9-10920X CPU @ 3.50GHz

Original IR model - CPU

benchmark_output = %sx benchmark_app -m $model_xml -t 15 -api async
# Remove logging info from benchmark_app output and show only the results
benchmark_result = benchmark_output[-8:]
print("\n".join(benchmark_result))
[ INFO ] Count:            57840 iterations
[ INFO ] Duration:         15004.24 ms
[ INFO ] Latency:
[ INFO ]    Median:        2.94 ms
[ INFO ]    Average:       2.94 ms
[ INFO ]    Min:           1.98 ms
[ INFO ]    Max:           12.12 ms
[ INFO ] Throughput:   3854.91 FPS

Quantized IR model - CPU

benchmark_output = %sx benchmark_app -m $compressed_model_xml -t 15 -api async
# Remove logging info from benchmark_app output and show only the results
benchmark_result = benchmark_output[-8:]
print("\n".join(benchmark_result))
[ INFO ] Count:            178836 iterations
[ INFO ] Duration:         15001.19 ms
[ INFO ] Latency:
[ INFO ]    Median:        0.94 ms
[ INFO ]    Average:       0.97 ms
[ INFO ]    Min:           0.58 ms
[ INFO ]    Max:           6.85 ms
[ INFO ] Throughput:   11921.45 FPS

Original IR model - MULTI:CPU,GPU

With a recent Intel CPU, the best performance can often be achieved by doing inference on both the CPU and the iGPU, with OpenVINO’s Multi Device Plugin. It takes a bit longer to load a model on GPU than on CPU, so this benchmark will take a bit longer to complete than the CPU benchmark.

if "GPU" in core.available_devices:
    benchmark_output = %sx benchmark_app -m $model_xml -d MULTI:CPU,GPU -t 15 -api async
    # Remove logging info from benchmark_app output and show only the results
    benchmark_result = benchmark_output[-8:]
    print("\n".join(benchmark_result))
else:
    print("A GPU is not available on this system.")
A GPU is not available on this system.

Quantized IR model - MULTI:CPU,GPU

if "GPU" in core.available_devices:
    benchmark_output = %sx benchmark_app -m $compressed_model_xml -d MULTI:CPU,GPU -t 15 -api async
    # Remove logging info from benchmark_app output and show only the results
    benchmark_result = benchmark_output[-8:]
    print("\n".join(benchmark_result))
else:
    print("A GPU is not available on this system.")
A GPU is not available on this system.