From Training to Deployment with TensorFlow and OpenVINO™¶
This Jupyter notebook can only be launched after a local installation.
# @title Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Copyright 2018 The TensorFlow Authors
#
# Modified for OpenVINO Notebooks
This tutorial demonstrates how to train, convert, and deploy an image classification model with TensorFlow and OpenVINO. This notebook trains the model, converts it to OpenVINO IR with the model conversion API, and then performs inference on the converted model. For faster inference speed on the model created in this notebook, check out the Post-Training Quantization with TensorFlow Classification Model notebook.
The training code is taken in its entirety from the official TensorFlow Image Classification Tutorial. The flower_ir.bin and flower_ir.xml files (the model in OpenVINO IR format) can be obtained by executing the notebook with ‘Runtime->Run All’ or the Ctrl+F9 shortcut.
%pip install -q "openvino>=2023.1.0"
Note: you may need to restart the kernel to use updated packages.
TensorFlow Image Classification Training¶
The first part of the tutorial shows how to classify images of flowers (based on TensorFlow’s official tutorial). It creates an image classifier using a keras.Sequential model, and loads data using preprocessing.image_dataset_from_directory. You will gain practical experience with the following concepts:
Efficiently loading a dataset off disk.
Identifying overfitting and applying techniques to mitigate it, including data augmentation and Dropout.
This tutorial follows a basic machine learning workflow:
Examine and understand data
Build an input pipeline
Build the model
Train the model
Test the model
Import TensorFlow and Other Libraries¶
import os
import sys
from pathlib import Path
import PIL
import matplotlib.pyplot as plt
import numpy as np
import tensorflow as tf
from PIL import Image
import openvino as ov
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.models import Sequential
sys.path.append("../utils")
from notebook_utils import download_file
Download and Explore the Dataset¶
This tutorial uses a dataset of about 3,700 photos of flowers. The dataset contains 5 sub-directories, one per class:
flower_photo/
daisy/
dandelion/
roses/
sunflowers/
tulips/
import pathlib
dataset_url = "https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz"
data_dir = tf.keras.utils.get_file('flower_photos', origin=dataset_url, untar=True)
data_dir = pathlib.Path(data_dir)
After downloading, you should now have a copy of the dataset available. There are 3,670 total images:
image_count = len(list(data_dir.glob('*/*.jpg')))
print(image_count)
3670
Here are some roses:
roses = list(data_dir.glob('roses/*'))
PIL.Image.open(str(roses[0]))
PIL.Image.open(str(roses[1]))
And some tulips:
tulips = list(data_dir.glob('tulips/*'))
PIL.Image.open(str(tulips[0]))
PIL.Image.open(str(tulips[1]))
Load Using keras.preprocessing¶
Let’s load these images off disk using the helpful image_dataset_from_directory utility. This will take you from a directory of images on disk to a tf.data.Dataset in just a couple of lines of code. If you like, you can also write your own data loading code from scratch by visiting the load images tutorial.
Create a Dataset¶
Define some parameters for the loader:
batch_size = 32
img_height = 180
img_width = 180
It’s good practice to use a validation split when developing your model. Let’s use 80% of the images for training, and 20% for validation.
train_ds = tf.keras.preprocessing.image_dataset_from_directory(
data_dir,
validation_split=0.2,
subset="training",
seed=123,
image_size=(img_height, img_width),
batch_size=batch_size)
Found 3670 files belonging to 5 classes.
Using 2936 files for training.
val_ds = tf.keras.preprocessing.image_dataset_from_directory(
data_dir,
validation_split=0.2,
subset="validation",
seed=123,
image_size=(img_height, img_width),
batch_size=batch_size)
Found 3670 files belonging to 5 classes.
Using 734 files for validation.
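The reported subset sizes follow directly from the split fraction. A quick check, assuming the loader rounds the validation subset down:

```python
total_images = 3670
validation_split = 0.2

# Validation subset size is the split fraction of the total, rounded down;
# the remainder goes to training.
val_count = int(total_images * validation_split)
train_count = total_images - val_count

print(train_count, val_count)  # 2936 734
```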
You can find the class names in the class_names attribute of these datasets. They correspond to the directory names in alphabetical order.
class_names = train_ds.class_names
print(class_names)
['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']
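Since the class names are simply the sub-directory names sorted alphabetically, the label ordering can be reproduced with plain Python:

```python
# Sub-directory names of the flower_photos dataset, in arbitrary order.
directories = ["tulips", "daisy", "roses", "sunflowers", "dandelion"]

# image_dataset_from_directory assigns integer labels in alphabetical order.
class_names = sorted(directories)
print(class_names)  # ['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']
```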
Visualize the Data¶
Here are the first 9 images from the training dataset.
plt.figure(figsize=(10, 10))
for images, labels in train_ds.take(1):
for i in range(9):
ax = plt.subplot(3, 3, i + 1)
plt.imshow(images[i].numpy().astype("uint8"))
plt.title(class_names[labels[i]])
plt.axis("off")
You will train a model using these datasets by passing them to model.fit in a moment. If you like, you can also manually iterate over the dataset and retrieve batches of images:
for image_batch, labels_batch in train_ds:
print(image_batch.shape)
print(labels_batch.shape)
break
(32, 180, 180, 3)
(32,)
The image_batch is a tensor of shape (32, 180, 180, 3). This is a batch of 32 images of shape 180x180x3 (the last dimension refers to the RGB color channels). The labels_batch is a tensor of shape (32,); these are the labels corresponding to the 32 images. You can call .numpy() on the image_batch and labels_batch tensors to convert them to a numpy.ndarray.
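To make the batch layout concrete without running the input pipeline, here is an illustrative NumPy stand-in (zero-filled arrays; only the shapes and dtypes matter):

```python
import numpy as np

# A stand-in for one batch from the dataset:
# 32 RGB images of 180x180 pixels, plus one integer label per image.
image_batch = np.zeros((32, 180, 180, 3), dtype=np.float32)
labels_batch = np.zeros((32,), dtype=np.int32)

print(image_batch.shape)   # (32, 180, 180, 3)
print(labels_batch.shape)  # (32,)
```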
Configure the Dataset for Performance¶
Let’s make sure to use buffered prefetching so you can yield data from disk without I/O becoming a bottleneck. These are two important methods you should use when loading data:
Dataset.cache() keeps the images in memory after they’re loaded off disk during the first epoch. This ensures the dataset does not become a bottleneck while training your model. If your dataset is too large to fit into memory, you can also use this method to create a performant on-disk cache.
Dataset.prefetch() overlaps data preprocessing and model execution while training.
Interested readers can learn more about both methods, as well as how to cache data to disk in the data performance guide.
AUTOTUNE = tf.data.AUTOTUNE
train_ds = train_ds.cache().shuffle(1000).prefetch(buffer_size=AUTOTUNE)
val_ds = val_ds.cache().prefetch(buffer_size=AUTOTUNE)
Standardize the Data¶
The RGB channel values are in the [0, 255] range. This is not ideal for a neural network; in general, you should seek to make your input values small. Here, you will standardize values to be in the [0, 1] range by using a Rescaling layer.
normalization_layer = layers.Rescaling(1./255)
Note: The Keras Preprocessing utilities and layers introduced in this section are currently experimental and may change.
There are two ways to use this layer. You can apply it to the dataset by calling map:
normalized_ds = train_ds.map(lambda x, y: (normalization_layer(x), y))
image_batch, labels_batch = next(iter(normalized_ds))
first_image = image_batch[0]
# Notice the pixel values are now in `[0, 1]`.
print(np.min(first_image), np.max(first_image))
0.0 1.0
Or, you can include the layer inside your model definition, which can simplify deployment. Let’s use the second approach here.
Note: you previously resized images using the image_size argument of image_dataset_from_directory. If you want to include the resizing logic in your model as well, you can use the Resizing layer.
Create the Model¶
The model consists of three convolution blocks, each followed by a max pooling layer. On top of them is a fully connected layer with 128 units, activated by a relu activation function. This model has not been tuned for high accuracy; the goal of this tutorial is to show a standard approach.
num_classes = 5
model = Sequential([
layers.experimental.preprocessing.Rescaling(1./255, input_shape=(img_height, img_width, 3)),
layers.Conv2D(16, 3, padding='same', activation='relu'),
layers.MaxPooling2D(),
layers.Conv2D(32, 3, padding='same', activation='relu'),
layers.MaxPooling2D(),
layers.Conv2D(64, 3, padding='same', activation='relu'),
layers.MaxPooling2D(),
layers.Flatten(),
layers.Dense(128, activation='relu'),
layers.Dense(num_classes)
])
Compile the Model¶
For this tutorial, choose the optimizers.Adam optimizer and the losses.SparseCategoricalCrossentropy loss function. To view training and validation accuracy for each training epoch, pass the metrics argument.
model.compile(optimizer='adam',
loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
metrics=['accuracy'])
Model Summary¶
View all the layers of the network using the model’s summary method.
NOTE: This section is commented out for performance reasons. Please feel free to uncomment it to compare the results.
# model.summary()
Train the Model¶
# epochs=10
# history = model.fit(
# train_ds,
# validation_data=val_ds,
# epochs=epochs
# )
Visualize Training Results¶
Create plots of loss and accuracy on the training and validation sets.
# acc = history.history['accuracy']
# val_acc = history.history['val_accuracy']
# loss = history.history['loss']
# val_loss = history.history['val_loss']
# epochs_range = range(epochs)
# plt.figure(figsize=(8, 8))
# plt.subplot(1, 2, 1)
# plt.plot(epochs_range, acc, label='Training Accuracy')
# plt.plot(epochs_range, val_acc, label='Validation Accuracy')
# plt.legend(loc='lower right')
# plt.title('Training and Validation Accuracy')
# plt.subplot(1, 2, 2)
# plt.plot(epochs_range, loss, label='Training Loss')
# plt.plot(epochs_range, val_loss, label='Validation Loss')
# plt.legend(loc='upper right')
# plt.title('Training and Validation Loss')
# plt.show()
As you can see from the plots, training accuracy and validation accuracy are off by a large margin, and the model has achieved only around 60% accuracy on the validation set.
Let’s look at what went wrong and try to increase the overall performance of the model.
Overfitting¶
In the plots above, the training accuracy increases linearly over time, whereas the validation accuracy stalls around 60% during training. The noticeable gap between training and validation accuracy is a sign of overfitting.
When there are a small number of training examples, the model sometimes learns noise or unwanted details from the training examples, to the extent that it negatively impacts the performance of the model on new examples. This phenomenon is known as overfitting. It means that the model will have a difficult time generalizing on a new dataset.
There are multiple ways to fight overfitting in the training process. In this tutorial, you’ll use data augmentation and add Dropout to your model.
Data Augmentation¶
Overfitting generally occurs when there are a small number of training examples. Data augmentation takes the approach of generating additional training data from your existing examples by augmenting them using random transformations that yield believable-looking images. This helps expose the model to more aspects of the data and generalize better.
You will implement data augmentation using the layers from tf.keras.layers.experimental.preprocessing. These can be included inside your model like other layers, and run on the GPU.
data_augmentation = keras.Sequential(
[
layers.RandomFlip("horizontal",
input_shape=(img_height,
img_width,
3)),
layers.RandomRotation(0.1),
layers.RandomZoom(0.1),
]
)
Let’s visualize what a few augmented examples look like by applying data augmentation to the same image several times:
plt.figure(figsize=(10, 10))
for images, _ in train_ds.take(1):
for i in range(9):
augmented_images = data_augmentation(images)
ax = plt.subplot(3, 3, i + 1)
plt.imshow(augmented_images[0].numpy().astype("uint8"))
plt.axis("off")
You will use data augmentation to train a model in a moment.
Dropout¶
Another technique to reduce overfitting is to introduce Dropout to the network, a form of regularization.
When you apply Dropout to a layer, it randomly drops out (by setting the activation to zero) a number of output units from the layer during the training process. Dropout takes a fractional number as its input value, such as 0.1, 0.2, or 0.4. This means dropping out 10%, 20%, or 40% of the output units at random from the applied layer.
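The training-time effect of a 0.2 dropout rate can be illustrated with a minimal NumPy sketch. This is not the Keras implementation; note that Keras also uses "inverted dropout", scaling the surviving units by 1/(1 - rate) so the expected activation is unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
rate = 0.2  # drop 20% of the output units

activations = np.ones(10_000, dtype=np.float32)

# Each unit is zeroed independently with probability `rate`; the kept units
# are scaled by 1/(1 - rate) so the expected activation stays the same.
keep_mask = rng.random(activations.shape) >= rate
dropped = np.where(keep_mask, activations / (1.0 - rate), 0.0)

print(keep_mask.mean())  # fraction of units kept, close to 0.8
print(dropped.mean())    # expected activation preserved, close to 1.0
```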
Let’s create a new neural network using layers.Dropout, then train it using augmented images.
model = Sequential([
data_augmentation,
layers.Rescaling(1./255),
layers.Conv2D(16, 3, padding='same', activation='relu'),
layers.MaxPooling2D(),
layers.Conv2D(32, 3, padding='same', activation='relu'),
layers.MaxPooling2D(),
layers.Conv2D(64, 3, padding='same', activation='relu'),
layers.MaxPooling2D(),
layers.Dropout(0.2),
layers.Flatten(),
layers.Dense(128, activation='relu'),
layers.Dense(num_classes, name="outputs")
])
Compile and Train the Model¶
model.compile(optimizer='adam',
loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
metrics=['accuracy'])
model.summary()
Model: "sequential_2"
_________________________________________________________________
 Layer (type)                     Output Shape              Param #
=================================================================
 sequential_1 (Sequential)        (None, 180, 180, 3)       0
 rescaling_2 (Rescaling)          (None, 180, 180, 3)       0
 conv2d_3 (Conv2D)                (None, 180, 180, 16)      448
 max_pooling2d_3 (MaxPooling2D)   (None, 90, 90, 16)        0
 conv2d_4 (Conv2D)                (None, 90, 90, 32)        4640
 max_pooling2d_4 (MaxPooling2D)   (None, 45, 45, 32)        0
 conv2d_5 (Conv2D)                (None, 45, 45, 64)        18496
 max_pooling2d_5 (MaxPooling2D)   (None, 22, 22, 64)        0
 dropout (Dropout)                (None, 22, 22, 64)        0
 flatten_1 (Flatten)              (None, 30976)             0
 dense_2 (Dense)                  (None, 128)               3965056
 outputs (Dense)                  (None, 5)                 645
=================================================================
Total params: 3,989,285
Trainable params: 3,989,285
Non-trainable params: 0
_________________________________________________________________
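The parameter counts in the summary can be verified by hand from the layer shapes:

```python
# Three 2x2 max-pools shrink the spatial size 180 -> 90 -> 45 -> 22
# (integer division), so the Flatten output has 22 * 22 * 64 elements.
flat = 22 * 22 * 64                  # 30976

# Conv2D params = filters * (kernel_h * kernel_w * in_channels) + biases
conv1 = 16 * (3 * 3 * 3) + 16        # first Conv2D: 448

# Dense params = inputs * units + biases
dense = flat * 128 + 128             # 128-unit Dense: 3965056
outputs = 128 * 5 + 5                # 5-class output Dense: 645

print(flat, conv1, dense, outputs)
```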
epochs = 15
history = model.fit(
train_ds,
validation_data=val_ds,
epochs=epochs
)
Epoch 1/15
92/92 [==============================] - 7s 67ms/step - loss: 1.3797 - accuracy: 0.4026 - val_loss: 1.1118 - val_accuracy: 0.5763
Epoch 2/15
89/92 [============================>.] - ETA: 0s - loss: 1.0556 - accuracy: 0.5880
90/92 [============================>.] - ETA: 0s - loss: 1.0545 - accuracy: 0.5874
91/92 [============================>.] - ETA: 0s - loss: 1.0543 - accuracy: 0.5868
92/92 [==============================] - ETA: 0s - loss: 1.0518 - accuracy: 0.5882
92/92 [==============================] - 6s 64ms/step - loss: 1.0518 - accuracy: 0.5882 - val_loss: 0.9841 - val_accuracy: 0.5981
Epoch 3/15
92/92 [==============================] - 6s 64ms/step - loss: 0.9395 - accuracy: 0.6362 - val_loss: 0.9104 - val_accuracy: 0.6226
Epoch 4/15
92/92 [==============================] - 6s 64ms/step - loss: 0.8800 - accuracy: 0.6570 - val_loss: 0.9390 - val_accuracy: 0.6553
Epoch 5/15
92/92 [==============================] - 6s 64ms/step - loss: 0.8332 - accuracy: 0.6737 - val_loss: 0.8496 - val_accuracy: 0.6744
Epoch 6/15
92/92 [==============================] - 6s 64ms/step - loss: 0.8013 - accuracy: 0.6952 - val_loss: 0.7885 - val_accuracy: 0.6921
Epoch 7/15
51/92 [===============>…………..] - ETA: 2s - loss: 0.7776 - accuracy: 0.6841
52/92 [===============>…………..] - ETA: 2s - loss: 0.7777 - accuracy: 0.6842
53/92 [================>………….] - ETA: 2s - loss: 0.7754 - accuracy: 0.6842
54/92 [================>………….] - ETA: 2s - loss: 0.7761 - accuracy: 0.6837
55/92 [================>………….] - ETA: 2s - loss: 0.7774 - accuracy: 0.6832
56/92 [=================>…………] - ETA: 2s - loss: 0.7812 - accuracy: 0.6811
57/92 [=================>…………] - ETA: 2s - loss: 0.7819 - accuracy: 0.6817
58/92 [=================>…………] - ETA: 1s - loss: 0.7844 - accuracy: 0.6807
59/92 [==================>………..] - ETA: 1s - loss: 0.7844 - accuracy: 0.6809
60/92 [==================>………..] - ETA: 1s - loss: 0.7820 - accuracy: 0.6825
61/92 [==================>………..] - ETA: 1s - loss: 0.7807 - accuracy: 0.6831
62/92 [===================>……….] - ETA: 1s - loss: 0.7782 - accuracy: 0.6832
63/92 [===================>……….] - ETA: 1s - loss: 0.7790 - accuracy: 0.6843
64/92 [===================>……….] - ETA: 1s - loss: 0.7773 - accuracy: 0.6858
65/92 [====================>………] - ETA: 1s - loss: 0.7760 - accuracy: 0.6873
66/92 [====================>………] - ETA: 1s - loss: 0.7741 - accuracy: 0.6882
67/92 [====================>………] - ETA: 1s - loss: 0.7741 - accuracy: 0.6868
68/92 [=====================>……..] - ETA: 1s - loss: 0.7709 - accuracy: 0.6877
69/92 [=====================>……..] - ETA: 1s - loss: 0.7695 - accuracy: 0.6895
70/92 [=====================>……..] - ETA: 1s - loss: 0.7678 - accuracy: 0.6918
71/92 [======================>…….] - ETA: 1s - loss: 0.7654 - accuracy: 0.6930
72/92 [======================>…….] - ETA: 1s - loss: 0.7645 - accuracy: 0.6925
73/92 [======================>…….] - ETA: 1s - loss: 0.7624 - accuracy: 0.6937
74/92 [=======================>……] - ETA: 1s - loss: 0.7617 - accuracy: 0.6945
75/92 [=======================>……] - ETA: 0s - loss: 0.7631 - accuracy: 0.6940
76/92 [=======================>……] - ETA: 0s - loss: 0.7600 - accuracy: 0.6947
77/92 [========================>…..] - ETA: 0s - loss: 0.7557 - accuracy: 0.6963
78/92 [========================>…..] - ETA: 0s - loss: 0.7526 - accuracy: 0.6982
79/92 [========================>…..] - ETA: 0s - loss: 0.7559 - accuracy: 0.6976
80/92 [=========================>….] - ETA: 0s - loss: 0.7533 - accuracy: 0.6995
81/92 [=========================>….] - ETA: 0s - loss: 0.7518 - accuracy: 0.7009
82/92 [=========================>….] - ETA: 0s - loss: 0.7533 - accuracy: 0.7007
83/92 [==========================>…] - ETA: 0s - loss: 0.7527 - accuracy: 0.7005
84/92 [==========================>…] - ETA: 0s - loss: 0.7522 - accuracy: 0.7007
85/92 [==========================>…] - ETA: 0s - loss: 0.7542 - accuracy: 0.7006
86/92 [===========================>..] - ETA: 0s - loss: 0.7532 - accuracy: 0.7008
87/92 [===========================>..] - ETA: 0s - loss: 0.7522 - accuracy: 0.7014
88/92 [===========================>..] - ETA: 0s - loss: 0.7515 - accuracy: 0.7016
89/92 [============================>.] - ETA: 0s - loss: 0.7499 - accuracy: 0.7021
90/92 [============================>.] - ETA: 0s - loss: 0.7476 - accuracy: 0.7026
91/92 [============================>.] - ETA: 0s - loss: 0.7485 - accuracy: 0.7025
92/92 [==============================] - ETA: 0s - loss: 0.7487 - accuracy: 0.7027
92/92 [==============================] - 6s 64ms/step - loss: 0.7487 - accuracy: 0.7027 - val_loss: 0.7632 - val_accuracy: 0.7180
Epoch 8/15
92/92 [==============================] - 6s 64ms/step - loss: 0.7012 - accuracy: 0.7316 - val_loss: 0.7702 - val_accuracy: 0.7153
Epoch 9/15
92/92 [==============================] - 6s 64ms/step - loss: 0.6711 - accuracy: 0.7480 - val_loss: 0.8020 - val_accuracy: 0.6962
Epoch 10/15
92/92 [==============================] - 6s 63ms/step - loss: 0.6408 - accuracy: 0.7554 - val_loss: 0.7525 - val_accuracy: 0.7112
Epoch 11/15
92/92 [==============================] - 6s 64ms/step - loss: 0.6071 - accuracy: 0.7657 - val_loss: 0.6969 - val_accuracy: 0.7316
Epoch 12/15
92/92 [==============================] - 6s 64ms/step - loss: 0.5875 - accuracy: 0.7721 - val_loss: 0.7032 - val_accuracy: 0.7384
Epoch 13/15
92/92 [==============================] - 6s 64ms/step - loss: 0.5650 - accuracy: 0.7899 - val_loss: 0.6865 - val_accuracy: 0.7384
Epoch 14/15
92/92 [==============================] - 6s 64ms/step - loss: 0.5172 - accuracy: 0.8031 - val_loss: 0.7685 - val_accuracy: 0.7207
Epoch 15/15
92/92 [==============================] - 6s 64ms/step - loss: 0.5179 - accuracy: 0.7994 - val_loss: 0.7059 - val_accuracy: 0.7357
Visualize Training Results¶
After applying data augmentation and Dropout, there is less overfitting than before, and the training and validation accuracy curves are more closely aligned.
acc = history.history['accuracy']
val_acc = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs_range = range(epochs)
plt.figure(figsize=(8, 8))
plt.subplot(1, 2, 1)
plt.plot(epochs_range, acc, label='Training Accuracy')
plt.plot(epochs_range, val_acc, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')
plt.subplot(1, 2, 2)
plt.plot(epochs_range, loss, label='Training Loss')
plt.plot(epochs_range, val_loss, label='Validation Loss')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')
plt.show()
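Beyond plotting, the `history.history` dictionary can be inspected directly, for example to find the epoch with the best validation accuracy. A minimal sketch with made-up metric values standing in for a real training run:

```python
# history.history maps metric names to lists of per-epoch values.
# Hypothetical values in place of a real Keras History object.
history_dict = {
    "accuracy":     [0.71, 0.77, 0.79, 0.80],
    "val_accuracy": [0.70, 0.73, 0.74, 0.72],
}

# Epoch with the highest validation accuracy (1-indexed, as Keras prints epochs).
best_epoch = max(range(len(history_dict["val_accuracy"])),
                 key=lambda i: history_dict["val_accuracy"][i]) + 1
print(f"Best val_accuracy {history_dict['val_accuracy'][best_epoch - 1]:.4f} "
      f"at epoch {best_epoch}")
```

With a real `history` object, `history.history` has exactly this structure, including the `loss` and `val_loss` keys used in the plots above.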
Predict on New Data¶
Finally, let us use the model to classify an image that was not included in the training or validation sets.
Note: Data augmentation and Dropout layers are inactive at inference time.
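The note above is the key behavioral difference between training and inference mode. Conceptually, inverted dropout zeroes random activations and rescales the rest by `1 / (1 - rate)` during training, and is an identity function at inference. A minimal NumPy sketch of that behavior (an illustration, not Keras' actual implementation):

```python
import numpy as np

def dropout(x, rate, training, rng=None):
    """Inverted dropout: random zeroing plus rescaling when training, identity otherwise."""
    if not training:
        return x  # Dropout is a no-op at inference time.
    rng = rng or np.random.default_rng(0)
    keep_mask = rng.random(x.shape) >= rate
    # Rescale kept activations so the expected value matches inference mode.
    return x * keep_mask / (1.0 - rate)

x = np.ones((4, 4))
print(dropout(x, rate=0.5, training=False))  # identical to x
print(dropout(x, rate=0.5, training=True))   # entries are either 0.0 or 2.0
```

Keras layers such as `Dropout` and the `RandomFlip`/`RandomRotation` augmentation layers switch between these modes automatically based on the `training` argument, so `model.predict` runs them in inference mode.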
sunflower_url = "https://storage.googleapis.com/download.tensorflow.org/example_images/592px-Red_sunflower.jpg"
sunflower_path = tf.keras.utils.get_file('Red_sunflower', origin=sunflower_url)
img = keras.preprocessing.image.load_img(
sunflower_path, target_size=(img_height, img_width)
)
img_array = keras.preprocessing.image.img_to_array(img)
img_array = tf.expand_dims(img_array, 0) # Create a batch
predictions = model.predict(img_array)
score = tf.nn.softmax(predictions[0])
print(
"This image most likely belongs to {} with a {:.2f} percent confidence."
.format(class_names[np.argmax(score)], 100 * np.max(score))
)
1/1 [==============================] - ETA: 0s
1/1 [==============================] - 0s 99ms/step
This image most likely belongs to sunflowers with a 96.93 percent confidence.
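The confidence printed above comes from applying softmax to the model's raw logits, as in the cell before it. A minimal NumPy sketch of the same post-processing, with hypothetical logits in place of real model output:

```python
import numpy as np

class_names = ["daisy", "dandelion", "roses", "sunflowers", "tulips"]
logits = np.array([0.3, -1.2, 0.1, 3.5, 0.4])  # hypothetical model output

# Numerically stable softmax: subtract the max before exponentiating.
exp = np.exp(logits - logits.max())
score = exp / exp.sum()

print(f"{class_names[np.argmax(score)]}: {100 * score.max():.2f}% confidence")
```

`tf.nn.softmax(predictions[0])` performs the same normalization; the scores always sum to 1, so the maximum can be read as a percentage.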
Save the TensorFlow Model¶
# Save the trained model. A new folder "flower" will be created,
# containing the model graph in "saved_model.pb".
model_dir = "model"
saved_model_dir = f"{model_dir}/flower/saved_model"
model.save(saved_model_dir)
2024-03-13 01:04:01.596956: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'random_flip_input' with dtype float and shape [?,180,180,3]
[[{{node random_flip_input}}]]
WARNING:absl:Found untraced functions such as _jit_compiled_convolution_op, _jit_compiled_convolution_op, _jit_compiled_convolution_op, _update_step_xla while saving (showing 4 of 4). These functions will not be directly callable after loading.
INFO:tensorflow:Assets written to: model/flower/saved_model/assets
Convert the TensorFlow model with OpenVINO Model Conversion API¶
To convert the model to OpenVINO IR with FP16 precision, use the model conversion Python API.
# Convert the model to OpenVINO IR format and save it.
ir_model_path = Path("model/flower")
ir_model_path.mkdir(parents=True, exist_ok=True)
ir_model = ov.convert_model(saved_model_dir, input=[1,180,180,3])
ov.save_model(ir_model, ir_model_path / "flower_ir.xml")
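Passing `input=[1,180,180,3]` replaces the SavedModel's dynamic batch dimension (the `?` in the `[?,180,180,3]` placeholder shapes logged above) with a static batch size of 1. A hypothetical helper, purely for illustration, of how a dynamic dimension (written here as `-1` or `None`) accepts any size while a static one must match exactly:

```python
def shape_matches(model_shape, data_shape):
    # A dimension of -1 or None in the model shape means "any size"
    if len(model_shape) != len(data_shape):
        return False
    return all(m in (-1, None) or m == d for m, d in zip(model_shape, data_shape))

print(shape_matches([-1, 180, 180, 3], (1, 180, 180, 3)))  # True: dynamic batch accepts 1
print(shape_matches([1, 180, 180, 3], (2, 180, 180, 3)))   # False: static batch of 1
```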
Preprocessing Image Function¶
def pre_process_image(imagePath, img_height=180):
    # Model input format: NHWC
    n, h, w, c = [1, img_height, img_height, 3]
    image = Image.open(imagePath)
    image = image.resize((w, h), resample=Image.BILINEAR)

    # Convert to array and add a leading batch dimension (layout stays HWC)
    image = np.array(image)
    input_image = image.reshape((n, h, w, c))

    return input_image
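Note that the function keeps the NHWC layout; it only resizes the image and prepends a batch dimension so the array matches the network's input shape. That reshape step can be exercised with a synthetic image, no image file required:

```python
import numpy as np

n, h, w, c = 1, 180, 180, 3
# Fake HWC image standing in for the decoded JPEG
image = np.random.randint(0, 256, size=(h, w, c), dtype=np.uint8)

# Prepend the batch dimension: (180, 180, 3) -> (1, 180, 180, 3)
input_image = image.reshape((n, h, w, c))
print(input_image.shape)  # (1, 180, 180, 3)
```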
OpenVINO Runtime Setup¶
Select inference device¶
Select a device from the dropdown list to run inference with OpenVINO.
import ipywidgets as widgets
# Initialize OpenVINO runtime
core = ov.Core()
device = widgets.Dropdown(
    options=core.available_devices + ["AUTO"],
    value='AUTO',
    description='Device:',
    disabled=False,
)
device
Dropdown(description='Device:', index=1, options=('CPU', 'AUTO'), value='AUTO')
class_names = ["daisy", "dandelion", "roses", "sunflowers", "tulips"]
compiled_model = core.compile_model(model=ir_model, device_name=device.value)
del ir_model
input_layer = compiled_model.input(0)
output_layer = compiled_model.output(0)
Run the Inference Step¶
# Run inference on the input image...
inp_img_url = "https://upload.wikimedia.org/wikipedia/commons/4/48/A_Close_Up_Photo_of_a_Dandelion.jpg"
OUTPUT_DIR = "output"
inp_file_name = "A_Close_Up_Photo_of_a_Dandelion.jpg"
file_path = Path(OUTPUT_DIR)/Path(inp_file_name)
os.makedirs(OUTPUT_DIR, exist_ok=True)
# Download the image
download_file(inp_img_url, inp_file_name, directory=OUTPUT_DIR)
# Pre-process the image and get it ready for inference.
input_image = pre_process_image(file_path)
print(input_image.shape)
print(input_layer.shape)
res = compiled_model([input_image])[output_layer]
score = tf.nn.softmax(res[0])
# Show the results
image = Image.open(file_path)
plt.imshow(image)
print(
    "This image most likely belongs to {} with a {:.2f} percent confidence."
    .format(class_names[np.argmax(score)], 100 * np.max(score))
)
'output/A_Close_Up_Photo_of_a_Dandelion.jpg' already exists.
(1, 180, 180, 3)
[1,180,180,3]
This image most likely belongs to dandelion with a 95.08 percent confidence.
The Next Steps¶
This tutorial showed how to train a TensorFlow model, how to convert that model to OpenVINO’s IR format, and how to do inference on the converted model. For faster inference speed, you can quantize the IR model. To see how to quantize this model with OpenVINO’s Post-training Quantization with NNCF Tool, check out the Post-Training Quantization with TensorFlow Classification Model notebook.