From Training to Deployment with TensorFlow and OpenVINO™¶
This Jupyter notebook can be launched after a local installation only.
# @title Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Copyright 2018 The TensorFlow Authors
#
# Modified for OpenVINO Notebooks
This tutorial demonstrates how to train, convert, and deploy an image classification model with TensorFlow and OpenVINO. In this notebook, the freshly trained model is converted to OpenVINO IR with the model conversion API, and inference is then performed on the converted model. For faster inference on the model created in this notebook, check out the Post-Training Quantization with TensorFlow Classification Model notebook.
The training part of this notebook reproduces the official TensorFlow Image Classification tutorial in its entirety.
The flower_ir.bin and flower_ir.xml (pre-trained model) files can be obtained by executing the notebook with ‘Runtime->Run All’ or the Ctrl+F9 command.
%pip install -q "openvino>=2023.1.0"
Note: you may need to restart the kernel to use updated packages.
TensorFlow Image Classification Training¶
The first part of the tutorial shows how to classify images of flowers (based on TensorFlow’s official tutorial). It creates an image classifier using a keras.Sequential model, and loads data using preprocessing.image_dataset_from_directory. You will gain practical experience with the following concepts:
Efficiently loading a dataset off disk.
Identifying overfitting and applying techniques to mitigate it, including data augmentation and Dropout.
This tutorial follows a basic machine learning workflow:
Examine and understand data
Build an input pipeline
Build the model
Train the model
Test the model
Import TensorFlow and Other Libraries¶
import os
import sys
from pathlib import Path
import PIL
import matplotlib.pyplot as plt
import numpy as np
import tensorflow as tf
from PIL import Image
import openvino as ov
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.models import Sequential
sys.path.append("../utils")
from notebook_utils import download_file
2024-02-10 01:12:04.614496: I tensorflow/core/util/port.cc:110] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable TF_ENABLE_ONEDNN_OPTS=0. 2024-02-10 01:12:04.649325: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2024-02-10 01:12:05.161353: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Download and Explore the Dataset¶
This tutorial uses a dataset of about 3,700 photos of flowers. The dataset contains 5 sub-directories, one per class:
flower_photo/
daisy/
dandelion/
roses/
sunflowers/
tulips/
import pathlib
dataset_url = "https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz"
data_dir = tf.keras.utils.get_file('flower_photos', origin=dataset_url, untar=True)
data_dir = pathlib.Path(data_dir)
After downloading, you should now have a copy of the dataset available. There are 3,670 total images:
image_count = len(list(data_dir.glob('*/*.jpg')))
print(image_count)
3670
Here are some roses:
roses = list(data_dir.glob('roses/*'))
PIL.Image.open(str(roses[0]))
PIL.Image.open(str(roses[1]))
And some tulips:
tulips = list(data_dir.glob('tulips/*'))
PIL.Image.open(str(tulips[0]))
PIL.Image.open(str(tulips[1]))
Load Using keras.preprocessing¶
Let’s load these images off disk using the helpful image_dataset_from_directory utility. This will take you from a directory of images on disk to a tf.data.Dataset in just a couple of lines of code. If you like, you can also write your own data loading code from scratch by visiting the load images tutorial.
Create a Dataset¶
Define some parameters for the loader:
batch_size = 32
img_height = 180
img_width = 180
It’s good practice to use a validation split when developing your model. Let’s use 80% of the images for training, and 20% for validation.
train_ds = tf.keras.preprocessing.image_dataset_from_directory(
data_dir,
validation_split=0.2,
subset="training",
seed=123,
image_size=(img_height, img_width),
batch_size=batch_size)
Found 3670 files belonging to 5 classes.
Using 2936 files for training.
2024-02-10 01:12:08.217732: E tensorflow/compiler/xla/stream_executor/cuda/cuda_driver.cc:266] failed call to cuInit: CUDA_ERROR_COMPAT_NOT_SUPPORTED_ON_DEVICE: forward compatibility was attempted on non supported HW
2024-02-10 01:12:08.217763: I tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:168] retrieving CUDA diagnostic information for host: iotg-dev-workstation-07
2024-02-10 01:12:08.217767: I tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:175] hostname: iotg-dev-workstation-07
2024-02-10 01:12:08.217894: I tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:199] libcuda reported version is: 470.223.2
2024-02-10 01:12:08.217909: I tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:203] kernel reported version is: 470.182.3
2024-02-10 01:12:08.217913: E tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:312] kernel version 470.182.3 does not match DSO version 470.223.2 -- cannot find working devices in this configuration
val_ds = tf.keras.preprocessing.image_dataset_from_directory(
data_dir,
validation_split=0.2,
subset="validation",
seed=123,
image_size=(img_height, img_width),
batch_size=batch_size)
Found 3670 files belonging to 5 classes.
Using 734 files for validation.
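As a quick sanity check, the file counts reported above match an 80/20 split of the 3,670 images; for this dataset the arithmetic happens to divide exactly (plain-Python check, not part of the original tutorial):

```python
# Verify the split reported by image_dataset_from_directory above.
total_images = 3670
train_count = int(total_images * 0.8)   # 2936 files for training
val_count = total_images - train_count  # 734 files for validation

print(train_count, val_count)  # → 2936 734
```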
You can find the class names in the class_names attribute on these datasets. These correspond to the directory names in alphabetical order.
class_names = train_ds.class_names
print(class_names)
['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']
Visualize the Data¶
Here are the first 9 images from the training dataset.
plt.figure(figsize=(10, 10))
for images, labels in train_ds.take(1):
for i in range(9):
ax = plt.subplot(3, 3, i + 1)
plt.imshow(images[i].numpy().astype("uint8"))
plt.title(class_names[labels[i]])
plt.axis("off")
2024-02-10 01:12:08.550492: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'Placeholder/_0' with dtype string and shape [2936]
[[{{node Placeholder/_0}}]]
You will train a model using these datasets by passing them to model.fit in a moment. If you like, you can also manually iterate over the dataset and retrieve batches of images:
for image_batch, labels_batch in train_ds:
print(image_batch.shape)
print(labels_batch.shape)
break
(32, 180, 180, 3)
(32,)
2024-02-10 01:12:09.380029: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'Placeholder/_4' with dtype int32 and shape [2936]
[[{{node Placeholder/_4}}]]
The image_batch is a tensor of shape (32, 180, 180, 3). This is a batch of 32 images of shape 180x180x3 (the last dimension refers to the RGB color channels). The label_batch is a tensor of shape (32,); these are the corresponding labels for the 32 images.
You can call .numpy() on the image_batch and labels_batch tensors to convert them to a numpy.ndarray.
Configure the Dataset for Performance¶
Let’s make sure to use buffered prefetching, so you can yield data from disk without I/O becoming blocking. There are two important methods you should use when loading data:
Dataset.cache() keeps the images in memory after they’re loaded off disk during the first epoch. This ensures the dataset does not become a bottleneck while training your model. If your dataset is too large to fit into memory, you can also use this method to create a performant on-disk cache.
Dataset.prefetch() overlaps data preprocessing and model execution while training.
Interested readers can learn more about both methods, as well as how to cache data to disk in the data performance guide.
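As a minimal sketch of these two methods (using a toy numeric dataset in place of the flower images), note that passing a file path to cache() would create the on-disk cache mentioned above:

```python
import tensorflow as tf

# Toy dataset standing in for the flower images.
ds = tf.data.Dataset.range(8).map(lambda x: x * x)

# cache() keeps elements in memory after the first full pass;
# cache("/path/to/cache_file") would spill them to disk instead.
# prefetch() overlaps producing the next element with consuming the current one.
ds = ds.cache().prefetch(buffer_size=tf.data.AUTOTUNE)

print([int(v) for v in ds])  # → [0, 1, 4, 9, 16, 25, 36, 49]
```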
AUTOTUNE = tf.data.AUTOTUNE
train_ds = train_ds.cache().shuffle(1000).prefetch(buffer_size=AUTOTUNE)
val_ds = val_ds.cache().prefetch(buffer_size=AUTOTUNE)
Standardize the Data¶
The RGB channel values are in the [0, 255] range. This is not ideal for a neural network; in general, you should seek to make your input values small. Here, you will standardize values to the [0, 1] range by using a Rescaling layer.
normalization_layer = layers.Rescaling(1./255)
Note: The Keras Preprocessing utilities and layers introduced in this section are currently experimental and may change.
There are two ways to use this layer. You can apply it to the dataset by calling map:
normalized_ds = train_ds.map(lambda x, y: (normalization_layer(x), y))
image_batch, labels_batch = next(iter(normalized_ds))
first_image = image_batch[0]
# Notice the pixels values are now in `[0,1]`.
print(np.min(first_image), np.max(first_image))
2024-02-10 01:12:09.568220: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'Placeholder/_4' with dtype int32 and shape [2936]
[[{{node Placeholder/_4}}]]
0.0 1.0
Or, you can include the layer inside your model definition, which can simplify deployment. Let’s use the second approach here.
Note: you previously resized images using the image_size argument of image_dataset_from_directory. If you want to include the resizing logic in your model as well, you can use the Resizing layer.
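A minimal sketch of that approach, assuming the non-experimental layers.Resizing name available in recent TensorFlow releases (this snippet is illustrative and is not used by the rest of the notebook):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Resize and rescale inside the model, so deployed inputs need
# no separate preprocessing step.
resize_and_rescale = tf.keras.Sequential([
    layers.Resizing(180, 180),       # target size matches this notebook
    layers.Rescaling(1.0 / 255),
])

# A batch of 4 larger images; the model resizes them itself.
batch = tf.random.uniform((4, 300, 240, 3), maxval=255.0)
out = resize_and_rescale(batch)
print(out.shape)  # (4, 180, 180, 3)
```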
Create the Model¶
The model consists of three convolution blocks with a max pooling layer in each of them. On top of them is a fully connected layer with 128 units, activated by a relu activation function. This model has not been tuned for high accuracy; the goal of this tutorial is to show a standard approach.
num_classes = 5
model = Sequential([
layers.experimental.preprocessing.Rescaling(1./255, input_shape=(img_height, img_width, 3)),
layers.Conv2D(16, 3, padding='same', activation='relu'),
layers.MaxPooling2D(),
layers.Conv2D(32, 3, padding='same', activation='relu'),
layers.MaxPooling2D(),
layers.Conv2D(64, 3, padding='same', activation='relu'),
layers.MaxPooling2D(),
layers.Flatten(),
layers.Dense(128, activation='relu'),
layers.Dense(num_classes)
])
Compile the Model¶
For this tutorial, choose the optimizers.Adam optimizer and the losses.SparseCategoricalCrossentropy loss function. To view training and validation accuracy for each training epoch, pass the metrics argument.
model.compile(optimizer='adam',
loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
metrics=['accuracy'])
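Because the final Dense layer has no activation, the model outputs raw logits, which is why from_logits=True is passed to the loss. To read predictions as class probabilities, you can apply a softmax afterwards; a NumPy sketch with made-up logits:

```python
import numpy as np

def softmax(logits, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

logits = np.array([[2.0, 1.0, 0.1, -1.0, 0.5]])  # hypothetical model output
probs = softmax(logits)

print(round(float(probs.sum()), 6))  # → 1.0 (probabilities sum to one)
print(int(probs.argmax()))           # → 0 (index of the predicted class)
```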
Model Summary¶
View all the layers of the network using the model’s summary method.
NOTE: This section is commented out for performance reasons. Please feel free to uncomment these to compare the results.
# model.summary()
Train the Model¶
# epochs=10
# history = model.fit(
# train_ds,
# validation_data=val_ds,
# epochs=epochs
# )
Visualize Training Results¶
Create plots of loss and accuracy on the training and validation sets.
# acc = history.history['accuracy']
# val_acc = history.history['val_accuracy']
# loss = history.history['loss']
# val_loss = history.history['val_loss']
# epochs_range = range(epochs)
# plt.figure(figsize=(8, 8))
# plt.subplot(1, 2, 1)
# plt.plot(epochs_range, acc, label='Training Accuracy')
# plt.plot(epochs_range, val_acc, label='Validation Accuracy')
# plt.legend(loc='lower right')
# plt.title('Training and Validation Accuracy')
# plt.subplot(1, 2, 2)
# plt.plot(epochs_range, loss, label='Training Loss')
# plt.plot(epochs_range, val_loss, label='Validation Loss')
# plt.legend(loc='upper right')
# plt.title('Training and Validation Loss')
# plt.show()
As you can see from the plots, training accuracy and validation accuracy are off by a large margin, and the model has achieved only around 60% accuracy on the validation set.
Let’s look at what went wrong and try to increase the overall performance of the model.
Overfitting¶
In the plots above, the training accuracy increases linearly over time, whereas validation accuracy stalls around 60%. The noticeable gap between training and validation accuracy is a sign of overfitting.
When there is a small number of training examples, the model sometimes learns from noise or unwanted details in the training examples, to an extent that it negatively impacts the performance of the model on new examples. This phenomenon is known as overfitting. It means that the model will have a difficult time generalizing to a new dataset.
There are multiple ways to fight overfitting in the training process. In this tutorial, you’ll use data augmentation and add Dropout to your model.
Data Augmentation¶
Overfitting generally occurs when there are a small number of training examples. Data augmentation takes the approach of generating additional training data from your existing examples by augmenting them using random transformations that yield believable-looking images. This helps expose the model to more aspects of the data and generalize better.
You will implement data augmentation using layers from tf.keras.layers.experimental.preprocessing. These can be included inside your model like other layers, and run on the GPU.
data_augmentation = keras.Sequential(
[
layers.RandomFlip("horizontal",
input_shape=(img_height,
img_width,
3)),
layers.RandomRotation(0.1),
layers.RandomZoom(0.1),
]
)
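What RandomFlip("horizontal") does to one image can be sketched in plain NumPy; this is a simplified stand-in for the Keras layer, which applies the flip with probability 0.5 during training:

```python
import numpy as np

def random_flip_horizontal(image, rng):
    # Reverse the width axis with probability 0.5; leave the image unchanged otherwise.
    if rng.random() < 0.5:
        return image[:, ::-1, :]
    return image

# A tiny 2x3, single-channel "image" makes the flip easy to see.
img = np.arange(6).reshape(2, 3, 1)
flipped = img[:, ::-1, :]  # the deterministic flip itself, for illustration

print(flipped[:, :, 0].tolist())  # → [[2, 1, 0], [5, 4, 3]]
```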
Let’s visualize what a few augmented examples look like by applying data augmentation to the same image several times:
plt.figure(figsize=(10, 10))
for images, _ in train_ds.take(1):
for i in range(9):
augmented_images = data_augmentation(images)
ax = plt.subplot(3, 3, i + 1)
plt.imshow(augmented_images[0].numpy().astype("uint8"))
plt.axis("off")
2024-02-10 01:12:10.342151: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'Placeholder/_0' with dtype string and shape [2936]
[[{{node Placeholder/_0}}]]
You will use data augmentation to train a model in a moment.
Dropout¶
Another technique to reduce overfitting is to introduce Dropout to the network, a form of regularization.
When you apply Dropout to a layer, it randomly drops out (by setting the activation to zero) a number of output units from the layer during the training process. Dropout takes a fractional number as its input value, such as 0.1, 0.2, or 0.4. This means dropping out 10%, 20%, or 40% of the output units at random from the applied layer.
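The mechanics can be sketched in plain NumPy. This mimics Keras Dropout in training mode (so-called inverted dropout: surviving units are scaled by 1/(1 - rate) so the expected activation is unchanged; at inference time the layer is an identity):

```python
import numpy as np

rate = 0.2
rng = np.random.default_rng(0)

x = np.ones(10_000)
keep = rng.random(x.shape) >= rate  # each unit survives with probability 0.8
y = x * keep / (1.0 - rate)         # scale survivors to preserve the mean

print(round(keep.mean(), 1))  # fraction of units kept, close to 0.8
print(round(y.mean(), 1))     # mean activation stays close to 1.0
```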
Let’s create a new neural network using layers.Dropout, then train it using augmented images.
model = Sequential([
data_augmentation,
layers.Rescaling(1./255),
layers.Conv2D(16, 3, padding='same', activation='relu'),
layers.MaxPooling2D(),
layers.Conv2D(32, 3, padding='same', activation='relu'),
layers.MaxPooling2D(),
layers.Conv2D(64, 3, padding='same', activation='relu'),
layers.MaxPooling2D(),
layers.Dropout(0.2),
layers.Flatten(),
layers.Dense(128, activation='relu'),
layers.Dense(num_classes, name="outputs")
])
Compile and Train the Model¶
model.compile(optimizer='adam',
loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
metrics=['accuracy'])
model.summary()
Model: "sequential_2"
_________________________________________________________________
 Layer (type)                    Output Shape              Param #
=================================================================
 sequential_1 (Sequential)       (None, 180, 180, 3)       0
 rescaling_2 (Rescaling)         (None, 180, 180, 3)       0
 conv2d_3 (Conv2D)               (None, 180, 180, 16)      448
 max_pooling2d_3 (MaxPooling2D)  (None, 90, 90, 16)        0
 conv2d_4 (Conv2D)               (None, 90, 90, 32)        4640
 max_pooling2d_4 (MaxPooling2D)  (None, 45, 45, 32)        0
 conv2d_5 (Conv2D)               (None, 45, 45, 64)        18496
 max_pooling2d_5 (MaxPooling2D)  (None, 22, 22, 64)        0
 dropout (Dropout)               (None, 22, 22, 64)        0
 flatten_1 (Flatten)             (None, 30976)             0
 dense_2 (Dense)                 (None, 128)               3965056
 outputs (Dense)                 (None, 5)                 645
=================================================================
Total params: 3,989,285
Trainable params: 3,989,285
Non-trainable params: 0
_________________________________________________________________
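The parameter counts in the summary can be reproduced by hand, assuming 'same'-padded 3x3 convolutions (as in the model above) and 2x2 max pooling with floor division:

```python
def conv_params(k, c_in, c_out):
    # k*k*c_in weights per filter, plus one bias per filter.
    return (k * k * c_in + 1) * c_out

def dense_params(n_in, n_out):
    # One bias per output unit.
    return (n_in + 1) * n_out

size = 180
for _ in range(3):
    size //= 2          # each MaxPooling2D halves the spatial size: 180 -> 90 -> 45 -> 22
flat = size * size * 64  # 22 * 22 * 64 = 30976, matching flatten_1

total = (conv_params(3, 3, 16)      # conv2d_3: 448
         + conv_params(3, 16, 32)   # conv2d_4: 4640
         + conv_params(3, 32, 64)   # conv2d_5: 18496
         + dense_params(flat, 128)  # dense_2: 3965056
         + dense_params(128, 5))    # outputs: 645

print(total)  # → 3989285
```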
epochs = 15
history = model.fit(
train_ds,
validation_data=val_ds,
epochs=epochs
)
Epoch 1/15
92/92 [==============================] - 7s 67ms/step - loss: 1.3942 - accuracy: 0.3682 - val_loss: 1.2278 - val_accuracy: 0.4864
Epoch 2/15
89/92 [============================>.] - ETA: 0s - loss: 1.0749 - accuracy: 0.5588
90/92 [============================>.] - ETA: 0s - loss: 1.0765 - accuracy: 0.5578
91/92 [============================>.] - ETA: 0s - loss: 1.0753 - accuracy: 0.5579
92/92 [==============================] - ETA: 0s - loss: 1.0742 - accuracy: 0.5589
92/92 [==============================] - 6s 64ms/step - loss: 1.0742 - accuracy: 0.5589 - val_loss: 1.0685 - val_accuracy: 0.5627
Epoch 3/15
92/92 [==============================] - 6s 64ms/step - loss: 0.9316 - accuracy: 0.6291 - val_loss: 0.8948 - val_accuracy: 0.6540
Epoch 4/15
92/92 [==============================] - 6s 64ms/step - loss: 0.8626 - accuracy: 0.6550 - val_loss: 0.8452 - val_accuracy: 0.6540
Epoch 5/15
92/92 [==============================] - 6s 64ms/step - loss: 0.8027 - accuracy: 0.6911 - val_loss: 0.9385 - val_accuracy: 0.6540
Epoch 6/15
92/92 [==============================] - 6s 64ms/step - loss: 0.7571 - accuracy: 0.7115 - val_loss: 0.7898 - val_accuracy: 0.6757
Epoch 7/15
92/92 [==============================] - 6s 64ms/step - loss: 0.7101 - accuracy: 0.7279 - val_loss: 0.7492 - val_accuracy: 0.7125
Epoch 8/15
92/92 [==============================] - 6s 64ms/step - loss: 0.7008 - accuracy: 0.7285 - val_loss: 0.7422 - val_accuracy: 0.7139
Epoch 9/15
92/92 [==============================] - 6s 64ms/step - loss: 0.6513 - accuracy: 0.7636 - val_loss: 0.7100 - val_accuracy: 0.7166
Epoch 10/15
92/92 [==============================] - 6s 65ms/step - loss: 0.6144 - accuracy: 0.7616 - val_loss: 0.7003 - val_accuracy: 0.7180
Epoch 11/15
92/92 [==============================] - 6s 64ms/step - loss: 0.5958 - accuracy: 0.7755 - val_loss: 0.6917 - val_accuracy: 0.7343
Epoch 12/15
1/92 [..............................] - ETA: 7s - loss: 0.5158 - accuracy: 0.8438
2/92 [..............................] - ETA: 5s - loss: 0.4748 - accuracy: 0.8594
3/92 [..............................] - ETA: 5s - loss: 0.4990 - accuracy: 0.8542
4/92 [>.............................] - ETA: 5s - loss: 0.5068 - accuracy: 0.8438
5/92 [>.............................] - ETA: 5s - loss: 0.4829 - accuracy: 0.8500
6/92 [>.............................] - ETA: 5s - loss: 0.4823 - accuracy: 0.8438
7/92 [=>............................] - ETA: 4s - loss: 0.4679 - accuracy: 0.8482
8/92 [=>............................] - ETA: 4s - loss: 0.4979 - accuracy: 0.8320
9/92 [=>............................] - ETA: 4s - loss: 0.5087 - accuracy: 0.8299
10/92 [==>………………………] - ETA: 4s - loss: 0.5058 - accuracy: 0.8313
11/92 [==>………………………] - ETA: 4s - loss: 0.5191 - accuracy: 0.8210
12/92 [==>………………………] - ETA: 4s - loss: 0.5316 - accuracy: 0.8125
13/92 [===>……………………..] - ETA: 4s - loss: 0.5217 - accuracy: 0.8125
14/92 [===>……………………..] - ETA: 4s - loss: 0.5414 - accuracy: 0.8036
15/92 [===>……………………..] - ETA: 4s - loss: 0.5347 - accuracy: 0.8083
16/92 [====>…………………….] - ETA: 4s - loss: 0.5313 - accuracy: 0.8086
17/92 [====>…………………….] - ETA: 4s - loss: 0.5244 - accuracy: 0.8088
18/92 [====>…………………….] - ETA: 4s - loss: 0.5229 - accuracy: 0.8090
19/92 [=====>……………………] - ETA: 4s - loss: 0.5354 - accuracy: 0.7993
20/92 [=====>……………………] - ETA: 4s - loss: 0.5348 - accuracy: 0.8000
21/92 [=====>……………………] - ETA: 4s - loss: 0.5396 - accuracy: 0.7946
22/92 [======>…………………..] - ETA: 4s - loss: 0.5375 - accuracy: 0.7926
23/92 [======>…………………..] - ETA: 4s - loss: 0.5355 - accuracy: 0.7962
24/92 [======>…………………..] - ETA: 3s - loss: 0.5312 - accuracy: 0.7982
25/92 [=======>………………….] - ETA: 3s - loss: 0.5287 - accuracy: 0.7975
26/92 [=======>………………….] - ETA: 3s - loss: 0.5257 - accuracy: 0.7981
27/92 [=======>………………….] - ETA: 3s - loss: 0.5265 - accuracy: 0.7986
28/92 [========>…………………] - ETA: 3s - loss: 0.5345 - accuracy: 0.7980
30/92 [========>…………………] - ETA: 3s - loss: 0.5298 - accuracy: 0.7973
31/92 [=========>………………..] - ETA: 3s - loss: 0.5337 - accuracy: 0.7978
32/92 [=========>………………..] - ETA: 3s - loss: 0.5332 - accuracy: 0.7963
33/92 [=========>………………..] - ETA: 3s - loss: 0.5406 - accuracy: 0.7920
34/92 [==========>……………….] - ETA: 3s - loss: 0.5398 - accuracy: 0.7944
35/92 [==========>……………….] - ETA: 3s - loss: 0.5319 - accuracy: 0.7986
36/92 [==========>……………….] - ETA: 3s - loss: 0.5308 - accuracy: 0.7990
37/92 [===========>………………] - ETA: 3s - loss: 0.5460 - accuracy: 0.7976
38/92 [===========>………………] - ETA: 3s - loss: 0.5438 - accuracy: 0.7972
39/92 [===========>………………] - ETA: 3s - loss: 0.5420 - accuracy: 0.7984
40/92 [============>……………..] - ETA: 3s - loss: 0.5416 - accuracy: 0.7980
41/92 [============>……………..] - ETA: 2s - loss: 0.5411 - accuracy: 0.7983
42/92 [============>……………..] - ETA: 2s - loss: 0.5440 - accuracy: 0.7964
43/92 [=============>…………….] - ETA: 2s - loss: 0.5429 - accuracy: 0.7982
44/92 [=============>…………….] - ETA: 2s - loss: 0.5435 - accuracy: 0.7993
45/92 [=============>…………….] - ETA: 2s - loss: 0.5461 - accuracy: 0.7982
46/92 [==============>……………] - ETA: 2s - loss: 0.5474 - accuracy: 0.7992
47/92 [==============>……………] - ETA: 2s - loss: 0.5516 - accuracy: 0.7981
48/92 [==============>……………] - ETA: 2s - loss: 0.5529 - accuracy: 0.7971
49/92 [==============>……………] - ETA: 2s - loss: 0.5537 - accuracy: 0.7962
50/92 [===============>…………..] - ETA: 2s - loss: 0.5525 - accuracy: 0.7971
51/92 [===============>…………..] - ETA: 2s - loss: 0.5512 - accuracy: 0.7962
52/92 [===============>…………..] - ETA: 2s - loss: 0.5524 - accuracy: 0.7947
53/92 [================>………….] - ETA: 2s - loss: 0.5529 - accuracy: 0.7956
54/92 [================>………….] - ETA: 2s - loss: 0.5568 - accuracy: 0.7924
55/92 [================>………….] - ETA: 2s - loss: 0.5531 - accuracy: 0.7945
56/92 [=================>…………] - ETA: 2s - loss: 0.5524 - accuracy: 0.7943
57/92 [=================>…………] - ETA: 2s - loss: 0.5604 - accuracy: 0.7902
58/92 [=================>…………] - ETA: 1s - loss: 0.5582 - accuracy: 0.7911
59/92 [==================>………..] - ETA: 1s - loss: 0.5608 - accuracy: 0.7915
60/92 [==================>………..] - ETA: 1s - loss: 0.5603 - accuracy: 0.7924
61/92 [==================>………..] - ETA: 1s - loss: 0.5640 - accuracy: 0.7912
62/92 [===================>……….] - ETA: 1s - loss: 0.5657 - accuracy: 0.7900
63/92 [===================>……….] - ETA: 1s - loss: 0.5675 - accuracy: 0.7893
64/92 [===================>……….] - ETA: 1s - loss: 0.5684 - accuracy: 0.7892
65/92 [====================>………] - ETA: 1s - loss: 0.5670 - accuracy: 0.7891
66/92 [====================>………] - ETA: 1s - loss: 0.5671 - accuracy: 0.7880
67/92 [====================>………] - ETA: 1s - loss: 0.5679 - accuracy: 0.7870
68/92 [=====================>……..] - ETA: 1s - loss: 0.5679 - accuracy: 0.7874
69/92 [=====================>……..] - ETA: 1s - loss: 0.5674 - accuracy: 0.7868
70/92 [=====================>……..] - ETA: 1s - loss: 0.5682 - accuracy: 0.7872
71/92 [======================>…….] - ETA: 1s - loss: 0.5684 - accuracy: 0.7862
72/92 [======================>…….] - ETA: 1s - loss: 0.5679 - accuracy: 0.7861
73/92 [======================>…….] - ETA: 1s - loss: 0.5695 - accuracy: 0.7848
74/92 [=======================>……] - ETA: 1s - loss: 0.5699 - accuracy: 0.7852
75/92 [=======================>……] - ETA: 0s - loss: 0.5709 - accuracy: 0.7834
76/92 [=======================>……] - ETA: 0s - loss: 0.5713 - accuracy: 0.7826
77/92 [========================>…..] - ETA: 0s - loss: 0.5702 - accuracy: 0.7834
78/92 [========================>…..] - ETA: 0s - loss: 0.5696 - accuracy: 0.7838
79/92 [========================>…..] - ETA: 0s - loss: 0.5722 - accuracy: 0.7833
80/92 [=========================>….] - ETA: 0s - loss: 0.5749 - accuracy: 0.7817
81/92 [=========================>….] - ETA: 0s - loss: 0.5762 - accuracy: 0.7817
82/92 [=========================>….] - ETA: 0s - loss: 0.5755 - accuracy: 0.7829
83/92 [==========================>…] - ETA: 0s - loss: 0.5726 - accuracy: 0.7847
84/92 [==========================>…] - ETA: 0s - loss: 0.5699 - accuracy: 0.7854
85/92 [==========================>…] - ETA: 0s - loss: 0.5690 - accuracy: 0.7850
86/92 [===========================>..] - ETA: 0s - loss: 0.5712 - accuracy: 0.7839
87/92 [===========================>..] - ETA: 0s - loss: 0.5746 - accuracy: 0.7828
88/92 [===========================>..] - ETA: 0s - loss: 0.5740 - accuracy: 0.7828
89/92 [============================>.] - ETA: 0s - loss: 0.5743 - accuracy: 0.7827
90/92 [============================>.] - ETA: 0s - loss: 0.5750 - accuracy: 0.7824
91/92 [============================>.] - ETA: 0s - loss: 0.5774 - accuracy: 0.7820
92/92 [==============================] - ETA: 0s - loss: 0.5758 - accuracy: 0.7827
92/92 [==============================] - 6s 64ms/step - loss: 0.5758 - accuracy: 0.7827 - val_loss: 0.6737 - val_accuracy: 0.7425
Epoch 13/15
1/92 [..............................] - ETA: 7s - loss: 0.4396 - accuracy: 0.8125
2/92 [..............................] - ETA: 5s - loss: 0.4462 - accuracy: 0.8125
3/92 [..............................] - ETA: 5s - loss: 0.4262 - accuracy: 0.8229
4/92 [>.............................] - ETA: 5s - loss: 0.4356 - accuracy: 0.8203
5/92 [>.............................] - ETA: 5s - loss: 0.4686 - accuracy: 0.8062
6/92 [>.............................] - ETA: 4s - loss: 0.4781 - accuracy: 0.8021
7/92 [=>............................] - ETA: 4s - loss: 0.4812 - accuracy: 0.8036
8/92 [=>............................] - ETA: 4s - loss: 0.4844 - accuracy: 0.8047
9/92 [=>............................] - ETA: 4s - loss: 0.5125 - accuracy: 0.7917
10/92 [==>………………………] - ETA: 4s - loss: 0.5056 - accuracy: 0.7906
11/92 [==>………………………] - ETA: 4s - loss: 0.5237 - accuracy: 0.7869
12/92 [==>………………………] - ETA: 4s - loss: 0.5073 - accuracy: 0.7995
13/92 [===>……………………..] - ETA: 4s - loss: 0.5007 - accuracy: 0.8029
14/92 [===>……………………..] - ETA: 4s - loss: 0.5219 - accuracy: 0.7902
15/92 [===>……………………..] - ETA: 4s - loss: 0.5231 - accuracy: 0.7958
16/92 [====>…………………….] - ETA: 4s - loss: 0.5311 - accuracy: 0.7988
17/92 [====>…………………….] - ETA: 4s - loss: 0.5357 - accuracy: 0.7960
18/92 [====>…………………….] - ETA: 4s - loss: 0.5235 - accuracy: 0.8003
20/92 [=====>……………………] - ETA: 4s - loss: 0.5218 - accuracy: 0.7991
21/92 [=====>……………………] - ETA: 4s - loss: 0.5102 - accuracy: 0.8042
22/92 [======>…………………..] - ETA: 4s - loss: 0.5218 - accuracy: 0.8003
23/92 [======>…………………..] - ETA: 3s - loss: 0.5211 - accuracy: 0.8022
24/92 [======>…………………..] - ETA: 3s - loss: 0.5188 - accuracy: 0.8039
25/92 [=======>………………….] - ETA: 3s - loss: 0.5143 - accuracy: 0.8068
26/92 [=======>………………….] - ETA: 3s - loss: 0.5156 - accuracy: 0.8070
27/92 [=======>………………….] - ETA: 3s - loss: 0.5277 - accuracy: 0.8002
28/92 [========>…………………] - ETA: 3s - loss: 0.5307 - accuracy: 0.7984
29/92 [========>…………………] - ETA: 3s - loss: 0.5394 - accuracy: 0.7957
30/92 [========>…………………] - ETA: 3s - loss: 0.5341 - accuracy: 0.7983
31/92 [=========>………………..] - ETA: 3s - loss: 0.5317 - accuracy: 0.7978
32/92 [=========>………………..] - ETA: 3s - loss: 0.5310 - accuracy: 0.7972
33/92 [=========>………………..] - ETA: 3s - loss: 0.5332 - accuracy: 0.7968
34/92 [==========>……………….] - ETA: 3s - loss: 0.5486 - accuracy: 0.7880
35/92 [==========>……………….] - ETA: 3s - loss: 0.5522 - accuracy: 0.7896
36/92 [==========>……………….] - ETA: 3s - loss: 0.5504 - accuracy: 0.7920
37/92 [===========>………………] - ETA: 3s - loss: 0.5485 - accuracy: 0.7951
38/92 [===========>………………] - ETA: 3s - loss: 0.5474 - accuracy: 0.7955
39/92 [===========>………………] - ETA: 3s - loss: 0.5452 - accuracy: 0.7976
40/92 [============>……………..] - ETA: 2s - loss: 0.5448 - accuracy: 0.7980
41/92 [============>……………..] - ETA: 2s - loss: 0.5430 - accuracy: 0.7991
42/92 [============>……………..] - ETA: 2s - loss: 0.5441 - accuracy: 0.7957
43/92 [=============>…………….] - ETA: 2s - loss: 0.5452 - accuracy: 0.7939
44/92 [=============>…………….] - ETA: 2s - loss: 0.5462 - accuracy: 0.7936
45/92 [=============>…………….] - ETA: 2s - loss: 0.5492 - accuracy: 0.7919
46/92 [==============>……………] - ETA: 2s - loss: 0.5471 - accuracy: 0.7923
47/92 [==============>……………] - ETA: 2s - loss: 0.5492 - accuracy: 0.7914
48/92 [==============>……………] - ETA: 2s - loss: 0.5489 - accuracy: 0.7906
49/92 [==============>……………] - ETA: 2s - loss: 0.5501 - accuracy: 0.7904
50/92 [===============>…………..] - ETA: 2s - loss: 0.5509 - accuracy: 0.7883
51/92 [===============>…………..] - ETA: 2s - loss: 0.5542 - accuracy: 0.7869
52/92 [===============>…………..] - ETA: 2s - loss: 0.5537 - accuracy: 0.7874
53/92 [================>………….] - ETA: 2s - loss: 0.5512 - accuracy: 0.7885
54/92 [================>………….] - ETA: 2s - loss: 0.5559 - accuracy: 0.7878
55/92 [================>………….] - ETA: 2s - loss: 0.5574 - accuracy: 0.7882
56/92 [=================>…………] - ETA: 2s - loss: 0.5546 - accuracy: 0.7898
57/92 [=================>…………] - ETA: 2s - loss: 0.5586 - accuracy: 0.7880
58/92 [=================>…………] - ETA: 1s - loss: 0.5582 - accuracy: 0.7873
59/92 [==================>………..] - ETA: 1s - loss: 0.5607 - accuracy: 0.7862
60/92 [==================>………..] - ETA: 1s - loss: 0.5620 - accuracy: 0.7861
61/92 [==================>………..] - ETA: 1s - loss: 0.5641 - accuracy: 0.7845
62/92 [===================>……….] - ETA: 1s - loss: 0.5626 - accuracy: 0.7854
63/92 [===================>……….] - ETA: 1s - loss: 0.5606 - accuracy: 0.7874
64/92 [===================>……….] - ETA: 1s - loss: 0.5580 - accuracy: 0.7877
65/92 [====================>………] - ETA: 1s - loss: 0.5549 - accuracy: 0.7891
66/92 [====================>………] - ETA: 1s - loss: 0.5561 - accuracy: 0.7866
67/92 [====================>………] - ETA: 1s - loss: 0.5561 - accuracy: 0.7865
68/92 [=====================>……..] - ETA: 1s - loss: 0.5565 - accuracy: 0.7864
69/92 [=====================>……..] - ETA: 1s - loss: 0.5549 - accuracy: 0.7882
70/92 [=====================>……..] - ETA: 1s - loss: 0.5530 - accuracy: 0.7890
71/92 [======================>…….] - ETA: 1s - loss: 0.5503 - accuracy: 0.7902
72/92 [======================>…….] - ETA: 1s - loss: 0.5483 - accuracy: 0.7914
73/92 [======================>…….] - ETA: 1s - loss: 0.5492 - accuracy: 0.7912
74/92 [=======================>……] - ETA: 1s - loss: 0.5528 - accuracy: 0.7898
75/92 [=======================>……] - ETA: 0s - loss: 0.5583 - accuracy: 0.7897
76/92 [=======================>……] - ETA: 0s - loss: 0.5578 - accuracy: 0.7896
77/92 [========================>…..] - ETA: 0s - loss: 0.5585 - accuracy: 0.7895
78/92 [========================>…..] - ETA: 0s - loss: 0.5557 - accuracy: 0.7910
79/92 [========================>…..] - ETA: 0s - loss: 0.5546 - accuracy: 0.7913
80/92 [=========================>….] - ETA: 0s - loss: 0.5538 - accuracy: 0.7915
81/92 [=========================>….] - ETA: 0s - loss: 0.5527 - accuracy: 0.7918
82/92 [=========================>….] - ETA: 0s - loss: 0.5529 - accuracy: 0.7913
83/92 [==========================>…] - ETA: 0s - loss: 0.5508 - accuracy: 0.7923
84/92 [==========================>…] - ETA: 0s - loss: 0.5517 - accuracy: 0.7918
85/92 [==========================>…] - ETA: 0s - loss: 0.5550 - accuracy: 0.7917
86/92 [===========================>..] - ETA: 0s - loss: 0.5556 - accuracy: 0.7912
87/92 [===========================>..] - ETA: 0s - loss: 0.5538 - accuracy: 0.7921
88/92 [===========================>..] - ETA: 0s - loss: 0.5527 - accuracy: 0.7931
89/92 [============================>.] - ETA: 0s - loss: 0.5515 - accuracy: 0.7937
90/92 [============================>.] - ETA: 0s - loss: 0.5509 - accuracy: 0.7942
91/92 [============================>.] - ETA: 0s - loss: 0.5499 - accuracy: 0.7948
92/92 [==============================] - ETA: 0s - loss: 0.5491 - accuracy: 0.7939
92/92 [==============================] - 6s 64ms/step - loss: 0.5491 - accuracy: 0.7939 - val_loss: 0.6629 - val_accuracy: 0.7493
Epoch 14/15
1/92 [..............................] - ETA: 8s - loss: 0.4716 - accuracy: 0.7812
2/92 [..............................] - ETA: 5s - loss: 0.5226 - accuracy: 0.8125
3/92 [..............................] - ETA: 5s - loss: 0.5847 - accuracy: 0.7604
4/92 [>.............................] - ETA: 5s - loss: 0.5652 - accuracy: 0.7578
5/92 [>.............................] - ETA: 5s - loss: 0.5379 - accuracy: 0.7688
6/92 [>.............................] - ETA: 5s - loss: 0.5409 - accuracy: 0.7708
7/92 [=>............................] - ETA: 4s - loss: 0.5271 - accuracy: 0.7902
8/92 [=>............................] - ETA: 4s - loss: 0.5225 - accuracy: 0.7930
9/92 [=>............................] - ETA: 4s - loss: 0.4996 - accuracy: 0.8056
10/92 [==>………………………] - ETA: 4s - loss: 0.5121 - accuracy: 0.7969
11/92 [==>………………………] - ETA: 4s - loss: 0.5245 - accuracy: 0.7926
12/92 [==>………………………] - ETA: 4s - loss: 0.5377 - accuracy: 0.7812
13/92 [===>……………………..] - ETA: 4s - loss: 0.5415 - accuracy: 0.7788
14/92 [===>……………………..] - ETA: 4s - loss: 0.5262 - accuracy: 0.7835
15/92 [===>……………………..] - ETA: 4s - loss: 0.5100 - accuracy: 0.7917
16/92 [====>…………………….] - ETA: 4s - loss: 0.5001 - accuracy: 0.7988
17/92 [====>…………………….] - ETA: 4s - loss: 0.4880 - accuracy: 0.8088
18/92 [====>…………………….] - ETA: 4s - loss: 0.4818 - accuracy: 0.8108
19/92 [=====>……………………] - ETA: 4s - loss: 0.5005 - accuracy: 0.8043
20/92 [=====>……………………] - ETA: 4s - loss: 0.5034 - accuracy: 0.8000
21/92 [=====>……………………] - ETA: 4s - loss: 0.5092 - accuracy: 0.7976
22/92 [======>…………………..] - ETA: 4s - loss: 0.5218 - accuracy: 0.7940
23/92 [======>…………………..] - ETA: 4s - loss: 0.5222 - accuracy: 0.7908
24/92 [======>…………………..] - ETA: 3s - loss: 0.5202 - accuracy: 0.7930
25/92 [=======>………………….] - ETA: 3s - loss: 0.5256 - accuracy: 0.7925
26/92 [=======>………………….] - ETA: 3s - loss: 0.5329 - accuracy: 0.7921
27/92 [=======>………………….] - ETA: 3s - loss: 0.5374 - accuracy: 0.7917
28/92 [========>…………………] - ETA: 3s - loss: 0.5278 - accuracy: 0.7969
29/92 [========>…………………] - ETA: 3s - loss: 0.5416 - accuracy: 0.7931
30/92 [========>…………………] - ETA: 3s - loss: 0.5388 - accuracy: 0.7948
31/92 [=========>………………..] - ETA: 3s - loss: 0.5455 - accuracy: 0.7913
32/92 [=========>………………..] - ETA: 3s - loss: 0.5467 - accuracy: 0.7900
33/92 [=========>………………..] - ETA: 3s - loss: 0.5513 - accuracy: 0.7888
34/92 [==========>……………….] - ETA: 3s - loss: 0.5546 - accuracy: 0.7886
35/92 [==========>……………….] - ETA: 3s - loss: 0.5517 - accuracy: 0.7902
36/92 [==========>……………….] - ETA: 3s - loss: 0.5480 - accuracy: 0.7908
37/92 [===========>………………] - ETA: 3s - loss: 0.5459 - accuracy: 0.7922
38/92 [===========>………………] - ETA: 3s - loss: 0.5450 - accuracy: 0.7936
39/92 [===========>………………] - ETA: 3s - loss: 0.5391 - accuracy: 0.7965
40/92 [============>……………..] - ETA: 3s - loss: 0.5383 - accuracy: 0.7961
41/92 [============>……………..] - ETA: 2s - loss: 0.5348 - accuracy: 0.7973
42/92 [============>……………..] - ETA: 2s - loss: 0.5337 - accuracy: 0.7976
43/92 [=============>…………….] - ETA: 2s - loss: 0.5316 - accuracy: 0.7994
44/92 [=============>…………….] - ETA: 2s - loss: 0.5287 - accuracy: 0.7997
45/92 [=============>…………….] - ETA: 2s - loss: 0.5271 - accuracy: 0.8028
46/92 [==============>……………] - ETA: 2s - loss: 0.5250 - accuracy: 0.8030
47/92 [==============>……………] - ETA: 2s - loss: 0.5255 - accuracy: 0.8039
48/92 [==============>……………] - ETA: 2s - loss: 0.5265 - accuracy: 0.8034
49/92 [==============>……………] - ETA: 2s - loss: 0.5259 - accuracy: 0.8029
50/92 [===============>…………..] - ETA: 2s - loss: 0.5263 - accuracy: 0.8031
51/92 [===============>…………..] - ETA: 2s - loss: 0.5260 - accuracy: 0.8027
52/92 [===============>…………..] - ETA: 2s - loss: 0.5245 - accuracy: 0.8041
53/92 [================>………….] - ETA: 2s - loss: 0.5264 - accuracy: 0.8042
54/92 [================>………….] - ETA: 2s - loss: 0.5256 - accuracy: 0.8050
55/92 [================>………….] - ETA: 2s - loss: 0.5227 - accuracy: 0.8057
56/92 [=================>…………] - ETA: 2s - loss: 0.5253 - accuracy: 0.8041
57/92 [=================>…………] - ETA: 2s - loss: 0.5265 - accuracy: 0.8026
58/92 [=================>…………] - ETA: 1s - loss: 0.5269 - accuracy: 0.8028
59/92 [==================>………..] - ETA: 1s - loss: 0.5235 - accuracy: 0.8040
60/92 [==================>………..] - ETA: 1s - loss: 0.5222 - accuracy: 0.8031
61/92 [==================>………..] - ETA: 1s - loss: 0.5280 - accuracy: 0.8007
62/92 [===================>……….] - ETA: 1s - loss: 0.5321 - accuracy: 0.7984
63/92 [===================>……….] - ETA: 1s - loss: 0.5292 - accuracy: 0.7991
64/92 [===================>……….] - ETA: 1s - loss: 0.5326 - accuracy: 0.7988
65/92 [====================>………] - ETA: 1s - loss: 0.5359 - accuracy: 0.7971
67/92 [====================>………] - ETA: 1s - loss: 0.5367 - accuracy: 0.7968
68/92 [=====================>……..] - ETA: 1s - loss: 0.5342 - accuracy: 0.7980
69/92 [=====================>……..] - ETA: 1s - loss: 0.5342 - accuracy: 0.7977
70/92 [=====================>……..] - ETA: 1s - loss: 0.5368 - accuracy: 0.7975
71/92 [======================>…….] - ETA: 1s - loss: 0.5370 - accuracy: 0.7977
72/92 [======================>…….] - ETA: 1s - loss: 0.5361 - accuracy: 0.7979
73/92 [======================>…….] - ETA: 1s - loss: 0.5350 - accuracy: 0.7990
74/92 [=======================>……] - ETA: 1s - loss: 0.5390 - accuracy: 0.7979
75/92 [=======================>……] - ETA: 0s - loss: 0.5402 - accuracy: 0.7972
76/92 [=======================>……] - ETA: 0s - loss: 0.5415 - accuracy: 0.7966
77/92 [========================>…..] - ETA: 0s - loss: 0.5416 - accuracy: 0.7964
78/92 [========================>…..] - ETA: 0s - loss: 0.5412 - accuracy: 0.7962
79/92 [========================>…..] - ETA: 0s - loss: 0.5435 - accuracy: 0.7948
80/92 [=========================>….] - ETA: 0s - loss: 0.5452 - accuracy: 0.7935
81/92 [=========================>….] - ETA: 0s - loss: 0.5493 - accuracy: 0.7918
82/92 [=========================>….] - ETA: 0s - loss: 0.5472 - accuracy: 0.7928
83/92 [==========================>…] - ETA: 0s - loss: 0.5474 - accuracy: 0.7923
84/92 [==========================>…] - ETA: 0s - loss: 0.5458 - accuracy: 0.7933
85/92 [==========================>…] - ETA: 0s - loss: 0.5452 - accuracy: 0.7931
86/92 [===========================>..] - ETA: 0s - loss: 0.5448 - accuracy: 0.7930
87/92 [===========================>..] - ETA: 0s - loss: 0.5434 - accuracy: 0.7939
88/92 [===========================>..] - ETA: 0s - loss: 0.5425 - accuracy: 0.7945
89/92 [============================>.] - ETA: 0s - loss: 0.5434 - accuracy: 0.7937
90/92 [============================>.] - ETA: 0s - loss: 0.5470 - accuracy: 0.7925
91/92 [============================>.] - ETA: 0s - loss: 0.5459 - accuracy: 0.7941
92/92 [==============================] - ETA: 0s - loss: 0.5434 - accuracy: 0.7950
92/92 [==============================] - 6s 64ms/step - loss: 0.5434 - accuracy: 0.7950 - val_loss: 0.7003 - val_accuracy: 0.7357
Epoch 15/15
1/92 [..............................] - ETA: 7s - loss: 0.3981 - accuracy: 0.8750
2/92 [..............................] - ETA: 5s - loss: 0.4149 - accuracy: 0.8438
3/92 [..............................] - ETA: 5s - loss: 0.5013 - accuracy: 0.8125
4/92 [>.............................] - ETA: 5s - loss: 0.4675 - accuracy: 0.8281
5/92 [>.............................] - ETA: 5s - loss: 0.4681 - accuracy: 0.8188
6/92 [>.............................] - ETA: 5s - loss: 0.4709 - accuracy: 0.8229
7/92 [=>............................] - ETA: 4s - loss: 0.4701 - accuracy: 0.8214
8/92 [=>............................] - ETA: 4s - loss: 0.4571 - accuracy: 0.8281
9/92 [=>............................] - ETA: 4s - loss: 0.4488 - accuracy: 0.8368
10/92 [==>………………………] - ETA: 4s - loss: 0.4419 - accuracy: 0.8406
11/92 [==>………………………] - ETA: 4s - loss: 0.4333 - accuracy: 0.8438
12/92 [==>………………………] - ETA: 4s - loss: 0.4277 - accuracy: 0.8464
13/92 [===>……………………..] - ETA: 4s - loss: 0.4366 - accuracy: 0.8462
14/92 [===>……………………..] - ETA: 4s - loss: 0.4465 - accuracy: 0.8393
15/92 [===>……………………..] - ETA: 4s - loss: 0.4502 - accuracy: 0.8354
16/92 [====>…………………….] - ETA: 4s - loss: 0.4574 - accuracy: 0.8301
17/92 [====>…………………….] - ETA: 4s - loss: 0.4492 - accuracy: 0.8364
18/92 [====>…………………….] - ETA: 4s - loss: 0.4515 - accuracy: 0.8299
19/92 [=====>……………………] - ETA: 4s - loss: 0.4427 - accuracy: 0.8306
20/92 [=====>……………………] - ETA: 4s - loss: 0.4537 - accuracy: 0.8266
21/92 [=====>……………………] - ETA: 4s - loss: 0.4501 - accuracy: 0.8274
22/92 [======>…………………..] - ETA: 4s - loss: 0.4552 - accuracy: 0.8253
23/92 [======>…………………..] - ETA: 4s - loss: 0.4613 - accuracy: 0.8207
24/92 [======>…………………..] - ETA: 3s - loss: 0.4597 - accuracy: 0.8229
25/92 [=======>………………….] - ETA: 3s - loss: 0.4571 - accuracy: 0.8250
26/92 [=======>………………….] - ETA: 3s - loss: 0.4539 - accuracy: 0.8269
27/92 [=======>………………….] - ETA: 3s - loss: 0.4631 - accuracy: 0.8252
28/92 [========>…………………] - ETA: 3s - loss: 0.4597 - accuracy: 0.8270
29/92 [========>…………………] - ETA: 3s - loss: 0.4645 - accuracy: 0.8244
30/92 [========>…………………] - ETA: 3s - loss: 0.4748 - accuracy: 0.8229
31/92 [=========>………………..] - ETA: 3s - loss: 0.4794 - accuracy: 0.8206
33/92 [=========>………………..] - ETA: 3s - loss: 0.4802 - accuracy: 0.8206
34/92 [==========>……………….] - ETA: 3s - loss: 0.4806 - accuracy: 0.8204
35/92 [==========>……………….] - ETA: 3s - loss: 0.4783 - accuracy: 0.8210
36/92 [==========>……………….] - ETA: 3s - loss: 0.4816 - accuracy: 0.8199
37/92 [===========>………………] - ETA: 3s - loss: 0.4810 - accuracy: 0.8189
38/92 [===========>………………] - ETA: 3s - loss: 0.4834 - accuracy: 0.8179
39/92 [===========>………………] - ETA: 3s - loss: 0.4872 - accuracy: 0.8177
40/92 [============>……………..] - ETA: 2s - loss: 0.4897 - accuracy: 0.8160
41/92 [============>……………..] - ETA: 2s - loss: 0.4889 - accuracy: 0.8160
42/92 [============>……………..] - ETA: 2s - loss: 0.4955 - accuracy: 0.8129
43/92 [=============>…………….] - ETA: 2s - loss: 0.5023 - accuracy: 0.8099
44/92 [=============>…………….] - ETA: 2s - loss: 0.5021 - accuracy: 0.8107
45/92 [=============>…………….] - ETA: 2s - loss: 0.5024 - accuracy: 0.8101
46/92 [==============>……………] - ETA: 2s - loss: 0.5017 - accuracy: 0.8101
47/92 [==============>……………] - ETA: 2s - loss: 0.5044 - accuracy: 0.8102
48/92 [==============>……………] - ETA: 2s - loss: 0.5106 - accuracy: 0.8069
49/92 [==============>……………] - ETA: 2s - loss: 0.5094 - accuracy: 0.8064
50/92 [===============>…………..] - ETA: 2s - loss: 0.5061 - accuracy: 0.8090
51/92 [===============>…………..] - ETA: 2s - loss: 0.5033 - accuracy: 0.8103
52/92 [===============>…………..] - ETA: 2s - loss: 0.5022 - accuracy: 0.8116
53/92 [================>………….] - ETA: 2s - loss: 0.5020 - accuracy: 0.8116
54/92 [================>………….] - ETA: 2s - loss: 0.5025 - accuracy: 0.8105
55/92 [================>………….] - ETA: 2s - loss: 0.5012 - accuracy: 0.8122
56/92 [=================>…………] - ETA: 2s - loss: 0.5013 - accuracy: 0.8122
57/92 [=================>…………] - ETA: 2s - loss: 0.5014 - accuracy: 0.8128
58/92 [=================>…………] - ETA: 1s - loss: 0.4985 - accuracy: 0.8144
59/92 [==================>………..] - ETA: 1s - loss: 0.4989 - accuracy: 0.8138
60/92 [==================>………..] - ETA: 1s - loss: 0.5007 - accuracy: 0.8117
61/92 [==================>………..] - ETA: 1s - loss: 0.5011 - accuracy: 0.8107
62/92 [===================>……….] - ETA: 1s - loss: 0.4986 - accuracy: 0.8122
63/92 [===================>……….] - ETA: 1s - loss: 0.4991 - accuracy: 0.8123
64/92 [===================>……….] - ETA: 1s - loss: 0.4975 - accuracy: 0.8137
65/92 [====================>………] - ETA: 1s - loss: 0.4949 - accuracy: 0.8156
66/92 [====================>………] - ETA: 1s - loss: 0.4949 - accuracy: 0.8156
67/92 [====================>………] - ETA: 1s - loss: 0.4917 - accuracy: 0.8165
68/92 [=====================>……..] - ETA: 1s - loss: 0.4898 - accuracy: 0.8178
69/92 [=====================>……..] - ETA: 1s - loss: 0.4893 - accuracy: 0.8177
70/92 [=====================>……..] - ETA: 1s - loss: 0.4878 - accuracy: 0.8185
71/92 [======================>…….] - ETA: 1s - loss: 0.4858 - accuracy: 0.8193
72/92 [======================>…….] - ETA: 1s - loss: 0.4848 - accuracy: 0.8193
73/92 [======================>…….] - ETA: 1s - loss: 0.4851 - accuracy: 0.8183
74/92 [=======================>……] - ETA: 1s - loss: 0.4869 - accuracy: 0.8174
75/92 [=======================>……] - ETA: 0s - loss: 0.4863 - accuracy: 0.8165
76/92 [=======================>……] - ETA: 0s - loss: 0.4874 - accuracy: 0.8156
77/92 [========================>…..] - ETA: 0s - loss: 0.4895 - accuracy: 0.8156
78/92 [========================>…..] - ETA: 0s - loss: 0.4899 - accuracy: 0.8159
79/92 [========================>…..] - ETA: 0s - loss: 0.4896 - accuracy: 0.8159
80/92 [=========================>….] - ETA: 0s - loss: 0.4900 - accuracy: 0.8154
81/92 [=========================>….] - ETA: 0s - loss: 0.4929 - accuracy: 0.8135
82/92 [=========================>….] - ETA: 0s - loss: 0.4921 - accuracy: 0.8131
83/92 [==========================>…] - ETA: 0s - loss: 0.4925 - accuracy: 0.8134
84/92 [==========================>…] - ETA: 0s - loss: 0.4912 - accuracy: 0.8131
85/92 [==========================>…] - ETA: 0s - loss: 0.4906 - accuracy: 0.8142
86/92 [===========================>..] - ETA: 0s - loss: 0.4939 - accuracy: 0.8120
87/92 [===========================>..] - ETA: 0s - loss: 0.4967 - accuracy: 0.8102
88/92 [===========================>..] - ETA: 0s - loss: 0.4958 - accuracy: 0.8109
89/92 [============================>.] - ETA: 0s - loss: 0.4973 - accuracy: 0.8095
90/92 [============================>.] - ETA: 0s - loss: 0.5005 - accuracy: 0.8085
91/92 [============================>.] - ETA: 0s - loss: 0.5006 - accuracy: 0.8089
92/92 [==============================] - ETA: 0s - loss: 0.5029 - accuracy: 0.8079
92/92 [==============================] - 6s 64ms/step - loss: 0.5029 - accuracy: 0.8079 - val_loss: 0.6990 - val_accuracy: 0.7411
Visualize Training Results¶
After applying data augmentation and Dropout, there is less overfitting than before, and the training and validation accuracy are more closely aligned.
acc = history.history['accuracy']
val_acc = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs_range = range(epochs)
plt.figure(figsize=(8, 8))
plt.subplot(1, 2, 1)
plt.plot(epochs_range, acc, label='Training Accuracy')
plt.plot(epochs_range, val_acc, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')
plt.subplot(1, 2, 2)
plt.plot(epochs_range, loss, label='Training Loss')
plt.plot(epochs_range, val_loss, label='Validation Loss')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')
plt.show()
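Beyond plotting, the same `history.history` dictionary can be queried directly, for example to find the epoch with the best validation accuracy when deciding how long to train. A minimal sketch, using hypothetical validation accuracies in place of the real training history:

```python
# Hypothetical values standing in for history.history['val_accuracy'].
val_acc = [0.62, 0.68, 0.71, 0.7343, 0.7425, 0.7493, 0.7357, 0.7411]

def best_epoch(val_accuracy):
    """Return the 1-based epoch with the highest validation accuracy."""
    best = max(range(len(val_accuracy)), key=lambda i: val_accuracy[i])
    return best + 1, val_accuracy[best]

epoch, acc = best_epoch(val_acc)
print(f"Best epoch: {epoch} (val_accuracy={acc:.4f})")
```

With real training, this would be a natural place to plug in a `ModelCheckpoint` or `EarlyStopping` callback instead of inspecting the history after the fact.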
Predict on New Data¶
Finally, let us use the model to classify an image that was not included in the training or validation sets.
NOTE: Data augmentation and Dropout layers are inactive at inference time.
sunflower_url = "https://storage.googleapis.com/download.tensorflow.org/example_images/592px-Red_sunflower.jpg"
sunflower_path = tf.keras.utils.get_file('Red_sunflower', origin=sunflower_url)
img = keras.preprocessing.image.load_img(
sunflower_path, target_size=(img_height, img_width)
)
img_array = keras.preprocessing.image.img_to_array(img)
img_array = tf.expand_dims(img_array, 0) # Create a batch
predictions = model.predict(img_array)
score = tf.nn.softmax(predictions[0])
print(
"This image most likely belongs to {} with a {:.2f} percent confidence."
.format(class_names[np.argmax(score)], 100 * np.max(score))
)
1/1 [==============================] - ETA: 0s
1/1 [==============================] - 0s 83ms/step
This image most likely belongs to sunflowers with a 99.80 percent confidence.
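The confidence reported above comes from applying softmax to the raw model outputs, exactly as `tf.nn.softmax(predictions[0])` does. The same computation as a standalone NumPy sketch, with hypothetical logit values standing in for the model's output:

```python
import numpy as np

# The five flower classes used in this notebook.
class_names = ["daisy", "dandelion", "roses", "sunflowers", "tulips"]

# Hypothetical logits standing in for predictions[0].
logits = np.array([-1.2, 0.3, -0.8, 6.5, 0.1])

# Softmax: subtract the max for numerical stability, exponentiate, normalize.
score = np.exp(logits - logits.max())
score /= score.sum()

print(f"{class_names[score.argmax()]}: {100 * score.max():.2f} percent")
```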
Save the TensorFlow Model¶
# Save the trained model. A new "flower" folder will be created, containing
# the model graph in "saved_model.pb" and the weights in the "variables" directory.
model_dir = "model"
saved_model_dir = f"{model_dir}/flower/saved_model"
model.save(saved_model_dir)
2024-02-10 01:13:41.810110: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'random_flip_input' with dtype float and shape [?,180,180,3]
[[{{node random_flip_input}}]]
WARNING:absl:Found untraced functions such as _jit_compiled_convolution_op, _jit_compiled_convolution_op, _jit_compiled_convolution_op, _update_step_xla while saving (showing 4 of 4). These functions will not be directly callable after loading.
INFO:tensorflow:Assets written to: model/flower/saved_model/assets
Convert the TensorFlow model with OpenVINO Model Conversion API¶
To convert the model to OpenVINO IR with FP16 precision, use the model
conversion Python API, ov.convert_model. Note that ov.save_model compresses
the model weights to FP16 by default.
# Convert the model to OpenVINO IR format and save it.
ir_model_path = Path("model/flower")
ir_model_path.mkdir(parents=True, exist_ok=True)
ir_model = ov.convert_model(saved_model_dir, input=[1, 180, 180, 3])
ov.save_model(ir_model, ir_model_path / "flower_ir.xml")
Preprocessing Image Function¶
def pre_process_image(imagePath, img_height=180):
    # Model input format: NHWC
    n, h, w, c = [1, img_height, img_height, 3]
    image = Image.open(imagePath)
    # PIL expects (width, height); both are equal here.
    image = image.resize((w, h), resample=Image.BILINEAR)
    # Convert to an array and add a batch dimension (HWC -> NHWC)
    image = np.array(image)
    input_image = image.reshape((n, h, w, c))
    return input_image
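The helper above resizes the image and adds a leading batch axis so the array matches the network's [1, 180, 180, 3] input. The batch-dimension step can be sketched with NumPy alone (the zero-filled array below is a stand-in for a decoded image):

```python
import numpy as np

img_height = img_width = 180
# A fake 180x180 RGB image in HWC layout, standing in for Image.open(...).
image = np.zeros((img_height, img_width, 3), dtype=np.float32)

# Add a leading batch axis: HWC -> NHWC. np.expand_dims is equivalent to the
# reshape used in pre_process_image, without spelling out every dimension.
input_image = np.expand_dims(image, axis=0)
print(input_image.shape)  # (1, 180, 180, 3)
```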
OpenVINO Runtime Setup¶
Select inference device¶
Select a device from the dropdown list to run inference with OpenVINO:
import ipywidgets as widgets
# Initialize OpenVINO runtime
core = ov.Core()
device = widgets.Dropdown(
    options=core.available_devices + ["AUTO"],
    value='AUTO',
    description='Device:',
    disabled=False,
)
device
Dropdown(description='Device:', index=1, options=('CPU', 'AUTO'), value='AUTO')
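If this notebook is run as a plain script, ipywidgets may be unavailable. The device can then be chosen programmatically instead; a minimal sketch, where the `available` list is a hypothetical stand-in for `core.available_devices`:

```python
# Hypothetical stand-in for core.available_devices; in the notebook, use
# the real list returned by the OpenVINO Core object.
available = ["CPU"]

# Prefer GPU when present, fall back to CPU, otherwise let AUTO decide.
preferred = ["GPU", "CPU"]
device_name = next((d for d in preferred if d in available), "AUTO")
print(device_name)  # CPU
```

The resulting `device_name` string can be passed to `core.compile_model(..., device_name=device_name)` in place of `device.value`.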
# Class names in the alphabetical order used when the training dataset was built.
class_names = ["daisy", "dandelion", "roses", "sunflowers", "tulips"]
compiled_model = core.compile_model(model=ir_model, device_name=device.value)
del ir_model
input_layer = compiled_model.input(0)
output_layer = compiled_model.output(0)
Run the Inference Step¶
# Run inference on the input image...
inp_img_url = "https://upload.wikimedia.org/wikipedia/commons/4/48/A_Close_Up_Photo_of_a_Dandelion.jpg"
OUTPUT_DIR = "output"
inp_file_name = "A_Close_Up_Photo_of_a_Dandelion.jpg"
file_path = Path(OUTPUT_DIR) / Path(inp_file_name)
os.makedirs(OUTPUT_DIR, exist_ok=True)
# Download the image
download_file(inp_img_url, inp_file_name, directory=OUTPUT_DIR)
# Pre-process the image and get it ready for inference.
input_image = pre_process_image(file_path)
print(input_image.shape)
print(input_layer.shape)
res = compiled_model([input_image])[output_layer]
score = tf.nn.softmax(res[0])
# Show the results
image = Image.open(file_path)
plt.imshow(image)
print(
    "This image most likely belongs to {} with a {:.2f} percent confidence."
    .format(class_names[np.argmax(score)], 100 * np.max(score))
)
'output/A_Close_Up_Photo_of_a_Dandelion.jpg' already exists.
(1, 180, 180, 3)
[1,180,180,3]
This image most likely belongs to dandelion with a 98.49 percent confidence.
The Next Steps¶
This tutorial showed how to train a TensorFlow model, convert it to OpenVINO IR format, and run inference on the converted model. For faster inference, you can quantize the IR model. To see how to quantize this model with NNCF post-training quantization, check out the Post-Training Quantization with TensorFlow Classification Model notebook.