From Training to Deployment with TensorFlow and OpenVINO™

This Jupyter notebook can only be launched after a local installation.

# @title Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Copyright 2018 The TensorFlow Authors
#
# Modified for OpenVINO Notebooks

This tutorial demonstrates how to train, convert, and deploy an image classification model with TensorFlow and OpenVINO. The notebook trains the model, converts it to OpenVINO IR with the model conversion API, and then runs inference on the converted model. For faster inference on the model created in this notebook, check out the Post-Training Quantization with TensorFlow Classification Model notebook.

The training code is taken in its entirety from the official TensorFlow Image Classification tutorial.

The flower_ir.bin and flower_ir.xml files (the trained model in OpenVINO IR format) are generated by running all of the code cells, for example with ‘Runtime->Run All’ or Ctrl+F9.

%pip install -q "openvino>=2023.1.0"
DEPRECATION: pytorch-lightning 1.6.5 has a non-standard dependency specifier torch>=1.8.*. pip 24.1 will enforce this behaviour change. A possible replacement is to upgrade to a newer version of pytorch-lightning or contact the author to suggest that they release a version with a conforming dependency specifiers. Discussion can be found at https://github.com/pypa/pip/issues/12063
Note: you may need to restart the kernel to use updated packages.

TensorFlow Image Classification Training

The first part of the tutorial shows how to classify images of flowers (based on TensorFlow’s official tutorial). It creates an image classifier using a keras.Sequential model and loads data using preprocessing.image_dataset_from_directory. You will gain practical experience with the following concepts:

  • Efficiently loading a dataset off disk.

  • Identifying overfitting and applying techniques to mitigate it, including data augmentation and Dropout.

This tutorial follows a basic machine learning workflow:

  1. Examine and understand data

  2. Build an input pipeline

  3. Build the model

  4. Train the model

  5. Test the model

Import TensorFlow and Other Libraries

import os
import sys
from pathlib import Path

import PIL
import matplotlib.pyplot as plt
import numpy as np
import tensorflow as tf
from PIL import Image
import openvino as ov
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.models import Sequential

sys.path.append("../utils")
from notebook_utils import download_file
2024-02-10 01:12:04.614496: I tensorflow/core/util/port.cc:110] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable TF_ENABLE_ONEDNN_OPTS=0.
2024-02-10 01:12:04.649325: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2024-02-10 01:12:05.161353: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT

Download and Explore the Dataset

This tutorial uses a dataset of about 3,700 photos of flowers. The dataset contains 5 sub-directories, one per class:

flower_photo/
  daisy/
  dandelion/
  roses/
  sunflowers/
  tulips/
import pathlib
dataset_url = "https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz"
data_dir = tf.keras.utils.get_file('flower_photos', origin=dataset_url, untar=True)
data_dir = pathlib.Path(data_dir)

After downloading, you should now have a copy of the dataset available. There are 3,670 total images:

image_count = len(list(data_dir.glob('*/*.jpg')))
print(image_count)
3670

Here are some roses:

roses = list(data_dir.glob('roses/*'))
PIL.Image.open(str(roses[0]))
../_images/301-tensorflow-training-openvino-with-output_14_0.png
PIL.Image.open(str(roses[1]))
../_images/301-tensorflow-training-openvino-with-output_15_0.png

And some tulips:

tulips = list(data_dir.glob('tulips/*'))
PIL.Image.open(str(tulips[0]))
../_images/301-tensorflow-training-openvino-with-output_17_0.png
PIL.Image.open(str(tulips[1]))
../_images/301-tensorflow-training-openvino-with-output_18_0.png

Load Using keras.preprocessing

Let’s load these images off disk using the helpful image_dataset_from_directory utility. This will take you from a directory of images on disk to a tf.data.Dataset in just a couple lines of code. If you like, you can also write your own data loading code from scratch by visiting the load images tutorial.

Create a Dataset

Define some parameters for the loader:

batch_size = 32
img_height = 180
img_width = 180

It’s good practice to use a validation split when developing your model. Let’s use 80% of the images for training, and 20% for validation.
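
Conceptually, the 80/20 split boils down to a seeded shuffle of the file list followed by a slice. The sketch below illustrates the idea in plain numpy; it is a simplification for intuition, not the exact algorithm image_dataset_from_directory uses internally.

```python
import numpy as np

def train_val_split(files, val_fraction=0.2, seed=123):
    """Shuffle deterministically with a seed, then split off val_fraction."""
    rng = np.random.default_rng(seed)
    files = np.array(files)
    idx = rng.permutation(len(files))
    n_val = int(len(files) * val_fraction)
    val_idx, train_idx = idx[:n_val], idx[n_val:]
    return files[train_idx].tolist(), files[val_idx].tolist()

train_files, val_files = train_val_split([f"img_{i}.jpg" for i in range(10)])
print(len(train_files), len(val_files))  # 8 2
```

Because the same seed is passed to both the "training" and "validation" calls below, the two subsets are guaranteed not to overlap.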

train_ds = tf.keras.preprocessing.image_dataset_from_directory(
  data_dir,
  validation_split=0.2,
  subset="training",
  seed=123,
  image_size=(img_height, img_width),
  batch_size=batch_size)
Found 3670 files belonging to 5 classes.
Using 2936 files for training.
2024-02-10 01:12:08.217732: E tensorflow/compiler/xla/stream_executor/cuda/cuda_driver.cc:266] failed call to cuInit: CUDA_ERROR_COMPAT_NOT_SUPPORTED_ON_DEVICE: forward compatibility was attempted on non supported HW
2024-02-10 01:12:08.217763: I tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:168] retrieving CUDA diagnostic information for host: iotg-dev-workstation-07
2024-02-10 01:12:08.217767: I tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:175] hostname: iotg-dev-workstation-07
2024-02-10 01:12:08.217894: I tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:199] libcuda reported version is: 470.223.2
2024-02-10 01:12:08.217909: I tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:203] kernel reported version is: 470.182.3
2024-02-10 01:12:08.217913: E tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:312] kernel version 470.182.3 does not match DSO version 470.223.2 -- cannot find working devices in this configuration
val_ds = tf.keras.preprocessing.image_dataset_from_directory(
  data_dir,
  validation_split=0.2,
  subset="validation",
  seed=123,
  image_size=(img_height, img_width),
  batch_size=batch_size)
Found 3670 files belonging to 5 classes.
Using 734 files for validation.

You can find the class names in the class_names attribute on these datasets. These correspond to the directory names in alphabetical order.
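
The directory-to-class mapping can be mimicked with a few lines of plain Python. The helper below (a hypothetical infer_class_names, not part of Keras) shows why the class order is alphabetical: it is simply the sorted list of sub-directory names.

```python
import tempfile
from pathlib import Path

def infer_class_names(data_dir):
    """Mirror image_dataset_from_directory: class names are the
    sub-directory names, sorted alphabetically."""
    return sorted(p.name for p in Path(data_dir).iterdir() if p.is_dir())

# Demo with a throwaway directory tree shaped like flower_photos/.
root = Path(tempfile.mkdtemp())
for name in ["tulips", "daisy", "roses", "sunflowers", "dandelion"]:
    (root / name).mkdir()
print(infer_class_names(root))
# ['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']
```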

class_names = train_ds.class_names
print(class_names)
['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']

Visualize the Data

Here are the first 9 images from the training dataset.

plt.figure(figsize=(10, 10))
for images, labels in train_ds.take(1):
    for i in range(9):
        ax = plt.subplot(3, 3, i + 1)
        plt.imshow(images[i].numpy().astype("uint8"))
        plt.title(class_names[labels[i]])
        plt.axis("off")
2024-02-10 01:12:08.550492: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'Placeholder/_0' with dtype string and shape [2936]
     [[{{node Placeholder/_0}}]]
../_images/301-tensorflow-training-openvino-with-output_29_1.png

You will train a model using these datasets by passing them to model.fit in a moment. If you like, you can also manually iterate over the dataset and retrieve batches of images:

for image_batch, labels_batch in train_ds:
    print(image_batch.shape)
    print(labels_batch.shape)
    break
(32, 180, 180, 3)
(32,)
2024-02-10 01:12:09.380029: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'Placeholder/_4' with dtype int32 and shape [2936]
     [[{{node Placeholder/_4}}]]

The image_batch is a tensor of shape (32, 180, 180, 3): a batch of 32 images of shape 180x180x3 (the last dimension refers to the RGB color channels). The label_batch is a tensor of shape (32,); these are the labels corresponding to the 32 images.

You can call .numpy() on the image_batch and labels_batch tensors to convert them to a numpy.ndarray.

Configure the Dataset for Performance

Let’s make sure to use buffered prefetching so you can yield data from disk without having I/O become blocking. These are two important methods you should use when loading data.

Dataset.cache() keeps the images in memory after they’re loaded off disk during the first epoch. This will ensure the dataset does not become a bottleneck while training your model. If your dataset is too large to fit into memory, you can also use this method to create a performant on-disk cache.

Dataset.prefetch() overlaps data preprocessing and model execution while training.

Interested readers can learn more about both methods, as well as how to cache data to disk in the data performance guide.
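
The behavior of Dataset.cache() can be sketched with a tiny memoizing wrapper: the first epoch pays the loading cost, and later epochs replay from memory. This is an illustrative toy, not TensorFlow's implementation.

```python
class CachedSource:
    """First pass pulls from the (slow) loader and memoizes; later
    passes replay from memory -- the idea behind Dataset.cache()."""
    def __init__(self, load_fn, keys):
        self.load_fn, self.keys, self._cache = load_fn, keys, {}
        self.loads = 0  # count of real (expensive) loads

    def __iter__(self):
        for k in self.keys:
            if k not in self._cache:
                self.loads += 1
                self._cache[k] = self.load_fn(k)
            yield self._cache[k]

src = CachedSource(lambda k: k.upper(), ["a", "b", "c"])
list(src); list(src)   # iterate twice, as if for two epochs
print(src.loads)       # 3 -- only the first epoch hit the loader
```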

AUTOTUNE = tf.data.AUTOTUNE
train_ds = train_ds.cache().shuffle(1000).prefetch(buffer_size=AUTOTUNE)
val_ds = val_ds.cache().prefetch(buffer_size=AUTOTUNE)

Standardize the Data

The RGB channel values are in the [0, 255] range. This is not ideal for a neural network; in general you should seek to make your input values small. Here, you will standardize values to be in the [0, 1] range by using a Rescaling layer.

normalization_layer = layers.Rescaling(1./255)

Note: The Keras Preprocessing utilities and layers introduced in this section are currently experimental and may change.

There are two ways to use this layer. You can apply it to the dataset by calling map:

normalized_ds = train_ds.map(lambda x, y: (normalization_layer(x), y))
image_batch, labels_batch = next(iter(normalized_ds))
first_image = image_batch[0]
# Notice the pixels values are now in `[0,1]`.
print(np.min(first_image), np.max(first_image))
2024-02-10 01:12:09.568220: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'Placeholder/_4' with dtype int32 and shape [2936]
     [[{{node Placeholder/_4}}]]
0.0 1.0

Or, you can include the layer inside your model definition, which can simplify deployment. Let’s use the second approach here.

Note: you previously resized images using the image_size argument of image_dataset_from_directory. If you want to include the resizing logic in your model as well, you can use the Resizing layer.
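
As a rough illustration of what such a resizing layer computes, here is a minimal nearest-neighbor resize in plain numpy (the Keras Resizing layer defaults to bilinear interpolation, so this is a simplification):

```python
import numpy as np

def resize_nearest(img, out_h, out_w):
    """Minimal nearest-neighbor resize for an HxWxC image: map each
    output pixel back to its nearest source pixel by integer scaling."""
    h, w = img.shape[:2]
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return img[rows[:, None], cols]

img = np.arange(4 * 6 * 3, dtype=np.uint8).reshape(4, 6, 3)
print(resize_nearest(img, 180, 180).shape)  # (180, 180, 3)
```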

Create the Model

The model consists of three convolution blocks, each followed by a max pooling layer. On top of them is a fully connected layer with 128 units and a relu activation function. This model has not been tuned for high accuracy; the goal of this tutorial is to show a standard approach.

num_classes = 5

model = Sequential([
  layers.Rescaling(1./255, input_shape=(img_height, img_width, 3)),
  layers.Conv2D(16, 3, padding='same', activation='relu'),
  layers.MaxPooling2D(),
  layers.Conv2D(32, 3, padding='same', activation='relu'),
  layers.MaxPooling2D(),
  layers.Conv2D(64, 3, padding='same', activation='relu'),
  layers.MaxPooling2D(),
  layers.Flatten(),
  layers.Dense(128, activation='relu'),
  layers.Dense(num_classes)
])

Compile the Model

For this tutorial, choose the optimizers.Adam optimizer and losses.SparseCategoricalCrossentropy loss function. To view training and validation accuracy for each training epoch, pass the metrics argument.
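
To make the loss concrete: with from_logits=True, the loss applies a softmax to the raw model outputs and then takes the negative log-probability of the true class, averaged over the batch. A numpy sketch of this (not the Keras implementation):

```python
import numpy as np

def sparse_cce_from_logits(logits, labels):
    """SparseCategoricalCrossentropy(from_logits=True), sketched in numpy:
    log-softmax over the class axis, then the negative log-probability of
    each integer label, averaged over the batch."""
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

logits = np.array([[2.0, 0.5, 0.1], [0.2, 3.0, 0.3]])
labels = np.array([0, 1])
print(sparse_cce_from_logits(logits, labels))  # small, since both are correct
```

A confident correct prediction drives the loss toward zero, which is why no one-hot encoding of the labels is needed with the "sparse" variant.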

model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])

Model Summary

View all the layers of the network using the model’s summary method.

NOTE: This section is commented out for performance reasons. Feel free to uncomment these cells to compare the results.

# model.summary()

Train the Model

# epochs=10
# history = model.fit(
#   train_ds,
#   validation_data=val_ds,
#   epochs=epochs
# )

Visualize Training Results

Create plots of loss and accuracy on the training and validation sets.

# acc = history.history['accuracy']
# val_acc = history.history['val_accuracy']

# loss = history.history['loss']
# val_loss = history.history['val_loss']

# epochs_range = range(epochs)

# plt.figure(figsize=(8, 8))
# plt.subplot(1, 2, 1)
# plt.plot(epochs_range, acc, label='Training Accuracy')
# plt.plot(epochs_range, val_acc, label='Validation Accuracy')
# plt.legend(loc='lower right')
# plt.title('Training and Validation Accuracy')

# plt.subplot(1, 2, 2)
# plt.plot(epochs_range, loss, label='Training Loss')
# plt.plot(epochs_range, val_loss, label='Validation Loss')
# plt.legend(loc='upper right')
# plt.title('Training and Validation Loss')
# plt.show()

As you can see from the plots, training accuracy and validation accuracy are off by a large margin, and the model has achieved only around 60% accuracy on the validation set.

Let’s look at what went wrong and try to increase the overall performance of the model.

Overfitting

In the plots above, the training accuracy increases linearly over time, whereas the validation accuracy stalls around 60%. The noticeable gap between training and validation accuracy is a sign of overfitting.

When there are a small number of training examples, the model sometimes learns noise or unwanted details from the training examples, to the extent that this negatively impacts the performance of the model on new examples. This phenomenon is known as overfitting. It means that the model will have a difficult time generalizing to a new dataset.

There are multiple ways to fight overfitting in the training process. In this tutorial, you’ll use data augmentation and add Dropout to your model.

Data Augmentation

Overfitting generally occurs when there are a small number of training examples. Data augmentation takes the approach of generating additional training data from your existing examples by augmenting them using random transformations that yield believable-looking images. This helps expose the model to more aspects of the data and generalize better.
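
To build intuition for these random transformations, here is a plain-numpy sketch of a coin-flip horizontal mirror, the essence of a RandomFlip layer (the Keras layer additionally handles batches, seeds, and training/inference modes for you):

```python
import numpy as np

def random_flip_horizontal(img, rng):
    """With probability 0.5, mirror the image along its width axis."""
    return img[:, ::-1] if rng.random() < 0.5 else img

rng = np.random.default_rng(0)
img = np.arange(12).reshape(2, 6)
flips = sum(not np.array_equal(random_flip_horizontal(img, rng), img)
            for _ in range(1000))
print(flips)  # roughly half of the 1000 draws are flipped
```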

You will implement data augmentation using the Keras preprocessing layers RandomFlip, RandomRotation, and RandomZoom. These can be included inside your model like other layers, and run on the GPU.

data_augmentation = keras.Sequential(
  [
    layers.RandomFlip("horizontal",
                      input_shape=(img_height,
                                   img_width,
                                   3)),
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.1),
  ]
)

Let’s visualize what a few augmented examples look like by applying data augmentation to the same image several times:

plt.figure(figsize=(10, 10))
for images, _ in train_ds.take(1):
    for i in range(9):
        augmented_images = data_augmentation(images)
        ax = plt.subplot(3, 3, i + 1)
        plt.imshow(augmented_images[0].numpy().astype("uint8"))
        plt.axis("off")
2024-02-10 01:12:10.342151: I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'Placeholder/_0' with dtype string and shape [2936]
     [[{{node Placeholder/_0}}]]
../_images/301-tensorflow-training-openvino-with-output_57_1.png

You will use data augmentation to train a model in a moment.

Dropout

Another technique to reduce overfitting is to introduce Dropout to the network, a form of regularization.

When you apply Dropout to a layer, it randomly drops out (by setting the activation to zero) a number of output units from the layer during the training process. Dropout takes a fractional number as its input value, such as 0.1, 0.2, or 0.4. This means dropping out 10%, 20%, or 40% of the output units at random from the applied layer.
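
The mechanics can be sketched in a few lines of numpy. This is the standard "inverted dropout" formulation, in which surviving units are scaled up during training so that inference needs no rescaling; it is an illustration, not the Keras layer's actual code.

```python
import numpy as np

def dropout(x, rate, rng, training=True):
    """Inverted dropout: during training, zero a `rate` fraction of units
    and scale the survivors by 1/(1-rate); at inference, pass through."""
    if not training:
        return x
    keep = rng.random(x.shape) >= rate
    return np.where(keep, x / (1.0 - rate), 0.0)

rng = np.random.default_rng(0)
x = np.ones(10_000)
y = dropout(x, rate=0.2, rng=rng)
print((y == 0).mean())   # close to 0.2 -- fraction of units dropped
print(y.mean())          # close to 1.0 -- expected activation preserved
```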

Let’s create a new neural network using layers.Dropout, then train it using augmented images.

model = Sequential([
    data_augmentation,
    layers.Rescaling(1./255),
    layers.Conv2D(16, 3, padding='same', activation='relu'),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, padding='same', activation='relu'),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, padding='same', activation='relu'),
    layers.MaxPooling2D(),
    layers.Dropout(0.2),
    layers.Flatten(),
    layers.Dense(128, activation='relu'),
    layers.Dense(num_classes, name="outputs")
])

Compile and Train the Model

model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])
model.summary()
Model: "sequential_2"
_________________________________________________________________
 Layer (type)                   Output Shape              Param #
=================================================================
 sequential_1 (Sequential)      (None, 180, 180, 3)       0
 rescaling_2 (Rescaling)        (None, 180, 180, 3)       0
 conv2d_3 (Conv2D)              (None, 180, 180, 16)      448
 max_pooling2d_3 (MaxPooling2D) (None, 90, 90, 16)        0
 conv2d_4 (Conv2D)              (None, 90, 90, 32)        4640
 max_pooling2d_4 (MaxPooling2D) (None, 45, 45, 32)        0
 conv2d_5 (Conv2D)              (None, 45, 45, 64)        18496
 max_pooling2d_5 (MaxPooling2D) (None, 22, 22, 64)        0
 dropout (Dropout)              (None, 22, 22, 64)        0
 flatten_1 (Flatten)            (None, 30976)             0
 dense_2 (Dense)                (None, 128)               3965056
 outputs (Dense)                (None, 5)                 645
=================================================================
Total params: 3,989,285
Trainable params: 3,989,285
Non-trainable params: 0
_________________________________________________________________
epochs = 15
history = model.fit(
    train_ds,
    validation_data=val_ds,
    epochs=epochs
)
Epoch 1/15
92/92 [==============================] - 7s 67ms/step - loss: 1.3942 - accuracy: 0.3682 - val_loss: 1.2278 - val_accuracy: 0.4864

Epoch 2/15



68/92 [=====================>……..] - ETA: 1s - loss: 1.0897 - accuracy: 0.5517



69/92 [=====================>……..] - ETA: 1s - loss: 1.0857 - accuracy: 0.5523



70/92 [=====================>……..] - ETA: 1s - loss: 1.0831 - accuracy: 0.5538



71/92 [======================>…….] - ETA: 1s - loss: 1.0809 - accuracy: 0.5548



72/92 [======================>…….] - ETA: 1s - loss: 1.0771 - accuracy: 0.5575



73/92 [======================>…….] - ETA: 1s - loss: 1.0745 - accuracy: 0.5597



74/92 [=======================>……] - ETA: 1s - loss: 1.0762 - accuracy: 0.5593



75/92 [=======================>……] - ETA: 0s - loss: 1.0742 - accuracy: 0.5594



76/92 [=======================>……] - ETA: 0s - loss: 1.0738 - accuracy: 0.5586



77/92 [========================>…..] - ETA: 0s - loss: 1.0748 - accuracy: 0.5574



78/92 [========================>…..] - ETA: 0s - loss: 1.0728 - accuracy: 0.5579



79/92 [========================>…..] - ETA: 0s - loss: 1.0742 - accuracy: 0.5595



80/92 [=========================>….] - ETA: 0s - loss: 1.0765 - accuracy: 0.5588



81/92 [=========================>….] - ETA: 0s - loss: 1.0769 - accuracy: 0.5600



82/92 [=========================>….] - ETA: 0s - loss: 1.0754 - accuracy: 0.5596



83/92 [==========================>…] - ETA: 0s - loss: 1.0743 - accuracy: 0.5597



84/92 [==========================>…] - ETA: 0s - loss: 1.0725 - accuracy: 0.5619



85/92 [==========================>…] - ETA: 0s - loss: 1.0725 - accuracy: 0.5608



86/92 [===========================>..] - ETA: 0s - loss: 1.0737 - accuracy: 0.5601



87/92 [===========================>..] - ETA: 0s - loss: 1.0731 - accuracy: 0.5602



88/92 [===========================>..] - ETA: 0s - loss: 1.0762 - accuracy: 0.5580



89/92 [============================>.] - ETA: 0s - loss: 1.0749 - accuracy: 0.5588



90/92 [============================>.] - ETA: 0s - loss: 1.0765 - accuracy: 0.5578



91/92 [============================>.] - ETA: 0s - loss: 1.0753 - accuracy: 0.5579



92/92 [==============================] - ETA: 0s - loss: 1.0742 - accuracy: 0.5589



92/92 [==============================] - 6s 64ms/step - loss: 1.0742 - accuracy: 0.5589 - val_loss: 1.0685 - val_accuracy: 0.5627

Epoch 3/15
1/92 [..............................] - ETA: 7s - loss: 0.9481 - accuracy: 0.6250
   
2/92 [..............................] - ETA: 5s - loss: 1.0400 - accuracy: 0.5312
   
3/92 [..............................] - ETA: 5s - loss: 1.0083 - accuracy: 0.5833
   
4/92 [>.............................] - ETA: 5s - loss: 0.9945 - accuracy: 0.6016
   
5/92 [>.............................] - ETA: 5s - loss: 0.9999 - accuracy: 0.5875
   
6/92 [>.............................] - ETA: 4s - loss: 0.9974 - accuracy: 0.5833
   
7/92 [=>............................] - ETA: 4s - loss: 0.9819 - accuracy: 0.5982
   
8/92 [=>............................] - ETA: 4s - loss: 0.9862 - accuracy: 0.5938
   
9/92 [=>............................] - ETA: 4s - loss: 0.9919 - accuracy: 0.6007


10/92 [==>………………………] - ETA: 4s - loss: 0.9875 - accuracy: 0.6062



11/92 [==>………………………] - ETA: 4s - loss: 0.9882 - accuracy: 0.6080



12/92 [==>………………………] - ETA: 4s - loss: 0.9726 - accuracy: 0.6120



13/92 [===>……………………..] - ETA: 4s - loss: 0.9736 - accuracy: 0.6034



14/92 [===>……………………..] - ETA: 4s - loss: 0.9760 - accuracy: 0.5982



15/92 [===>……………………..] - ETA: 4s - loss: 0.9785 - accuracy: 0.5979



16/92 [====>…………………….] - ETA: 4s - loss: 0.9527 - accuracy: 0.6172



17/92 [====>…………………….] - ETA: 4s - loss: 0.9492 - accuracy: 0.6213



18/92 [====>…………………….] - ETA: 4s - loss: 0.9475 - accuracy: 0.6215



19/92 [=====>……………………] - ETA: 4s - loss: 0.9402 - accuracy: 0.6234



20/92 [=====>……………………] - ETA: 4s - loss: 0.9313 - accuracy: 0.6281



21/92 [=====>……………………] - ETA: 4s - loss: 0.9270 - accuracy: 0.6324



22/92 [======>…………………..] - ETA: 4s - loss: 0.9411 - accuracy: 0.6264



23/92 [======>…………………..] - ETA: 4s - loss: 0.9370 - accuracy: 0.6264



24/92 [======>…………………..] - ETA: 3s - loss: 0.9346 - accuracy: 0.6276



25/92 [=======>………………….] - ETA: 3s - loss: 0.9300 - accuracy: 0.6300



26/92 [=======>………………….] - ETA: 3s - loss: 0.9249 - accuracy: 0.6298



27/92 [=======>………………….] - ETA: 3s - loss: 0.9217 - accuracy: 0.6319



28/92 [========>…………………] - ETA: 3s - loss: 0.9224 - accuracy: 0.6283



29/92 [========>…………………] - ETA: 3s - loss: 0.9219 - accuracy: 0.6261



30/92 [========>…………………] - ETA: 3s - loss: 0.9185 - accuracy: 0.6250



31/92 [=========>………………..] - ETA: 3s - loss: 0.9241 - accuracy: 0.6220



32/92 [=========>………………..] - ETA: 3s - loss: 0.9325 - accuracy: 0.6182



33/92 [=========>………………..] - ETA: 3s - loss: 0.9336 - accuracy: 0.6231



34/92 [==========>……………….] - ETA: 3s - loss: 0.9345 - accuracy: 0.6241



35/92 [==========>……………….] - ETA: 3s - loss: 0.9392 - accuracy: 0.6232



36/92 [==========>……………….] - ETA: 3s - loss: 0.9411 - accuracy: 0.6224



37/92 [===========>………………] - ETA: 3s - loss: 0.9436 - accuracy: 0.6225



38/92 [===========>………………] - ETA: 3s - loss: 0.9506 - accuracy: 0.6201



39/92 [===========>………………] - ETA: 3s - loss: 0.9491 - accuracy: 0.6234



40/92 [============>……………..] - ETA: 3s - loss: 0.9481 - accuracy: 0.6227



41/92 [============>……………..] - ETA: 2s - loss: 0.9429 - accuracy: 0.6250



42/92 [============>……………..] - ETA: 2s - loss: 0.9385 - accuracy: 0.6272



43/92 [=============>…………….] - ETA: 2s - loss: 0.9361 - accuracy: 0.6279



44/92 [=============>…………….] - ETA: 2s - loss: 0.9339 - accuracy: 0.6271



45/92 [=============>…………….] - ETA: 2s - loss: 0.9319 - accuracy: 0.6278



46/92 [==============>……………] - ETA: 2s - loss: 0.9336 - accuracy: 0.6277



47/92 [==============>……………] - ETA: 2s - loss: 0.9366 - accuracy: 0.6270



48/92 [==============>……………] - ETA: 2s - loss: 0.9346 - accuracy: 0.6263



49/92 [==============>……………] - ETA: 2s - loss: 0.9313 - accuracy: 0.6276



50/92 [===============>…………..] - ETA: 2s - loss: 0.9317 - accuracy: 0.6263



51/92 [===============>…………..] - ETA: 2s - loss: 0.9299 - accuracy: 0.6256



52/92 [===============>…………..] - ETA: 2s - loss: 0.9252 - accuracy: 0.6274



53/92 [================>………….] - ETA: 2s - loss: 0.9252 - accuracy: 0.6268



54/92 [================>………….] - ETA: 2s - loss: 0.9259 - accuracy: 0.6279



55/92 [================>………….] - ETA: 2s - loss: 0.9261 - accuracy: 0.6273



56/92 [=================>…………] - ETA: 2s - loss: 0.9282 - accuracy: 0.6278



57/92 [=================>…………] - ETA: 2s - loss: 0.9275 - accuracy: 0.6283



58/92 [=================>…………] - ETA: 1s - loss: 0.9368 - accuracy: 0.6245



59/92 [==================>………..] - ETA: 1s - loss: 0.9373 - accuracy: 0.6229



60/92 [==================>………..] - ETA: 1s - loss: 0.9358 - accuracy: 0.6234



61/92 [==================>………..] - ETA: 1s - loss: 0.9381 - accuracy: 0.6224



62/92 [===================>……….] - ETA: 1s - loss: 0.9375 - accuracy: 0.6235



63/92 [===================>……….] - ETA: 1s - loss: 0.9379 - accuracy: 0.6240



64/92 [===================>……….] - ETA: 1s - loss: 0.9418 - accuracy: 0.6230



65/92 [====================>………] - ETA: 1s - loss: 0.9404 - accuracy: 0.6221



66/92 [====================>………] - ETA: 1s - loss: 0.9375 - accuracy: 0.6245



67/92 [====================>………] - ETA: 1s - loss: 0.9395 - accuracy: 0.6227



68/92 [=====================>……..] - ETA: 1s - loss: 0.9382 - accuracy: 0.6232



69/92 [=====================>……..] - ETA: 1s - loss: 0.9388 - accuracy: 0.6236



70/92 [=====================>……..] - ETA: 1s - loss: 0.9366 - accuracy: 0.6246



71/92 [======================>…….] - ETA: 1s - loss: 0.9352 - accuracy: 0.6250



72/92 [======================>…….] - ETA: 1s - loss: 0.9365 - accuracy: 0.6233



73/92 [======================>…….] - ETA: 1s - loss: 0.9340 - accuracy: 0.6250



74/92 [=======================>……] - ETA: 1s - loss: 0.9335 - accuracy: 0.6258



75/92 [=======================>……] - ETA: 0s - loss: 0.9347 - accuracy: 0.6254



76/92 [=======================>……] - ETA: 0s - loss: 0.9330 - accuracy: 0.6262



77/92 [========================>…..] - ETA: 0s - loss: 0.9301 - accuracy: 0.6291



78/92 [========================>…..] - ETA: 0s - loss: 0.9306 - accuracy: 0.6290



80/92 [=========================>….] - ETA: 0s - loss: 0.9272 - accuracy: 0.6313



81/92 [=========================>….] - ETA: 0s - loss: 0.9265 - accuracy: 0.6316



82/92 [=========================>….] - ETA: 0s - loss: 0.9277 - accuracy: 0.6307



83/92 [==========================>…] - ETA: 0s - loss: 0.9297 - accuracy: 0.6295



84/92 [==========================>…] - ETA: 0s - loss: 0.9302 - accuracy: 0.6299



85/92 [==========================>…] - ETA: 0s - loss: 0.9299 - accuracy: 0.6309



86/92 [===========================>..] - ETA: 0s - loss: 0.9320 - accuracy: 0.6301



87/92 [===========================>..] - ETA: 0s - loss: 0.9358 - accuracy: 0.6279



88/92 [===========================>..] - ETA: 0s - loss: 0.9352 - accuracy: 0.6289



89/92 [============================>.] - ETA: 0s - loss: 0.9329 - accuracy: 0.6292



90/92 [============================>.] - ETA: 0s - loss: 0.9313 - accuracy: 0.6295



91/92 [============================>.] - ETA: 0s - loss: 0.9285 - accuracy: 0.6309



92/92 [==============================] - ETA: 0s - loss: 0.9316 - accuracy: 0.6291



92/92 [==============================] - 6s 64ms/step - loss: 0.9316 - accuracy: 0.6291 - val_loss: 0.8948 - val_accuracy: 0.6540

Epoch 4/15
1/92 [..............................] - ETA: 6s - loss: 0.9437 - accuracy: 0.6250
   
2/92 [..............................] - ETA: 5s - loss: 0.9400 - accuracy: 0.6406
   
3/92 [..............................] - ETA: 5s - loss: 1.0213 - accuracy: 0.6146
   
4/92 [>.............................] - ETA: 5s - loss: 1.0144 - accuracy: 0.6016
   
5/92 [>.............................] - ETA: 5s - loss: 0.9660 - accuracy: 0.6250
   
6/92 [>.............................] - ETA: 5s - loss: 0.9476 - accuracy: 0.6510
   
7/92 [=>............................] - ETA: 4s - loss: 0.9413 - accuracy: 0.6518
   
8/92 [=>............................] - ETA: 4s - loss: 0.9381 - accuracy: 0.6484
   
9/92 [=>............................] - ETA: 4s - loss: 0.9303 - accuracy: 0.6389


10/92 [==>………………………] - ETA: 4s - loss: 0.9133 - accuracy: 0.6469



11/92 [==>………………………] - ETA: 4s - loss: 0.9262 - accuracy: 0.6420



12/92 [==>………………………] - ETA: 4s - loss: 0.9131 - accuracy: 0.6458



13/92 [===>……………………..] - ETA: 4s - loss: 0.9015 - accuracy: 0.6538



14/92 [===>……………………..] - ETA: 4s - loss: 0.8883 - accuracy: 0.6585



15/92 [===>……………………..] - ETA: 4s - loss: 0.8822 - accuracy: 0.6562



16/92 [====>…………………….] - ETA: 4s - loss: 0.8838 - accuracy: 0.6602



17/92 [====>…………………….] - ETA: 4s - loss: 0.9114 - accuracy: 0.6489



18/92 [====>…………………….] - ETA: 4s - loss: 0.9011 - accuracy: 0.6510



19/92 [=====>……………………] - ETA: 4s - loss: 0.8960 - accuracy: 0.6530



20/92 [=====>……………………] - ETA: 4s - loss: 0.8969 - accuracy: 0.6516



21/92 [=====>……………………] - ETA: 4s - loss: 0.8903 - accuracy: 0.6503



22/92 [======>…………………..] - ETA: 4s - loss: 0.8889 - accuracy: 0.6477



23/92 [======>…………………..] - ETA: 4s - loss: 0.8916 - accuracy: 0.6467



24/92 [======>…………………..] - ETA: 3s - loss: 0.8925 - accuracy: 0.6471



25/92 [=======>………………….] - ETA: 3s - loss: 0.8842 - accuracy: 0.6513



26/92 [=======>………………….] - ETA: 3s - loss: 0.8855 - accuracy: 0.6502



27/92 [=======>………………….] - ETA: 3s - loss: 0.8771 - accuracy: 0.6539



28/92 [========>…………………] - ETA: 3s - loss: 0.8767 - accuracy: 0.6518



29/92 [========>…………………] - ETA: 3s - loss: 0.8698 - accuracy: 0.6541



30/92 [========>…………………] - ETA: 3s - loss: 0.8710 - accuracy: 0.6542



31/92 [=========>………………..] - ETA: 3s - loss: 0.8748 - accuracy: 0.6532



32/92 [=========>………………..] - ETA: 3s - loss: 0.8705 - accuracy: 0.6543



33/92 [=========>………………..] - ETA: 3s - loss: 0.8692 - accuracy: 0.6562



34/92 [==========>……………….] - ETA: 3s - loss: 0.8665 - accuracy: 0.6599



36/92 [==========>……………….] - ETA: 3s - loss: 0.8706 - accuracy: 0.6617



37/92 [===========>………………] - ETA: 3s - loss: 0.8676 - accuracy: 0.6624



38/92 [===========>………………] - ETA: 3s - loss: 0.8652 - accuracy: 0.6623



39/92 [===========>………………] - ETA: 3s - loss: 0.8623 - accuracy: 0.6629



40/92 [============>……………..] - ETA: 3s - loss: 0.8594 - accuracy: 0.6651



41/92 [============>……………..] - ETA: 2s - loss: 0.8566 - accuracy: 0.6672



42/92 [============>……………..] - ETA: 2s - loss: 0.8507 - accuracy: 0.6707



43/92 [=============>…………….] - ETA: 2s - loss: 0.8462 - accuracy: 0.6718



44/92 [=============>…………….] - ETA: 2s - loss: 0.8465 - accuracy: 0.6693



45/92 [=============>…………….] - ETA: 2s - loss: 0.8456 - accuracy: 0.6690



46/92 [==============>……………] - ETA: 2s - loss: 0.8501 - accuracy: 0.6653



47/92 [==============>……………] - ETA: 2s - loss: 0.8534 - accuracy: 0.6651



48/92 [==============>……………] - ETA: 2s - loss: 0.8511 - accuracy: 0.6656



49/92 [==============>……………] - ETA: 2s - loss: 0.8530 - accuracy: 0.6654



50/92 [===============>…………..] - ETA: 2s - loss: 0.8499 - accuracy: 0.6658



51/92 [===============>…………..] - ETA: 2s - loss: 0.8518 - accuracy: 0.6644



52/92 [===============>…………..] - ETA: 2s - loss: 0.8521 - accuracy: 0.6636



53/92 [================>………….] - ETA: 2s - loss: 0.8538 - accuracy: 0.6629



54/92 [================>………….] - ETA: 2s - loss: 0.8603 - accuracy: 0.6605



55/92 [================>………….] - ETA: 2s - loss: 0.8678 - accuracy: 0.6592



56/92 [=================>…………] - ETA: 2s - loss: 0.8674 - accuracy: 0.6592



57/92 [=================>…………] - ETA: 2s - loss: 0.8660 - accuracy: 0.6591



58/92 [=================>…………] - ETA: 1s - loss: 0.8648 - accuracy: 0.6607



59/92 [==================>………..] - ETA: 1s - loss: 0.8631 - accuracy: 0.6617



60/92 [==================>………..] - ETA: 1s - loss: 0.8645 - accuracy: 0.6595



61/92 [==================>………..] - ETA: 1s - loss: 0.8637 - accuracy: 0.6600



62/92 [===================>……….] - ETA: 1s - loss: 0.8635 - accuracy: 0.6594



63/92 [===================>……….] - ETA: 1s - loss: 0.8610 - accuracy: 0.6599



64/92 [===================>……….] - ETA: 1s - loss: 0.8574 - accuracy: 0.6623



65/92 [====================>………] - ETA: 1s - loss: 0.8597 - accuracy: 0.6612



66/92 [====================>………] - ETA: 1s - loss: 0.8600 - accuracy: 0.6602



67/92 [====================>………] - ETA: 1s - loss: 0.8582 - accuracy: 0.6606



68/92 [=====================>……..] - ETA: 1s - loss: 0.8573 - accuracy: 0.6601



69/92 [=====================>……..] - ETA: 1s - loss: 0.8575 - accuracy: 0.6600



70/92 [=====================>……..] - ETA: 1s - loss: 0.8576 - accuracy: 0.6604



71/92 [======================>…….] - ETA: 1s - loss: 0.8571 - accuracy: 0.6617



72/92 [======================>…….] - ETA: 1s - loss: 0.8587 - accuracy: 0.6598



73/92 [======================>…….] - ETA: 1s - loss: 0.8593 - accuracy: 0.6598



74/92 [=======================>……] - ETA: 1s - loss: 0.8549 - accuracy: 0.6610



75/92 [=======================>……] - ETA: 0s - loss: 0.8569 - accuracy: 0.6610



76/92 [=======================>……] - ETA: 0s - loss: 0.8579 - accuracy: 0.6605



77/92 [========================>…..] - ETA: 0s - loss: 0.8619 - accuracy: 0.6584



78/92 [========================>…..] - ETA: 0s - loss: 0.8620 - accuracy: 0.6580



79/92 [========================>…..] - ETA: 0s - loss: 0.8613 - accuracy: 0.6583



80/92 [=========================>….] - ETA: 0s - loss: 0.8616 - accuracy: 0.6591



81/92 [=========================>….] - ETA: 0s - loss: 0.8579 - accuracy: 0.6602



82/92 [=========================>….] - ETA: 0s - loss: 0.8597 - accuracy: 0.6594



83/92 [==========================>…] - ETA: 0s - loss: 0.8589 - accuracy: 0.6597



84/92 [==========================>…] - ETA: 0s - loss: 0.8584 - accuracy: 0.6601



85/92 [==========================>…] - ETA: 0s - loss: 0.8609 - accuracy: 0.6578



86/92 [===========================>..] - ETA: 0s - loss: 0.8626 - accuracy: 0.6567



87/92 [===========================>..] - ETA: 0s - loss: 0.8617 - accuracy: 0.6574



88/92 [===========================>..] - ETA: 0s - loss: 0.8618 - accuracy: 0.6571



89/92 [============================>.] - ETA: 0s - loss: 0.8622 - accuracy: 0.6553



90/92 [============================>.] - ETA: 0s - loss: 0.8624 - accuracy: 0.6542



91/92 [============================>.] - ETA: 0s - loss: 0.8621 - accuracy: 0.6553



92/92 [==============================] - ETA: 0s - loss: 0.8626 - accuracy: 0.6550



92/92 [==============================] - 6s 64ms/step - loss: 0.8626 - accuracy: 0.6550 - val_loss: 0.8452 - val_accuracy: 0.6540

Epoch 5/15
1/92 [..............................] - ETA: 7s - loss: 0.6427 - accuracy: 0.7188
   
2/92 [..............................] - ETA: 5s - loss: 0.6912 - accuracy: 0.7188
   
3/92 [..............................] - ETA: 5s - loss: 0.7137 - accuracy: 0.6875
   
4/92 [>.............................] - ETA: 5s - loss: 0.7362 - accuracy: 0.6875
   
5/92 [>.............................] - ETA: 5s - loss: 0.7649 - accuracy: 0.6812
   
6/92 [>.............................] - ETA: 5s - loss: 0.8160 - accuracy: 0.6562
   
7/92 [=>............................] - ETA: 4s - loss: 0.8130 - accuracy: 0.6562
   
8/92 [=>............................] - ETA: 4s - loss: 0.7964 - accuracy: 0.6680
   
9/92 [=>............................] - ETA: 4s - loss: 0.8136 - accuracy: 0.6632


10/92 [==>………………………] - ETA: 4s - loss: 0.8277 - accuracy: 0.6562



11/92 [==>………………………] - ETA: 4s - loss: 0.8489 - accuracy: 0.6477



12/92 [==>………………………] - ETA: 4s - loss: 0.8382 - accuracy: 0.6510



13/92 [===>……………………..] - ETA: 4s - loss: 0.8242 - accuracy: 0.6611



14/92 [===>……………………..] - ETA: 4s - loss: 0.8200 - accuracy: 0.6607



15/92 [===>……………………..] - ETA: 4s - loss: 0.8231 - accuracy: 0.6562



16/92 [====>…………………….] - ETA: 4s - loss: 0.8219 - accuracy: 0.6602



17/92 [====>…………………….] - ETA: 4s - loss: 0.8155 - accuracy: 0.6654



18/92 [====>…………………….] - ETA: 4s - loss: 0.8175 - accuracy: 0.6667



19/92 [=====>……………………] - ETA: 4s - loss: 0.8136 - accuracy: 0.6727



20/92 [=====>……………………] - ETA: 4s - loss: 0.8088 - accuracy: 0.6719



21/92 [=====>……………………] - ETA: 4s - loss: 0.7999 - accuracy: 0.6771



22/92 [======>…………………..] - ETA: 4s - loss: 0.8036 - accuracy: 0.6747



23/92 [======>…………………..] - ETA: 4s - loss: 0.8034 - accuracy: 0.6780



24/92 [======>…………………..] - ETA: 3s - loss: 0.8009 - accuracy: 0.6797



25/92 [=======>………………….] - ETA: 3s - loss: 0.7954 - accuracy: 0.6837



26/92 [=======>………………….] - ETA: 3s - loss: 0.7901 - accuracy: 0.6875



27/92 [=======>………………….] - ETA: 3s - loss: 0.7908 - accuracy: 0.6852



28/92 [========>…………………] - ETA: 3s - loss: 0.7907 - accuracy: 0.6864



29/92 [========>…………………] - ETA: 3s - loss: 0.7913 - accuracy: 0.6864



30/92 [========>…………………] - ETA: 3s - loss: 0.7988 - accuracy: 0.6823



31/92 [=========>………………..] - ETA: 3s - loss: 0.8004 - accuracy: 0.6804



32/92 [=========>………………..] - ETA: 3s - loss: 0.8030 - accuracy: 0.6797



33/92 [=========>………………..] - ETA: 3s - loss: 0.7989 - accuracy: 0.6809



34/92 [==========>……………….] - ETA: 3s - loss: 0.8025 - accuracy: 0.6792



35/92 [==========>……………….] - ETA: 3s - loss: 0.7999 - accuracy: 0.6786



36/92 [==========>……………….] - ETA: 3s - loss: 0.7912 - accuracy: 0.6832



37/92 [===========>………………] - ETA: 3s - loss: 0.7929 - accuracy: 0.6833



38/92 [===========>………………] - ETA: 3s - loss: 0.7941 - accuracy: 0.6826



39/92 [===========>………………] - ETA: 3s - loss: 0.7943 - accuracy: 0.6843



40/92 [============>……………..] - ETA: 3s - loss: 0.8040 - accuracy: 0.6828



41/92 [============>……………..] - ETA: 2s - loss: 0.8066 - accuracy: 0.6822



42/92 [============>……………..] - ETA: 2s - loss: 0.8043 - accuracy: 0.6838



43/92 [=============>…………….] - ETA: 2s - loss: 0.8004 - accuracy: 0.6860



44/92 [=============>…………….] - ETA: 2s - loss: 0.7954 - accuracy: 0.6875



45/92 [=============>…………….] - ETA: 2s - loss: 0.7981 - accuracy: 0.6875



46/92 [==============>……………] - ETA: 2s - loss: 0.7944 - accuracy: 0.6889



47/92 [==============>……………] - ETA: 2s - loss: 0.7967 - accuracy: 0.6875



48/92 [==============>……………] - ETA: 2s - loss: 0.7933 - accuracy: 0.6888



49/92 [==============>……………] - ETA: 2s - loss: 0.7877 - accuracy: 0.6913



50/92 [===============>…………..] - ETA: 2s - loss: 0.7844 - accuracy: 0.6919



51/92 [===============>…………..] - ETA: 2s - loss: 0.7897 - accuracy: 0.6893



53/92 [================>………….] - ETA: 2s - loss: 0.7869 - accuracy: 0.6902



54/92 [================>………….] - ETA: 2s - loss: 0.7861 - accuracy: 0.6913



55/92 [================>………….] - ETA: 2s - loss: 0.7849 - accuracy: 0.6912



56/92 [=================>…………] - ETA: 2s - loss: 0.7852 - accuracy: 0.6917



57/92 [=================>…………] - ETA: 2s - loss: 0.7868 - accuracy: 0.6916



58/92 [=================>…………] - ETA: 1s - loss: 0.7890 - accuracy: 0.6916



59/92 [==================>………..] - ETA: 1s - loss: 0.7928 - accuracy: 0.6910



60/92 [==================>………..] - ETA: 1s - loss: 0.7903 - accuracy: 0.6930



61/92 [==================>………..] - ETA: 1s - loss: 0.7904 - accuracy: 0.6939



62/92 [===================>……….] - ETA: 1s - loss: 0.7863 - accuracy: 0.6969



63/92 [===================>……….] - ETA: 1s - loss: 0.7856 - accuracy: 0.6977



64/92 [===================>……….] - ETA: 1s - loss: 0.7854 - accuracy: 0.6971



65/92 [====================>………] - ETA: 1s - loss: 0.7943 - accuracy: 0.6940



66/92 [====================>………] - ETA: 1s - loss: 0.7963 - accuracy: 0.6939



67/92 [====================>………] - ETA: 1s - loss: 0.8017 - accuracy: 0.6910



68/92 [=====================>……..] - ETA: 1s - loss: 0.8011 - accuracy: 0.6923



69/92 [=====================>……..] - ETA: 1s - loss: 0.8030 - accuracy: 0.6918



70/92 [=====================>……..] - ETA: 1s - loss: 0.8016 - accuracy: 0.6913



71/92 [======================>…….] - ETA: 1s - loss: 0.8007 - accuracy: 0.6917



72/92 [======================>…….] - ETA: 1s - loss: 0.8018 - accuracy: 0.6908



73/92 [======================>…….] - ETA: 1s - loss: 0.8020 - accuracy: 0.6920



74/92 [=======================>……] - ETA: 1s - loss: 0.8055 - accuracy: 0.6903



75/92 [=======================>……] - ETA: 0s - loss: 0.8092 - accuracy: 0.6894



76/92 [=======================>……] - ETA: 0s - loss: 0.8083 - accuracy: 0.6914



77/92 [========================>…..] - ETA: 0s - loss: 0.8063 - accuracy: 0.6930



78/92 [========================>…..] - ETA: 0s - loss: 0.8076 - accuracy: 0.6929



79/92 [========================>…..] - ETA: 0s - loss: 0.8080 - accuracy: 0.6925



80/92 [=========================>….] - ETA: 0s - loss: 0.8081 - accuracy: 0.6932



81/92 [=========================>….] - ETA: 0s - loss: 0.8083 - accuracy: 0.6916



82/92 [=========================>….] - ETA: 0s - loss: 0.8071 - accuracy: 0.6919



83/92 [==========================>…] - ETA: 0s - loss: 0.8083 - accuracy: 0.6915



84/92 [==========================>…] - ETA: 0s - loss: 0.8077 - accuracy: 0.6914



85/92 [==========================>…] - ETA: 0s - loss: 0.8083 - accuracy: 0.6910



86/92 [===========================>..] - ETA: 0s - loss: 0.8087 - accuracy: 0.6902



87/92 [===========================>..] - ETA: 0s - loss: 0.8088 - accuracy: 0.6891



88/92 [===========================>..] - ETA: 0s - loss: 0.8067 - accuracy: 0.6898



89/92 [============================>.] - ETA: 0s - loss: 0.8064 - accuracy: 0.6887



90/92 [============================>.] - ETA: 0s - loss: 0.8070 - accuracy: 0.6891



91/92 [============================>.] - ETA: 0s - loss: 0.8046 - accuracy: 0.6901



92/92 [==============================] - ETA: 0s - loss: 0.8027 - accuracy: 0.6911



92/92 [==============================] - 6s 64ms/step - loss: 0.8027 - accuracy: 0.6911 - val_loss: 0.9385 - val_accuracy: 0.6540

Epoch 6/15
1/92 [..............................] - ETA: 7s - loss: 0.8309 - accuracy: 0.6875
   
2/92 [..............................] - ETA: 5s - loss: 0.8032 - accuracy: 0.7031
   
3/92 [..............................] - ETA: 5s - loss: 0.8544 - accuracy: 0.6979
   
4/92 [>.............................] - ETA: 5s - loss: 0.7712 - accuracy: 0.7422
   
5/92 [>.............................] - ETA: 5s - loss: 0.7464 - accuracy: 0.7375
   
6/92 [>.............................] - ETA: 5s - loss: 0.7709 - accuracy: 0.7396
   
7/92 [=>............................] - ETA: 4s - loss: 0.7587 - accuracy: 0.7411
   
8/92 [=>............................] - ETA: 4s - loss: 0.8016 - accuracy: 0.7227
   
9/92 [=>............................] - ETA: 4s - loss: 0.7806 - accuracy: 0.7153


10/92 [==>………………………] - ETA: 4s - loss: 0.7750 - accuracy: 0.7188



11/92 [==>………………………] - ETA: 4s - loss: 0.7794 - accuracy: 0.7159



12/92 [==>………………………] - ETA: 4s - loss: 0.7689 - accuracy: 0.7161



13/92 [===>……………………..] - ETA: 4s - loss: 0.7657 - accuracy: 0.7115



14/92 [===>……………………..] - ETA: 4s - loss: 0.7456 - accuracy: 0.7165



15/92 [===>……………………..] - ETA: 4s - loss: 0.7359 - accuracy: 0.7188



16/92 [====>…………………….] - ETA: 4s - loss: 0.7275 - accuracy: 0.7207



17/92 [====>…………………….] - ETA: 4s - loss: 0.7406 - accuracy: 0.7151



18/92 [====>…………………….] - ETA: 4s - loss: 0.7406 - accuracy: 0.7135



19/92 [=====>……………………] - ETA: 4s - loss: 0.7385 - accuracy: 0.7138



20/92 [=====>……………………] - ETA: 4s - loss: 0.7366 - accuracy: 0.7141



21/92 [=====>……………………] - ETA: 4s - loss: 0.7355 - accuracy: 0.7143



22/92 [======>…………………..] - ETA: 4s - loss: 0.7315 - accuracy: 0.7145



23/92 [======>…………………..] - ETA: 4s - loss: 0.7283 - accuracy: 0.7160



24/92 [======>…………………..] - ETA: 3s - loss: 0.7425 - accuracy: 0.7070



25/92 [=======>………………….] - ETA: 3s - loss: 0.7453 - accuracy: 0.7088



26/92 [=======>………………….] - ETA: 3s - loss: 0.7423 - accuracy: 0.7151



27/92 [=======>………………….] - ETA: 3s - loss: 0.7448 - accuracy: 0.7118



28/92 [========>…………………] - ETA: 3s - loss: 0.7458 - accuracy: 0.7121



29/92 [========>…………………] - ETA: 3s - loss: 0.7452 - accuracy: 0.7144



30/92 [========>…………………] - ETA: 3s - loss: 0.7466 - accuracy: 0.7135



31/92 [=========>………………..] - ETA: 3s - loss: 0.7409 - accuracy: 0.7157



32/92 [=========>………………..] - ETA: 3s - loss: 0.7407 - accuracy: 0.7168



33/92 [=========>………………..] - ETA: 3s - loss: 0.7351 - accuracy: 0.7197



34/92 [==========>……………….] - ETA: 3s - loss: 0.7291 - accuracy: 0.7215



35/92 [==========>……………….] - ETA: 3s - loss: 0.7285 - accuracy: 0.7223



36/92 [==========>……………….] - ETA: 3s - loss: 0.7361 - accuracy: 0.7188



37/92 [===========>………………] - ETA: 3s - loss: 0.7333 - accuracy: 0.7204



38/92 [===========>………………] - ETA: 3s - loss: 0.7315 - accuracy: 0.7204



39/92 [===========>………………] - ETA: 3s - loss: 0.7309 - accuracy: 0.7204



40/92 [============>……………..] - ETA: 3s - loss: 0.7319 - accuracy: 0.7211



41/92 [============>……………..] - ETA: 2s - loss: 0.7360 - accuracy: 0.7180



42/92 [============>……………..] - ETA: 2s - loss: 0.7382 - accuracy: 0.7173



43/92 [=============>…………….] - ETA: 2s - loss: 0.7401 - accuracy: 0.7151



44/92 [=============>…………….] - ETA: 2s - loss: 0.7342 - accuracy: 0.7180



45/92 [=============>…………….] - ETA: 2s - loss: 0.7370 - accuracy: 0.7174



46/92 [==============>……………] - ETA: 2s - loss: 0.7370 - accuracy: 0.7174



47/92 [==============>……………] - ETA: 2s - loss: 0.7349 - accuracy: 0.7201



48/92 [==============>……………] - ETA: 2s - loss: 0.7335 - accuracy: 0.7181



49/92 [==============>……………] - ETA: 2s - loss: 0.7344 - accuracy: 0.7162



50/92 [===============>…………..] - ETA: 2s - loss: 0.7317 - accuracy: 0.7163



51/92 [===============>…………..] - ETA: 2s - loss: 0.7332 - accuracy: 0.7157



52/92 [===============>…………..] - ETA: 2s - loss: 0.7387 - accuracy: 0.7133



53/92 [================>………….] - ETA: 2s - loss: 0.7450 - accuracy: 0.7117



92/92 [==============================] - 6s 64ms/step - loss: 0.7571 - accuracy: 0.7115 - val_loss: 0.7898 - val_accuracy: 0.6757

Epoch 7/15
92/92 [==============================] - 6s 64ms/step - loss: 0.7101 - accuracy: 0.7279 - val_loss: 0.7492 - val_accuracy: 0.7125

Epoch 8/15
92/92 [==============================] - 6s 64ms/step - loss: 0.7008 - accuracy: 0.7285 - val_loss: 0.7422 - val_accuracy: 0.7139

Epoch 9/15
92/92 [==============================] - 6s 64ms/step - loss: 0.6513 - accuracy: 0.7636 - val_loss: 0.7100 - val_accuracy: 0.7166

Epoch 10/15