Author: David Griffiths
Date created: 2020/05/25
Last modified: 2024/01/09
Description: Implementation of PointNet for ModelNet10 classification.
View in Colab • GitHub source
Classification, detection and segmentation of unordered 3D point sets, i.e. point clouds, is a core problem in computer vision. This example implements the seminal point cloud deep learning paper PointNet (Qi et al., 2017). For a detailed introduction to PointNet, see this blog post.
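A point cloud has no canonical ordering, so a classifier should give the same answer no matter how its input points are permuted. PointNet achieves this by applying a shared per-point function and aggregating with a symmetric max pooling operation. The small NumPy sketch below is illustrative only (it is not part of the PointNet model defined later): it shows that a max-pooled per-point feature is unchanged when the points are shuffled.

import numpy as np

rng = np.random.default_rng(0)
points = rng.normal(size=(2048, 3))                    # a toy point cloud
features = np.tanh(points @ rng.normal(size=(3, 16)))  # a shared per-point transform

pooled = features.max(axis=0)                          # symmetric aggregation over points
pooled_shuffled = features[rng.permutation(2048)].max(axis=0)
assert np.allclose(pooled, pooled_shuffled)            # same global feature for any point order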
If using Colab, first install trimesh with !pip install trimesh.
import os
import glob
import trimesh
import numpy as np
from tensorflow import data as tf_data
from keras import ops
import keras
from keras import layers
from matplotlib import pyplot as plt
keras.utils.set_random_seed(seed=42)
We use the ModelNet10 model dataset, the smaller 10-class version of the ModelNet40 dataset. First download the data:
DATA_DIR = keras.utils.get_file(
"modelnet.zip",
"http://3dvision.princeton.edu/projects/2014/3DShapeNets/ModelNet10.zip",
extract=True,
)
DATA_DIR = os.path.join(os.path.dirname(DATA_DIR), "ModelNet10")
Downloading data from http://3dvision.princeton.edu/projects/2014/3DShapeNets/ModelNet10.zip
 473402300/473402300 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
We can use the trimesh package to read and visualize the .off mesh files.
mesh = trimesh.load(os.path.join(DATA_DIR, "chair/train/chair_0001.off"))
mesh.show()
To convert a mesh file to a point cloud we first need to sample points on the mesh surface. The .sample() method performs a uniform random sampling. Here we sample at 2048 locations and visualize the result in matplotlib.
points = mesh.sample(2048)
fig = plt.figure(figsize=(5, 5))
ax = fig.add_subplot(111, projection="3d")
ax.scatter(points[:, 0], points[:, 1], points[:, 2])
ax.set_axis_off()
plt.show()
To generate a tf.data.Dataset() we first need to parse the ModelNet data folders. Each mesh is loaded and sampled into a point cloud before being added to a standard Python list and converted to a NumPy array. We also store the current enumerate index value as the object label and use a dictionary to recall this later.
def parse_dataset(num_points=2048):
    train_points = []
    train_labels = []
    test_points = []
    test_labels = []
    class_map = {}
    folders = glob.glob(os.path.join(DATA_DIR, "[!README]*"))

    for i, folder in enumerate(folders):
        print("processing class: {}".format(os.path.basename(folder)))
        # store folder name with ID so we can retrieve later
        class_map[i] = folder.split("/")[-1]
        # gather all files
        train_files = glob.glob(os.path.join(folder, "train/*"))
        test_files = glob.glob(os.path.join(folder, "test/*"))

        for f in train_files:
            train_points.append(trimesh.load(f).sample(num_points))
            train_labels.append(i)

        for f in test_files:
            test_points.append(trimesh.load(f).sample(num_points))
            test_labels.append(i)

    return (
        np.array(train_points),
        np.array(test_points),
        np.array(train_labels),
        np.array(test_labels),
        class_map,
    )
Set the number of points to sample and the batch size, then parse the dataset. This can take ~5 minutes to complete.
NUM_POINTS = 2048
NUM_CLASSES = 10
BATCH_SIZE = 32
train_points, test_points, train_labels, test_labels, CLASS_MAP = parse_dataset(
NUM_POINTS
)
processing class: bathtub
processing class: monitor
processing class: desk
processing class: dresser
processing class: toilet
processing class: bed
processing class: sofa
processing class: chair
processing class: night_stand
processing class: table
Our data can now be read into a tf.data.Dataset() object. We set the shuffle buffer size to the entire size of the dataset, since prior to shuffling the data is ordered by class. Data augmentation is important when working with point cloud data. We create an augmentation function to jitter and shuffle the train dataset.
def augment(points, label):
    # jitter points
    points += keras.random.uniform(points.shape, -0.005, 0.005, dtype="float64")
    # shuffle points
    points = keras.random.shuffle(points)
    return points, label
train_size = 0.8
dataset = tf_data.Dataset.from_tensor_slices((train_points, train_labels))
test_dataset = tf_data.Dataset.from_tensor_slices((test_points, test_labels))
train_dataset_size = int(len(dataset) * train_size)
dataset = dataset.shuffle(len(train_points)).map(augment)
test_dataset = test_dataset.shuffle(len(test_points)).batch(BATCH_SIZE)
train_dataset = dataset.take(train_dataset_size).batch(BATCH_SIZE)
validation_dataset = dataset.skip(train_dataset_size).batch(BATCH_SIZE)
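As an optional sanity check (a short sketch, not part of the original example), you can pull a single batch from the training pipeline and confirm that each example has shape (NUM_POINTS, 3) together with an integer class label that can be looked up in CLASS_MAP:

points_batch, labels_batch = next(iter(train_dataset))
print(points_batch.shape)  # (BATCH_SIZE, NUM_POINTS, 3), e.g. (32, 2048, 3)
print(labels_batch[:5])    # integer class IDs; CLASS_MAP maps them back to folder names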
Each convolution and fully-connected layer (except for the final layers) consists of Convolution / Dense -> Batch Normalization -> ReLU Activation.
def conv_bn(x, filters):
    x = layers.Conv1D(filters, kernel_size=1, padding="valid")(x)
    x = layers.BatchNormalization(momentum=0.0)(x)
    return layers.Activation("relu")(x)


def dense_bn(x, filters):
    x = layers.Dense(filters)(x)
    x = layers.BatchNormalization(momentum=0.0)(x)
    return layers.Activation("relu")(x)
PointNet consists of two core components: the primary MLP network and the transformation network (T-net). The T-net learns an affine transformation matrix with its own mini network. The T-net is used twice: first to transform the input points (n, 3) into a canonical representation, and a second time for alignment in feature space (here (n, 32), since in this implementation the feature transform is applied after the 32-filter convolutions). As per the original paper we constrain the transformation to be close to an orthogonal matrix (i.e. ||X*X^T - I|| = 0).
class OrthogonalRegularizer(keras.regularizers.Regularizer):
    def __init__(self, num_features, l2reg=0.001):
        self.num_features = num_features
        self.l2reg = l2reg
        self.eye = ops.eye(num_features)

    def __call__(self, x):
        x = ops.reshape(x, (-1, self.num_features, self.num_features))
        xxt = ops.tensordot(x, x, axes=(2, 2))
        xxt = ops.reshape(xxt, (-1, self.num_features, self.num_features))
        return ops.sum(self.l2reg * ops.square(xxt - self.eye))
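To get a feel for this regularizer, here is a minimal illustrative check (a sketch on a single sample, not part of the original example): a flattened identity matrix, which is perfectly orthogonal, incurs a penalty of ~0, while a non-orthogonal matrix is penalized.

reg_check = OrthogonalRegularizer(num_features=3)
identity_flat = np.eye(3, dtype="float32").reshape(1, -1)         # orthogonal: X X^T = I
scaled_flat = (2.0 * np.eye(3, dtype="float32")).reshape(1, -1)   # not orthogonal: X X^T = 4I
print(float(reg_check(identity_flat)))  # ~0.0
print(float(reg_check(scaled_flat)))    # > 0 (0.027 with the default l2reg)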
We can then define a general function to build T-net layers.
def tnet(inputs, num_features):
    # Initialise bias as the identity matrix
    bias = keras.initializers.Constant(np.eye(num_features).flatten())
    reg = OrthogonalRegularizer(num_features)

    x = conv_bn(inputs, 32)
    x = conv_bn(x, 64)
    x = conv_bn(x, 512)
    x = layers.GlobalMaxPooling1D()(x)
    x = dense_bn(x, 256)
    x = dense_bn(x, 128)
    x = layers.Dense(
        num_features * num_features,
        kernel_initializer="zeros",
        bias_initializer=bias,
        activity_regularizer=reg,
    )(x)
    feat_T = layers.Reshape((num_features, num_features))(x)
    # Apply affine transformation to input features
    return layers.Dot(axes=(2, 1))([inputs, feat_T])
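Because the final Dense layer is zero-initialised with an identity bias, an untrained T-net starts out as the identity transform: the predicted matrix is exactly the identity, so the input points pass through unchanged. A quick optional check (a sketch, not part of the original example):

check_inputs = keras.Input(shape=(NUM_POINTS, 3))
check_model = keras.Model(check_inputs, tnet(check_inputs, 3))
dummy = np.random.rand(1, NUM_POINTS, 3).astype("float32")
print(np.allclose(check_model.predict(dummy, verbose=0), dummy, atol=1e-5))  # True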
The main network can then be implemented in the same manner, with the T-net mini models dropped in as layers in the graph. Here we replicate the network architecture published in the original paper, but with half the number of weights at each layer as we are using the smaller 10-class ModelNet dataset.
inputs = keras.Input(shape=(NUM_POINTS, 3))
x = tnet(inputs, 3)
x = conv_bn(x, 32)
x = conv_bn(x, 32)
x = tnet(x, 32)
x = conv_bn(x, 32)
x = conv_bn(x, 64)
x = conv_bn(x, 512)
x = layers.GlobalMaxPooling1D()(x)
x = dense_bn(x, 256)
x = layers.Dropout(0.3)(x)
x = dense_bn(x, 128)
x = layers.Dropout(0.3)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = keras.Model(inputs=inputs, outputs=outputs, name="pointnet")
model.summary()
Model: "pointnet"
(Layer-by-layer summary table omitted: it lists the input T-net, the 32-feature T-net and the classification head, all built from the Conv1D / Dense -> BatchNormalization -> ReLU blocks defined above.)
Total params: 748,979 (2.86 MB)
Trainable params: 742,899 (2.83 MB)
Non-trainable params: 6,080 (23.75 KB)
Once the model is defined it can be trained like any other standard classification model using .compile() and .fit().
model.compile(
loss="sparse_categorical_crossentropy",
optimizer=keras.optimizers.Adam(learning_rate=0.001),
metrics=["sparse_categorical_accuracy"],
)
model.fit(train_dataset, epochs=20, validation_data=validation_dataset)
Epoch 1/20
1/100 [37mββββββββββββββββββββ 16:59 10s/step - loss: 70.7465 - sparse_categorical_accuracy: 0.2188
2/100 [37mββββββββββββββββββββ 2:06 1s/step - loss: 69.8872 - sparse_categorical_accuracy: 0.1953
3/100 [37mββββββββββββββββββββ 2:00 1s/step - loss: 69.4798 - sparse_categorical_accuracy: 0.1823
4/100 [37mββββββββββββββββββββ 1:57 1s/step - loss: 68.7454 - sparse_categorical_accuracy: 0.1719
5/100 β[37mβββββββββββββββββββ 1:53 1s/step - loss: 67.8508 - sparse_categorical_accuracy: 0.1700
6/100 β[37mβββββββββββββββββββ 1:50 1s/step - loss: 67.0352 - sparse_categorical_accuracy: 0.1703
7/100 β[37mβββββββββββββββββββ 1:47 1s/step - loss: 66.3409 - sparse_categorical_accuracy: 0.1702
8/100 β[37mβββββββββββββββββββ 1:45 1s/step - loss: 65.5973 - sparse_categorical_accuracy: 0.1734
9/100 β[37mβββββββββββββββββββ 1:43 1s/step - loss: 64.8169 - sparse_categorical_accuracy: 0.1761
10/100 ββ[37mββββββββββββββββββ 1:41 1s/step - loss: 64.0699 - sparse_categorical_accuracy: 0.1769
11/100 ββ[37mββββββββββββββββββ 1:39 1s/step - loss: 63.3220 - sparse_categorical_accuracy: 0.1779
12/100 ββ[37mββββββββββββββββββ 1:38 1s/step - loss: 62.6677 - sparse_categorical_accuracy: 0.1776
13/100 ββ[37mββββββββββββββββββ 1:36 1s/step - loss: 62.0234 - sparse_categorical_accuracy: 0.1778
14/100 ββ[37mββββββββββββββββββ 1:35 1s/step - loss: 61.4256 - sparse_categorical_accuracy: 0.1774
15/100 βββ[37mβββββββββββββββββ 1:34 1s/step - loss: 60.8435 - sparse_categorical_accuracy: 0.1772
16/100 βββ[37mβββββββββββββββββ 1:32 1s/step - loss: 60.2982 - sparse_categorical_accuracy: 0.1771
17/100 βββ[37mβββββββββββββββββ 1:31 1s/step - loss: 59.7788 - sparse_categorical_accuracy: 0.1773
18/100 βββ[37mβββββββββββββββββ 1:29 1s/step - loss: 59.2792 - sparse_categorical_accuracy: 0.1777
19/100 βββ[37mβββββββββββββββββ 1:28 1s/step - loss: 58.7959 - sparse_categorical_accuracy: 0.1782
20/100 ββββ[37mββββββββββββββββ 1:27 1s/step - loss: 58.3345 - sparse_categorical_accuracy: 0.1787
21/100 ββββ[37mββββββββββββββββ 1:25 1s/step - loss: 57.8916 - sparse_categorical_accuracy: 0.1794
22/100 ββββ[37mββββββββββββββββ 1:24 1s/step - loss: 57.4650 - sparse_categorical_accuracy: 0.1803
23/100 ββββ[37mββββββββββββββββ 1:23 1s/step - loss: 57.0690 - sparse_categorical_accuracy: 0.1811
24/100 ββββ[37mββββββββββββββββ 1:22 1s/step - loss: 56.6876 - sparse_categorical_accuracy: 0.1819
25/100 βββββ[37mβββββββββββββββ 1:20 1s/step - loss: 56.3285 - sparse_categorical_accuracy: 0.1827
26/100 βββββ[37mβββββββββββββββ 1:19 1s/step - loss: 55.9864 - sparse_categorical_accuracy: 0.1834
27/100 βββββ[37mβββββββββββββββ 1:18 1s/step - loss: 55.6550 - sparse_categorical_accuracy: 0.1843
28/100 βββββ[37mβββββββββββββββ 1:17 1s/step - loss: 55.3351 - sparse_categorical_accuracy: 0.1852
29/100 βββββ[37mβββββββββββββββ 1:16 1s/step - loss: 55.0261 - sparse_categorical_accuracy: 0.1863
30/100 ββββββ[37mββββββββββββββ 1:15 1s/step - loss: 54.7329 - sparse_categorical_accuracy: 0.1872
31/100 ββββββ[37mββββββββββββββ 1:13 1s/step - loss: 54.4503 - sparse_categorical_accuracy: 0.1882
32/100 ββββββ[37mββββββββββββββ 1:12 1s/step - loss: 54.1778 - sparse_categorical_accuracy: 0.1891
33/100 ββββββ[37mββββββββββββββ 1:11 1s/step - loss: 53.9170 - sparse_categorical_accuracy: 0.1900
34/100 ββββββ[37mββββββββββββββ 1:10 1s/step - loss: 53.6651 - sparse_categorical_accuracy: 0.1909
35/100 βββββββ[37mβββββββββββββ 1:09 1s/step - loss: 53.4239 - sparse_categorical_accuracy: 0.1916
36/100 βββββββ[37mβββββββββββββ 1:08 1s/step - loss: 53.1926 - sparse_categorical_accuracy: 0.1922
37/100 βββββββ[37mβββββββββββββ 1:07 1s/step - loss: 52.9695 - sparse_categorical_accuracy: 0.1929
38/100 βββββββ[37mβββββββββββββ 1:05 1s/step - loss: 52.7542 - sparse_categorical_accuracy: 0.1935
39/100 βββββββ[37mβββββββββββββ 1:04 1s/step - loss: 52.5469 - sparse_categorical_accuracy: 0.1940
40/100 ββββββββ[37mββββββββββββ 1:03 1s/step - loss: 52.3461 - sparse_categorical_accuracy: 0.1946
41/100 ββββββββ[37mββββββββββββ 1:02 1s/step - loss: 52.1509 - sparse_categorical_accuracy: 0.1950
42/100 ββββββββ[37mββββββββββββ 1:01 1s/step - loss: 51.9608 - sparse_categorical_accuracy: 0.1955
43/100 ββββββββ[37mββββββββββββ 1:00 1s/step - loss: 51.7759 - sparse_categorical_accuracy: 0.1960
44/100 ββββββββ[37mββββββββββββ 59s 1s/step - loss: 51.5960 - sparse_categorical_accuracy: 0.1966
45/100 βββββββββ[37mβββββββββββ 58s 1s/step - loss: 51.4224 - sparse_categorical_accuracy: 0.1971
46/100 βββββββββ[37mβββββββββββ 57s 1s/step - loss: 51.2539 - sparse_categorical_accuracy: 0.1976
47/100 βββββββββ[37mβββββββββββ 56s 1s/step - loss: 51.0897 - sparse_categorical_accuracy: 0.1982
48/100 βββββββββ[37mβββββββββββ 55s 1s/step - loss: 50.9300 - sparse_categorical_accuracy: 0.1987
49/100 βββββββββ[37mβββββββββββ 54s 1s/step - loss: 50.7742 - sparse_categorical_accuracy: 0.1992
50/100 ββββββββββ[37mββββββββββ 52s 1s/step - loss: 50.6223 - sparse_categorical_accuracy: 0.1997
51/100 ββββββββββ[37mββββββββββ 51s 1s/step - loss: 50.4747 - sparse_categorical_accuracy: 0.2001
52/100 ββββββββββ[37mββββββββββ 50s 1s/step - loss: 50.3312 - sparse_categorical_accuracy: 0.2006
53/100 ββββββββββ[37mββββββββββ 49s 1s/step - loss: 50.1910 - sparse_categorical_accuracy: 0.2011
54/100 ββββββββββ[37mββββββββββ 48s 1s/step - loss: 50.0539 - sparse_categorical_accuracy: 0.2017
55/100 βββββββββββ[37mβββββββββ 47s 1s/step - loss: 49.9200 - sparse_categorical_accuracy: 0.2022
56/100 βββββββββββ[37mβββββββββ 46s 1s/step - loss: 49.7896 - sparse_categorical_accuracy: 0.2027
57/100 βββββββββββ[37mβββββββββ 45s 1s/step - loss: 49.6620 - sparse_categorical_accuracy: 0.2032
58/100 βββββββββββ[37mβββββββββ 44s 1s/step - loss: 49.5372 - sparse_categorical_accuracy: 0.2037
59/100 βββββββββββ[37mβββββββββ 43s 1s/step - loss: 49.4152 - sparse_categorical_accuracy: 0.2041
60/100 ββββββββββββ[37mββββββββ 42s 1s/step - loss: 49.2957 - sparse_categorical_accuracy: 0.2046
61/100 ββββββββββββ[37mββββββββ 41s 1s/step - loss: 49.1790 - sparse_categorical_accuracy: 0.2050
62/100 ββββββββββββ[37mββββββββ 40s 1s/step - loss: 49.0646 - sparse_categorical_accuracy: 0.2054
63/100 ββββββββββββ[37mββββββββ 39s 1s/step - loss: 48.9525 - sparse_categorical_accuracy: 0.2058
64/100 ββββββββββββ[37mββββββββ 37s 1s/step - loss: 48.8427 - sparse_categorical_accuracy: 0.2062
65/100 βββββββββββββ[37mβββββββ 36s 1s/step - loss: 48.7353 - sparse_categorical_accuracy: 0.2065
66/100 βββββββββββββ[37mβββββββ 35s 1s/step - loss: 48.6299 - sparse_categorical_accuracy: 0.2069
67/100 βββββββββββββ[37mβββββββ 34s 1s/step - loss: 48.5266 - sparse_categorical_accuracy: 0.2072
68/100 βββββββββββββ[37mβββββββ 33s 1s/step - loss: 48.4277 - sparse_categorical_accuracy: 0.2075
69/100 βββββββββββββ[37mβββββββ 32s 1s/step - loss: 48.3308 - sparse_categorical_accuracy: 0.2078
70/100 ββββββββββββββ[37mββββββ 31s 1s/step - loss: 48.2357 - sparse_categorical_accuracy: 0.2081
71/100 ββββββββββββββ[37mββββββ 30s 1s/step - loss: 48.1423 - sparse_categorical_accuracy: 0.2084
72/100 ββββββββββββββ[37mββββββ 29s 1s/step - loss: 48.0505 - sparse_categorical_accuracy: 0.2087
73/100 ββββββββββββββ[37mββββββ 28s 1s/step - loss: 47.9604 - sparse_categorical_accuracy: 0.2090
74/100 ββββββββββββββ[37mββββββ 27s 1s/step - loss: 47.8719 - sparse_categorical_accuracy: 0.2093
75/100 βββββββββββββββ[37mβββββ 26s 1s/step - loss: 47.7852 - sparse_categorical_accuracy: 0.2096
76/100 βββββββββββββββ[37mβββββ 25s 1s/step - loss: 47.7000 - sparse_categorical_accuracy: 0.2098
77/100 βββββββββββββββ[37mβββββ 24s 1s/step - loss: 47.6164 - sparse_categorical_accuracy: 0.2101
78/100 βββββββββββββββ[37mβββββ 23s 1s/step - loss: 47.5342 - sparse_categorical_accuracy: 0.2104
79/100 βββββββββββββββ[37mβββββ 22s 1s/step - loss: 47.4536 - sparse_categorical_accuracy: 0.2106
80/100 ββββββββββββββββ[37mββββ 21s 1s/step - loss: 47.3744 - sparse_categorical_accuracy: 0.2109
81/100 ββββββββββββββββ[37mββββ 19s 1s/step - loss: 47.2967 - sparse_categorical_accuracy: 0.2112
82/100 ββββββββββββββββ[37mββββ 18s 1s/step - loss: 47.2202 - sparse_categorical_accuracy: 0.2114
83/100 ββββββββββββββββ[37mββββ 17s 1s/step - loss: 47.1450 - sparse_categorical_accuracy: 0.2117
84/100 ββββββββββββββββ[37mββββ 16s 1s/step - loss: 47.0711 - sparse_categorical_accuracy: 0.2119
85/100 βββββββββββββββββ[37mβββ 15s 1s/step - loss: 46.9984 - sparse_categorical_accuracy: 0.2122
86/100 βββββββββββββββββ[37mβββ 14s 1s/step - loss: 46.9270 - sparse_categorical_accuracy: 0.2124
87/100 βββββββββββββββββ[37mβββ 13s 1s/step - loss: 46.8568 - sparse_categorical_accuracy: 0.2126
88/100 βββββββββββββββββ[37mβββ 12s 1s/step - loss: 46.7877 - sparse_categorical_accuracy: 0.2129
89/100 βββββββββββββββββ[37mβββ 11s 1s/step - loss: 46.7196 - sparse_categorical_accuracy: 0.2131
90/100 ββββββββββββββββββ[37mββ 10s 1s/step - loss: 46.6525 - sparse_categorical_accuracy: 0.2133
91/100 ββββββββββββββββββ[37mββ 9s 1s/step - loss: 46.5865 - sparse_categorical_accuracy: 0.2135
92/100 ββββββββββββββββββ[37mββ 8s 1s/step - loss: 46.5215 - sparse_categorical_accuracy: 0.2137
93/100 ββββββββββββββββββ[37mββ 7s 1s/step - loss: 46.4574 - sparse_categorical_accuracy: 0.2139
94/100 ββββββββββββββββββ[37mββ 6s 1s/step - loss: 46.3946 - sparse_categorical_accuracy: 0.2141
95/100 βββββββββββββββββββ[37mβ 5s 1s/step - loss: 46.3327 - sparse_categorical_accuracy: 0.2143
96/100 βββββββββββββββββββ[37mβ 4s 1s/step - loss: 46.2717 - sparse_categorical_accuracy: 0.2145
97/100 βββββββββββββββββββ[37mβ 3s 1s/step - loss: 46.2115 - sparse_categorical_accuracy: 0.2147
98/100 βββββββββββββββββββ[37mβ 2s 1s/step - loss: 46.1522 - sparse_categorical_accuracy: 0.2149
99/100 βββββββββββββββββββ[37mβ 1s 1s/step - loss: 46.0937 - sparse_categorical_accuracy: 0.2151
100/100 ββββββββββββββββββββ 0s 1s/step - loss: 46.0345 - sparse_categorical_accuracy: 0.2154
100/100 ββββββββββββββββββββ 119s 1s/step - loss: 45.9764 - sparse_categorical_accuracy: 0.2156 - val_loss: 4122951.0000 - val_sparse_categorical_accuracy: 0.3154
Epoch 2/20
1/100 [37mββββββββββββββββββββ 1:44 1s/step - loss: 36.7920 - sparse_categorical_accuracy: 0.2500
2/100 [37mββββββββββββββββββββ 1:42 1s/step - loss: 36.8501 - sparse_categorical_accuracy: 0.2188
3/100 [37mββββββββββββββββββββ 1:39 1s/step - loss: 36.8194 - sparse_categorical_accuracy: 0.2049
4/100 [37mββββββββββββββββββββ 1:37 1s/step - loss: 36.7948 - sparse_categorical_accuracy: 0.1947
5/100 β[37mβββββββββββββββββββ 1:35 1s/step - loss: 36.7802 - sparse_categorical_accuracy: 0.1907
6/100 β[37mβββββββββββββββββββ 1:34 1s/step - loss: 36.7761 - sparse_categorical_accuracy: 0.1911
7/100 β[37mβββββββββββββββββββ 1:33 1s/step - loss: 36.7720 - sparse_categorical_accuracy: 0.1937
8/100 β[37mβββββββββββββββββββ 1:33 1s/step - loss: 36.7660 - sparse_categorical_accuracy: 0.1964
9/100 β[37mβββββββββββββββββββ 1:32 1s/step - loss: 36.7617 - sparse_categorical_accuracy: 0.1977
10/100 ββ[37mββββββββββββββββββ 1:30 1s/step - loss: 36.7567 - sparse_categorical_accuracy: 0.1992
11/100 ββ[37mββββββββββββββββββ 1:30 1s/step - loss: 36.7558 - sparse_categorical_accuracy: 0.2007
12/100 ββ[37mββββββββββββββββββ 1:29 1s/step - loss: 36.7534 - sparse_categorical_accuracy: 0.2022
13/100 ββ[37mββββββββββββββββββ 1:28 1s/step - loss: 36.7539 - sparse_categorical_accuracy: 0.2033
14/100 ββ[37mββββββββββββββββββ 1:27 1s/step - loss: 36.7521 - sparse_categorical_accuracy: 0.2049
15/100 βββ[37mβββββββββββββββββ 1:26 1s/step - loss: 36.7500 - sparse_categorical_accuracy: 0.2064
16/100 βββ[37mβββββββββββββββββ 1:25 1s/step - loss: 36.7464 - sparse_categorical_accuracy: 0.2087
17/100 βββ[37mβββββββββββββββββ 1:25 1s/step - loss: 36.7410 - sparse_categorical_accuracy: 0.2116
18/100 βββ[37mβββββββββββββββββ 1:24 1s/step - loss: 36.7356 - sparse_categorical_accuracy: 0.2138
19/100 βββ[37mβββββββββββββββββ 1:23 1s/step - loss: 36.7314 - sparse_categorical_accuracy: 0.2157
20/100 ββββ[37mββββββββββββββββ 1:21 1s/step - loss: 36.7275 - sparse_categorical_accuracy: 0.2178
21/100 ββββ[37mββββββββββββββββ 1:20 1s/step - loss: 36.7235 - sparse_categorical_accuracy: 0.2196
22/100 ββββ[37mββββββββββββββββ 1:19 1s/step - loss: 36.7189 - sparse_categorical_accuracy: 0.2218
23/100 ββββ[37mββββββββββββββββ 1:18 1s/step - loss: 36.7141 - sparse_categorical_accuracy: 0.2241
24/100 ββββ[37mββββββββββββββββ 1:17 1s/step - loss: 36.7087 - sparse_categorical_accuracy: 0.2262
25/100 βββββ[37mβββββββββββββββ 1:16 1s/step - loss: 36.7027 - sparse_categorical_accuracy: 0.2283
26/100 βββββ[37mβββββββββββββββ 1:15 1s/step - loss: 36.6970 - sparse_categorical_accuracy: 0.2303
27/100 βββββ[37mβββββββββββββββ 1:14 1s/step - loss: 36.6911 - sparse_categorical_accuracy: 0.2325
28/100 βββββ[37mβββββββββββββββ 1:13 1s/step - loss: 36.6862 - sparse_categorical_accuracy: 0.2342
29/100 βββββ[37mβββββββββββββββ 1:12 1s/step - loss: 36.6818 - sparse_categorical_accuracy: 0.2357
30/100 ββββββ[37mββββββββββββββ 1:11 1s/step - loss: 36.6766 - sparse_categorical_accuracy: 0.2372
31/100 ββββββ[37mββββββββββββββ 1:10 1s/step - loss: 36.6717 - sparse_categorical_accuracy: 0.2387
32/100 ββββββ[37mββββββββββββββ 1:09 1s/step - loss: 36.6670 - sparse_categorical_accuracy: 0.2403
33/100 ββββββ[37mββββββββββββββ 1:08 1s/step - loss: 36.6629 - sparse_categorical_accuracy: 0.2418
34/100 ββββββ[37mββββββββββββββ 1:07 1s/step - loss: 36.6591 - sparse_categorical_accuracy: 0.2431
35/100 βββββββ[37mβββββββββββββ 1:06 1s/step - loss: 36.6551 - sparse_categorical_accuracy: 0.2444
36/100 βββββββ[37mβββββββββββββ 1:05 1s/step - loss: 36.6513 - sparse_categorical_accuracy: 0.2456
37/100 βββββββ[37mβββββββββββββ 1:04 1s/step - loss: 36.6478 - sparse_categorical_accuracy: 0.2467
38/100 βββββββ[37mβββββββββββββ 1:03 1s/step - loss: 36.6441 - sparse_categorical_accuracy: 0.2477
39/100 βββββββ[37mβββββββββββββ 1:02 1s/step - loss: 36.6405 - sparse_categorical_accuracy: 0.2487
40/100 ββββββββ[37mββββββββββββ 1:01 1s/step - loss: 36.6368 - sparse_categorical_accuracy: 0.2497
41/100 ββββββββ[37mββββββββββββ 1:00 1s/step - loss: 36.6331 - sparse_categorical_accuracy: 0.2507
42/100 ββββββββ[37mββββββββββββ 59s 1s/step - loss: 36.6330 - sparse_categorical_accuracy: 0.2515
43/100 ββββββββ[37mββββββββββββ 58s 1s/step - loss: 36.6330 - sparse_categorical_accuracy: 0.2523
44/100 ββββββββ[37mββββββββββββ 57s 1s/step - loss: 36.6331 - sparse_categorical_accuracy: 0.2531
45/100 βββββββββ[37mβββββββββββ 56s 1s/step - loss: 36.6330 - sparse_categorical_accuracy: 0.2538
46/100 βββββββββ[37mβββββββββββ 55s 1s/step - loss: 36.6330 - sparse_categorical_accuracy: 0.2546
47/100 βββββββββ[37mβββββββββββ 54s 1s/step - loss: 36.6330 - sparse_categorical_accuracy: 0.2554
48/100 βββββββββ[37mβββββββββββ 53s 1s/step - loss: 36.6330 - sparse_categorical_accuracy: 0.2561
49/100 βββββββββ[37mβββββββββββ 52s 1s/step - loss: 36.6331 - sparse_categorical_accuracy: 0.2568
50/100 ββββββββββ[37mββββββββββ 51s 1s/step - loss: 36.6331 - sparse_categorical_accuracy: 0.2575
51/100 ββββββββββ[37mββββββββββ 50s 1s/step - loss: 36.6332 - sparse_categorical_accuracy: 0.2582
52/100 ββββββββββ[37mββββββββββ 49s 1s/step - loss: 36.6332 - sparse_categorical_accuracy: 0.2588
53/100 ββββββββββ[37mββββββββββ 48s 1s/step - loss: 36.6331 - sparse_categorical_accuracy: 0.2594
54/100 ββββββββββ[37mββββββββββ 47s 1s/step - loss: 36.6330 - sparse_categorical_accuracy: 0.2600
55/100 βββββββββββ[37mβββββββββ 46s 1s/step - loss: 36.6329 - sparse_categorical_accuracy: 0.2606
56/100 βββββββββββ[37mβββββββββ 45s 1s/step - loss: 36.6331 - sparse_categorical_accuracy: 0.2612
57/100 βββββββββββ[37mβββββββββ 44s 1s/step - loss: 36.6332 - sparse_categorical_accuracy: 0.2618
58/100 βββββββββββ[37mβββββββββ 43s 1s/step - loss: 36.6332 - sparse_categorical_accuracy: 0.2624
59/100 βββββββββββ[37mβββββββββ 42s 1s/step - loss: 36.6331 - sparse_categorical_accuracy: 0.2630
60/100 ββββββββββββ[37mββββββββ 41s 1s/step - loss: 36.6331 - sparse_categorical_accuracy: 0.2636
61/100 ββββββββββββ[37mββββββββ 40s 1s/step - loss: 36.6330 - sparse_categorical_accuracy: 0.2641
62/100 ββββββββββββ[37mββββββββ 39s 1s/step - loss: 36.6330 - sparse_categorical_accuracy: 0.2646
63/100 ββββββββββββ[37mββββββββ 38s 1s/step - loss: 36.6329 - sparse_categorical_accuracy: 0.2652
64/100 ββββββββββββ[37mββββββββ 37s 1s/step - loss: 36.6329 - sparse_categorical_accuracy: 0.2657
65/100 βββββββββββββ[37mβββββββ 36s 1s/step - loss: 36.6330 - sparse_categorical_accuracy: 0.2662
66/100 βββββββββββββ[37mβββββββ 35s 1s/step - loss: 36.6332 - sparse_categorical_accuracy: 0.2667
67/100 βββββββββββββ[37mβββββββ 34s 1s/step - loss: 36.6336 - sparse_categorical_accuracy: 0.2671
68/100 βββββββββββββ[37mβββββββ 33s 1s/step - loss: 36.6340 - sparse_categorical_accuracy: 0.2674
69/100 βββββββββββββ[37mβββββββ 32s 1s/step - loss: 36.6346 - sparse_categorical_accuracy: 0.2678
70/100 ββββββββββββββ[37mββββββ 30s 1s/step - loss: 36.6352 - sparse_categorical_accuracy: 0.2682
71/100 ββββββββββββββ[37mββββββ 29s 1s/step - loss: 36.6359 - sparse_categorical_accuracy: 0.2685
72/100 ββββββββββββββ[37mββββββ 28s 1s/step - loss: 36.6365 - sparse_categorical_accuracy: 0.2688
73/100 ββββββββββββββ[37mββββββ 27s 1s/step - loss: 36.6371 - sparse_categorical_accuracy: 0.2690
74/100 ββββββββββββββ[37mββββββ 26s 1s/step - loss: 36.6377 - sparse_categorical_accuracy: 0.2693
75/100 βββββββββββββββ[37mβββββ 25s 1s/step - loss: 36.6384 - sparse_categorical_accuracy: 0.2696
76/100 βββββββββββββββ[37mβββββ 24s 1s/step - loss: 36.6389 - sparse_categorical_accuracy: 0.2698
77/100 βββββββββββββββ[37mβββββ 23s 1s/step - loss: 36.6394 - sparse_categorical_accuracy: 0.2700
78/100 βββββββββββββββ[37mβββββ 22s 1s/step - loss: 36.6398 - sparse_categorical_accuracy: 0.2703
79/100 βββββββββββββββ[37mβββββ 21s 1s/step - loss: 36.6401 - sparse_categorical_accuracy: 0.2706
80/100 ββββββββββββββββ[37mββββ 20s 1s/step - loss: 36.6406 - sparse_categorical_accuracy: 0.2708
81/100 ββββββββββββββββ[37mββββ 19s 1s/step - loss: 36.6411 - sparse_categorical_accuracy: 0.2710
82/100 ββββββββββββββββ[37mββββ 18s 1s/step - loss: 36.6415 - sparse_categorical_accuracy: 0.2712
83/100 ββββββββββββββββ[37mββββ 17s 1s/step - loss: 36.6419 - sparse_categorical_accuracy: 0.2714
84/100 ββββββββββββββββ[37mββββ 16s 1s/step - loss: 36.6423 - sparse_categorical_accuracy: 0.2716
85/100 βββββββββββββββββ[37mβββ 15s 1s/step - loss: 36.6426 - sparse_categorical_accuracy: 0.2718
86/100 βββββββββββββββββ[37mβββ 14s 1s/step - loss: 36.6429 - sparse_categorical_accuracy: 0.2720
87/100 βββββββββββββββββ[37mβββ 13s 1s/step - loss: 36.6431 - sparse_categorical_accuracy: 0.2723
88/100 βββββββββββββββββ[37mβββ 12s 1s/step - loss: 36.6432 - sparse_categorical_accuracy: 0.2725
89/100 βββββββββββββββββ[37mβββ 11s 1s/step - loss: 36.6433 - sparse_categorical_accuracy: 0.2727
90/100 ββββββββββββββββββ[37mββ 10s 1s/step - loss: 36.6434 - sparse_categorical_accuracy: 0.2730
91/100 ββββββββββββββββββ[37mββ 9s 1s/step - loss: 36.6435 - sparse_categorical_accuracy: 0.2732
92/100 ββββββββββββββββββ[37mββ 8s 1s/step - loss: 36.6435 - sparse_categorical_accuracy: 0.2734
93/100 ββββββββββββββββββ[37mββ 7s 1s/step - loss: 36.6434 - sparse_categorical_accuracy: 0.2736
94/100 ββββββββββββββββββ[37mββ 6s 1s/step - loss: 36.6432 - sparse_categorical_accuracy: 0.2738
95/100 βββββββββββββββββββ[37mβ 5s 1s/step - loss: 36.6430 - sparse_categorical_accuracy: 0.2740
96/100 βββββββββββββββββββ[37mβ 4s 1s/step - loss: 36.6427 - sparse_categorical_accuracy: 0.2742
97/100 βββββββββββββββββββ[37mβ 3s 1s/step - loss: 36.6424 - sparse_categorical_accuracy: 0.2744
98/100 βββββββββββββββββββ[37mβ 2s 1s/step - loss: 36.6421 - sparse_categorical_accuracy: 0.2746
99/100 βββββββββββββββββββ[37mβ 1s 1s/step - loss: 36.6418 - sparse_categorical_accuracy: 0.2748
100/100 ββββββββββββββββββββ 0s 1s/step - loss: 36.6402 - sparse_categorical_accuracy: 0.2749
100/100 ββββββββββββββββββββ 108s 1s/step - loss: 36.6386 - sparse_categorical_accuracy: 0.2751 - val_loss: 20961250112658389073920.0000 - val_sparse_categorical_accuracy: 0.3191
Epoch 3/20
1/100 [37mββββββββββββββββββββ 57:33 35s/step - loss: 35.9745 - sparse_categorical_accuracy: 0.3438
2/100 [37mββββββββββββββββββββ 1:39 1s/step - loss: 36.1432 - sparse_categorical_accuracy: 0.3359
3/100 [37mββββββββββββββββββββ 1:38 1s/step - loss: 36.1628 - sparse_categorical_accuracy: 0.3420
4/100 [37mββββββββββββββββββββ 1:39 1s/step - loss: 36.1912 - sparse_categorical_accuracy: 0.3424
5/100 β[37mβββββββββββββββββββ 1:38 1s/step - loss: 36.2222 - sparse_categorical_accuracy: 0.3390
6/100 β[37mβββββββββββββββββββ 1:37 1s/step - loss: 36.2318 - sparse_categorical_accuracy: 0.3345
7/100 β[37mβββββββββββββββββββ 1:36 1s/step - loss: 36.2484 - sparse_categorical_accuracy: 0.3301
8/100 β[37mβββββββββββββββββββ 1:35 1s/step - loss: 36.2639 - sparse_categorical_accuracy: 0.3284
9/100 β[37mβββββββββββββββββββ 1:33 1s/step - loss: 36.2697 - sparse_categorical_accuracy: 0.3282
10/100 ββ[37mββββββββββββββββββ 1:33 1s/step - loss: 36.2697 - sparse_categorical_accuracy: 0.3304
11/100 ββ[37mββββββββββββββββββ 1:32 1s/step - loss: 36.2697 - sparse_categorical_accuracy: 0.3316
12/100 ββ[37mββββββββββββββββββ 1:30 1s/step - loss: 36.2714 - sparse_categorical_accuracy: 0.3319
13/100 ββ[37mββββββββββββββββββ 1:29 1s/step - loss: 36.2731 - sparse_categorical_accuracy: 0.3319
14/100 ββ[37mββββββββββββββββββ 1:28 1s/step - loss: 36.2716 - sparse_categorical_accuracy: 0.3325
15/100 βββ[37mβββββββββββββββββ 1:27 1s/step - loss: 36.2714 - sparse_categorical_accuracy: 0.3327
16/100 βββ[37mβββββββββββββββββ 1:26 1s/step - loss: 36.2703 - sparse_categorical_accuracy: 0.3325
17/100 βββ[37mβββββββββββββββββ 1:25 1s/step - loss: 36.2685 - sparse_categorical_accuracy: 0.3322
18/100 βββ[37mβββββββββββββββββ 1:24 1s/step - loss: 36.2665 - sparse_categorical_accuracy: 0.3322
19/100 βββ[37mβββββββββββββββββ 1:23 1s/step - loss: 36.2672 - sparse_categorical_accuracy: 0.3320
20/100 ββββ[37mββββββββββββββββ 1:22 1s/step - loss: 36.2689 - sparse_categorical_accuracy: 0.3316
21/100 ββββ[37mββββββββββββββββ 1:22 1s/step - loss: 36.2700 - sparse_categorical_accuracy: 0.3311
22/100 ββββ[37mββββββββββββββββ 1:21 1s/step - loss: 36.2712 - sparse_categorical_accuracy: 0.3307
23/100 ββββ[37mββββββββββββββββ 1:20 1s/step - loss: 36.2732 - sparse_categorical_accuracy: 0.3301
24/100 ββββ[37mββββββββββββββββ 1:19 1s/step - loss: 36.2753 - sparse_categorical_accuracy: 0.3293
25/100 βββββ[37mβββββββββββββββ 1:18 1s/step - loss: 36.2772 - sparse_categorical_accuracy: 0.3284
26/100 βββββ[37mβββββββββββββββ 1:16 1s/step - loss: 36.2789 - sparse_categorical_accuracy: 0.3275
27/100 βββββ[37mβββββββββββββββ 1:15 1s/step - loss: 36.2803 - sparse_categorical_accuracy: 0.3266
28/100 βββββ[37mβββββββββββββββ 1:14 1s/step - loss: 36.2832 - sparse_categorical_accuracy: 0.3258
29/100 βββββ[37mβββββββββββββββ 1:13 1s/step - loss: 36.2886 - sparse_categorical_accuracy: 0.3251
30/100 ββββββ[37mββββββββββββββ 1:12 1s/step - loss: 36.2944 - sparse_categorical_accuracy: 0.3245
31/100 ββββββ[37mββββββββββββββ 1:11 1s/step - loss: 36.3001 - sparse_categorical_accuracy: 0.3237
32/100 ββββββ[37mββββββββββββββ 1:10 1s/step - loss: 36.3053 - sparse_categorical_accuracy: 0.3231
33/100 ββββββ[37mββββββββββββββ 1:09 1s/step - loss: 36.3102 - sparse_categorical_accuracy: 0.3226
34/100 ββββββ[37mββββββββββββββ 1:08 1s/step - loss: 36.3150 - sparse_categorical_accuracy: 0.3221
35/100 βββββββ[37mβββββββββββββ 1:07 1s/step - loss: 36.3196 - sparse_categorical_accuracy: 0.3216
36/100 βββββββ[37mβββββββββββββ 1:06 1s/step - loss: 36.3239 - sparse_categorical_accuracy: 0.3212
37/100 βββββββ[37mβββββββββββββ 1:05 1s/step - loss: 36.3281 - sparse_categorical_accuracy: 0.3209
38/100 βββββββ[37mβββββββββββββ 1:04 1s/step - loss: 36.3322 - sparse_categorical_accuracy: 0.3204
39/100 βββββββ[37mβββββββββββββ 1:03 1s/step - loss: 36.3358 - sparse_categorical_accuracy: 0.3201
40/100 ββββββββ[37mββββββββββββ 1:02 1s/step - loss: 36.3392 - sparse_categorical_accuracy: 0.3199
41/100 ββββββββ[37mββββββββββββ 1:01 1s/step - loss: 36.3423 - sparse_categorical_accuracy: 0.3196
42/100 ββββββββ[37mββββββββββββ 1:00 1s/step - loss: 36.3453 - sparse_categorical_accuracy: 0.3195
43/100 ββββββββ[37mββββββββββββ 58s 1s/step - loss: 36.3482 - sparse_categorical_accuracy: 0.3193
44/100 ββββββββ[37mββββββββββββ 57s 1s/step - loss: 36.3509 - sparse_categorical_accuracy: 0.3193
45/100 βββββββββ[37mβββββββββββ 56s 1s/step - loss: 36.3534 - sparse_categorical_accuracy: 0.3192
46/100 βββββββββ[37mβββββββββββ 55s 1s/step - loss: 36.3557 - sparse_categorical_accuracy: 0.3191
 100/100 ━━━━━━━━━━━━━━━━━━━━ 142s 1s/step - loss: 36.4148 - sparse_categorical_accuracy: 0.3150 - val_loss: 14661139300352.0000 - val_sparse_categorical_accuracy: 0.2240
Epoch 4/20
  87/100 ━━━━━━━━━━━━━━━━━ 13s 1s/step - loss: 36.7735 - sparse_categorical_accuracy: 0.3315