
Sequence to sequence learning for performing number addition

Author: Smerity and others
Date created: 2015/08/17
Last modified: 2020/04/17
Description: A model that learns to add strings of numbers, e.g. "535+61" -> "596".



Introduction

In this example, we train a model to learn to add two numbers, provided as strings.

Example:

  • Input: "535+61"
  • Output: "596"

Input may optionally be reversed, which was shown to increase performance in many tasks in: Learning to Execute and Sequence to Sequence Learning with Neural Networks.

Theoretically, sequence order inversion introduces shorter-term dependencies between source and target for this problem.
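
To make the reversal concrete, here is a minimal sketch (using the same padding scheme as the code below) showing what a reversed, space-padded query looks like:

DIGITS = 3
MAXLEN = DIGITS + 1 + DIGITS  # 'int+int', e.g. '345+678'

q = "12+345"
query = q + " " * (MAXLEN - len(q))  # pad with spaces: '12+345 '
print(repr(query[::-1]))             # reversed query:  ' 543+21'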

Results:

For two digits (reversed):

  • One-layer LSTM (128 hidden units), 5k training examples = 99% train/test accuracy in 55 epochs

For three digits (reversed):

  • One-layer LSTM (128 hidden units), 50k training examples = 99% train/test accuracy in 100 epochs

For four digits (reversed):

  • One-layer LSTM (128 hidden units), 400k training examples = 99% train/test accuracy in 20 epochs

For five digits (reversed):

  • One-layer LSTM (128 hidden units), 550k training examples = 99% train/test accuracy in 30 epochs

Setup

import keras
from keras import layers
import numpy as np

# Parameters for the model and dataset.
TRAINING_SIZE = 50000
DIGITS = 3
REVERSE = True

# Maximum length of input is 'int + int' (e.g., '345+678'). Maximum length of
# int is DIGITS.
MAXLEN = DIGITS + 1 + DIGITS

Generate the data

class CharacterTable:
    """Given a set of characters:
    + Encode them to a one-hot integer representation
    + Decode the one-hot or integer representation to their character output
    + Decode a vector of probabilities to their character output
    """

    def __init__(self, chars):
        """Initialize character table.
        # Arguments
            chars: Characters that can appear in the input.
        """
        self.chars = sorted(set(chars))
        self.char_indices = dict((c, i) for i, c in enumerate(self.chars))
        self.indices_char = dict((i, c) for i, c in enumerate(self.chars))

    def encode(self, C, num_rows):
        """One-hot encode given string C.
        # Arguments
            C: string, to be encoded.
            num_rows: Number of rows in the returned one-hot encoding. This is
                used to keep the number of rows the same for every example.
        """
        x = np.zeros((num_rows, len(self.chars)))
        for i, c in enumerate(C):
            x[i, self.char_indices[c]] = 1
        return x

    def decode(self, x, calc_argmax=True):
        """Decode the given vector or 2D array to their character output.
        # Arguments
            x: A vector or a 2D array of probabilities or one-hot representations;
                or a vector of character indices (used with `calc_argmax=False`).
            calc_argmax: Whether to find the character index with maximum
                probability, defaults to `True`.
        """
        if calc_argmax:
            x = x.argmax(axis=-1)
        return "".join(self.indices_char[i] for i in x)


# All the numbers, plus sign and space for padding.
chars = "0123456789+ "
ctable = CharacterTable(chars)

questions = []
expected = []
seen = set()
print("Generating data...")
while len(questions) < TRAINING_SIZE:
    f = lambda: int(
        "".join(
            np.random.choice(list("0123456789"))
            for i in range(np.random.randint(1, DIGITS + 1))
        )
    )
    a, b = f(), f()
    # Skip any addition questions we've already seen
    # Also skip any such that x+Y == Y+x (hence the sorting).
    key = tuple(sorted((a, b)))
    if key in seen:
        continue
    seen.add(key)
    # Pad the data with spaces such that it is always MAXLEN.
    q = "{}+{}".format(a, b)
    query = q + " " * (MAXLEN - len(q))
    ans = str(a + b)
    # Answers can be of maximum size DIGITS + 1.
    ans += " " * (DIGITS + 1 - len(ans))
    if REVERSE:
        # Reverse the query, e.g., '12+345  ' becomes '  543+21'. (Note the
        # space used for padding.)
        query = query[::-1]
    questions.append(query)
    expected.append(ans)
print("Total questions:", len(questions))
Generating data...
Total questions: 50000
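
As a quick illustration of how CharacterTable is used (a usage sketch, not part of the original script), encoding a padded query produces a (MAXLEN, len(chars)) one-hot matrix, and decode recovers the original string:

# Usage sketch: round-trip a padded query through the CharacterTable.
sample = "12+345 "                      # already padded to MAXLEN = 7
onehot = ctable.encode(sample, MAXLEN)  # shape (7, 12): one 1 per row
print(onehot.shape)                     # (7, 12)
print(ctable.decode(onehot))            # '12+345 '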

Vectorize the data

print("Vectorization...")
x = np.zeros((len(questions), MAXLEN, len(chars)), dtype=bool)
y = np.zeros((len(questions), DIGITS + 1, len(chars)), dtype=bool)
for i, sentence in enumerate(questions):
    x[i] = ctable.encode(sentence, MAXLEN)
for i, sentence in enumerate(expected):
    y[i] = ctable.encode(sentence, DIGITS + 1)

# Shuffle (x, y) in unison as the later parts of x will almost all be larger
# digits.
indices = np.arange(len(y))
np.random.shuffle(indices)
x = x[indices]
y = y[indices]

# Explicitly set apart 10% for validation data that we never train over.
split_at = len(x) - len(x) // 10
(x_train, x_val) = x[:split_at], x[split_at:]
(y_train, y_val) = y[:split_at], y[split_at:]

print("Training Data:")
print(x_train.shape)
print(y_train.shape)

print("Validation Data:")
print(x_val.shape)
print(y_val.shape)
Vectorization...
Training Data:
(45000, 7, 12)
(45000, 4, 12)
Validation Data:
(5000, 7, 12)
(5000, 4, 12)
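
As a sanity check (a small sketch that is not in the original example), you can decode one vectorized sample back into text; with REVERSE=True the recovered query is the reversed, space-padded string, so it is flipped back before printing:

# Sanity-check sketch: decode one vectorized example back into strings.
q0 = ctable.decode(x_train[0])  # reversed, padded query when REVERSE=True
a0 = ctable.decode(y_train[0])  # space-padded answer string
print("query :", q0[::-1] if REVERSE else q0)
print("answer:", a0)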

Build the model

print("Build model...")
num_layers = 1  # Try to add more LSTM layers!

model = keras.Sequential()
# "Encode" the input sequence using a LSTM, producing an output of size 128.
# Note: In a situation where your input sequences have a variable length,
# use input_shape=(None, num_feature).
model.add(layers.Input((MAXLEN, len(chars))))
model.add(layers.LSTM(128))
# As the decoder RNN's input, repeatedly provide the last output of the
# encoder for each time step. Repeat 'DIGITS + 1' times, as that's the maximum
# length of the output, e.g., when DIGITS=3, the max output is 999+999=1998.
model.add(layers.RepeatVector(DIGITS + 1))
# The decoder RNN could be multiple layers stacked or a single layer.
for _ in range(num_layers):
    # By setting return_sequences to True, return not only the last output but
    # all the outputs so far in the form of (num_samples, timesteps,
    # output_dim). This is necessary as TimeDistributed in the below expects
    # the first dimension to be the timesteps.
    model.add(layers.LSTM(128, return_sequences=True))

# Apply a dense layer to every temporal slice of the input. For each step
# of the output sequence, decide which character should be chosen.
model.add(layers.Dense(len(chars), activation="softmax"))
model.compile(loss="categorical_crossentropy", optimizer="adam", metrics=["accuracy"])
model.summary()
Build model...
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape              ┃    Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━┩
│ lstm (LSTM)                     │ (None, 128)               │     72,192 │
├─────────────────────────────────┼───────────────────────────┼────────────┤
│ repeat_vector (RepeatVector)    │ (None, 4, 128)            │          0 │
├─────────────────────────────────┼───────────────────────────┼────────────┤
│ lstm_1 (LSTM)                   │ (None, 4, 128)            │    131,584 │
├─────────────────────────────────┼───────────────────────────┼────────────┤
│ dense (Dense)                   │ (None, 4, 12)             │      1,548 │
└─────────────────────────────────┴───────────────────────────┴────────────┘
 Total params: 205,324 (802.05 KB)
 Trainable params: 205,324 (802.05 KB)
 Non-trainable params: 0 (0.00 B)
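
The same encoder-decoder can also be expressed with the Keras functional API. The sketch below is an equivalent alternative (not the code used in this example), which can be handier if you later want to manipulate the encoder output directly:

# Functional-API sketch of the same architecture (alternative formulation).
inputs = keras.Input(shape=(MAXLEN, len(chars)))
encoded = layers.LSTM(128)(inputs)                   # encoder summary vector
repeated = layers.RepeatVector(DIGITS + 1)(encoded)  # one copy per output step
decoded = layers.LSTM(128, return_sequences=True)(repeated)
outputs = layers.Dense(len(chars), activation="softmax")(decoded)
functional_model = keras.Model(inputs, outputs)
functional_model.compile(
    loss="categorical_crossentropy", optimizer="adam", metrics=["accuracy"]
)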

Train the model

epochs = 30
batch_size = 32


# Train the model each generation and show predictions against the validation
# dataset.
for epoch in range(1, epochs):
    print()
    print("Iteration", epoch)
    model.fit(
        x_train,
        y_train,
        batch_size=batch_size,
        epochs=1,
        validation_data=(x_val, y_val),
    )
    # Select 10 samples from the validation set at random so we can visualize
    # errors.
    for i in range(10):
        ind = np.random.randint(0, len(x_val))
        rowx, rowy = x_val[np.array([ind])], y_val[np.array([ind])]
        preds = np.argmax(model.predict(rowx), axis=-1)
        q = ctable.decode(rowx[0])
        correct = ctable.decode(rowy[0])
        guess = ctable.decode(preds[0], calc_argmax=False)
        print("Q", q[::-1] if REVERSE else q, end=" ")
        print("T", correct, end=" ")
        if correct == guess:
            print("☑ " + guess)
        else:
            print("☒ " + guess)
Iteration 1
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 8s 4ms/step - accuracy: 0.3224 - loss: 1.8852 - val_accuracy: 0.4116 - val_loss: 1.5662
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 386ms/step
Q 986+63  T 1049 โ˜’ 903 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 905+57  T 962  โ˜’ 901 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 942us/step
Q 649+29  T 678  โ˜’ 606 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 924us/step
Q 53+870  T 923  โ˜’ 881 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 444+283 T 727  โ˜’ 513 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 29+601  T 630  โ˜’ 201 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 44+618  T 662  โ˜’ 571 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 941us/step
Q 73+989  T 1062 โ˜’ 906 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 858us/step
Q 108+928 T 1036 โ˜’ 103 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 206+61  T 267  โ˜’ 276 
Iteration 2
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.4672 - loss: 1.4301 - val_accuracy: 0.5708 - val_loss: 1.1566
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 34+611  T 645  โ˜’ 651 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 41+657  T 698  โ˜’ 619 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 28+461  T 489  โ˜’ 591 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 6+114   T 120  โ˜’ 121 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 80+42   T 122  โ˜’ 131 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 123+47  T 170  โ˜’ 175 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 563+540 T 1103 โ˜’ 1019
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 8+937   T 945  โ˜’ 960 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 3+568   T 571  โ˜’ 570 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 32+771  T 803  โ˜’ 819 
Iteration 3
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.5986 - loss: 1.0737 - val_accuracy: 0.6534 - val_loss: 0.9349
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 976us/step
Q 521+5   T 526  โ˜’ 524 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 4+250   T 254  โ˜’ 256 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 927us/step
Q 467+74  T 541  โ˜’ 542 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 584+5   T 589  โ˜’ 580 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 434+99  T 533  โ˜’ 526 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 549+8   T 557  โ˜’ 552 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 610+870 T 1480 โ˜’ 1376
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 828+969 T 1797 โ˜’ 1710
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 968+0   T 968  โ˜’ 969 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 629+654 T 1283 โ˜’ 1275
Iteration 4
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.6779 - loss: 0.8780 - val_accuracy: 0.7011 - val_loss: 0.7973
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 12+647  T 659  โ˜’ 657 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 889us/step
Q 82+769  T 851  โ˜’ 857 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 975us/step
Q 79+412  T 491  โ˜‘ 491 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 973us/step
Q 712+31  T 743  โ˜’ 745 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 37+37   T 74   โ˜’ 73  
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 576+28  T 604  โ˜’ 607 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 916us/step
Q 39+102  T 141  โ˜’ 140 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 922us/step
Q 649+472 T 1121 โ˜’ 1111
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 563+540 T 1103 โ˜’ 1100
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 86+391  T 477  โ˜‘ 477 
Iteration 5
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.7195 - loss: 0.7628 - val_accuracy: 0.7436 - val_loss: 0.6989
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 816+39  T 855  โ˜’ 859 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 27+99   T 126  โ˜’ 123 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 975us/step
Q 871+98  T 969  โ˜’ 965 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 966us/step
Q 394+10  T 404  โ˜’ 409 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 969us/step
Q 63+63   T 126  โ˜’ 129 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 78+334  T 412  โ˜’ 419 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 957us/step
Q 112+4   T 116  โ˜‘ 116 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 911us/step
Q 990+37  T 1027 โ˜’ 1029
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 882us/step
Q 75+63   T 138  โ˜’ 139 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 871us/step
Q 38+481  T 519  โ˜‘ 519 
Iteration 6
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.7527 - loss: 0.6754 - val_accuracy: 0.7705 - val_loss: 0.6260
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 475+21  T 496  โ˜’ 497 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 324+76  T 400  โ˜‘ 400 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 479+385 T 864  โ˜’ 867 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 892us/step
Q 36+213  T 249  โ˜’ 247 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 671+259 T 930  โ˜’ 934 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 3+10    T 13   โ˜’ 20  
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 86+319  T 405  โ˜‘ 405 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 63+63   T 126  โ˜’ 127 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 459+833 T 1292 โ˜’ 1390
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 991us/step
Q 17+465  T 482  โ˜’ 489 
Iteration 7
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.7927 - loss: 0.5712 - val_accuracy: 0.8573 - val_loss: 0.3966
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 965us/step
Q 43+945  T 988  โ˜‘ 988 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 821us/step
Q 371+96  T 467  โ˜’ 468 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 2ms/step  
Q 873+894 T 1767 โ˜’ 1768
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 912us/step
Q 117+82  T 199  โ˜‘ 199 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 978us/step
Q 25+95   T 120  โ˜’ 110 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 958us/step
Q 26+99   T 125  โ˜‘ 125 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 829us/step
Q 29+6    T 35   โ˜’ 34  
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 977us/step
Q 857+81  T 938  โ˜’ 939 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 972us/step
Q 668+97  T 765  โ˜‘ 765 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 916us/step
Q 85+903  T 988  โ˜‘ 988 
Iteration 8
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.8902 - loss: 0.3306 - val_accuracy: 0.9228 - val_loss: 0.2472
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 21+478  T 499  โ˜‘ 499 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 978us/step
Q 795+16  T 811  โ˜‘ 811 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 97+616  T 713  โ˜‘ 713 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 889+342 T 1231 โ˜’ 1221
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 266+274 T 540  โ˜’ 530 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 975us/step
Q 751+830 T 1581 โ˜‘ 1581
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 929us/step
Q 674+3   T 677  โ˜‘ 677 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 902+167 T 1069 โ˜’ 1068
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 505+1   T 506  โ˜‘ 506 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 955us/step
Q 944+775 T 1719 โ˜‘ 1719
Iteration 9
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9536 - loss: 0.1820 - val_accuracy: 0.9665 - val_loss: 0.1333
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 679+61  T 740  โ˜‘ 740 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 921+49  T 970  โ˜‘ 970 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 863+16  T 879  โ˜‘ 879 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 27+560  T 587  โ˜‘ 587 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 34+941  T 975  โ˜‘ 975 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 7+278   T 285  โ˜‘ 285 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 2ms/step  
Q 165+43  T 208  โ˜‘ 208 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 2ms/step
Q 695+44  T 739  โ˜‘ 739 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 25+165  T 190  โ˜‘ 190 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 2ms/step  
Q 34+184  T 218  โ˜‘ 218 
Iteration 10
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9739 - loss: 0.1127 - val_accuracy: 0.9774 - val_loss: 0.0889
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 79+85   T 164  โ˜‘ 164 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 816+353 T 1169 โ˜‘ 1169
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 950us/step
Q 405+16  T 421  โ˜‘ 421 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 1+615   T 616  โ˜‘ 616 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 954+996 T 1950 โ˜‘ 1950
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 62+254  T 316  โ˜‘ 316 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 31+196  T 227  โ˜‘ 227 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 885+88  T 973  โ˜‘ 973 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 975us/step
Q 586+74  T 660  โ˜‘ 660 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 299+94  T 393  โ˜‘ 393 
Iteration 11
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9863 - loss: 0.0675 - val_accuracy: 0.9807 - val_loss: 0.0721
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 844+638 T 1482 โ˜‘ 1482
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 833+98  T 931  โ˜‘ 931 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 85+68   T 153  โ˜‘ 153 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 201+18  T 219  โ˜‘ 219 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 7+386   T 393  โ˜‘ 393 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 854+66  T 920  โ˜‘ 920 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 80+624  T 704  โ˜’ 705 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 524+721 T 1245 โ˜‘ 1245
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 311+86  T 397  โ˜‘ 397 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 746+67  T 813  โ˜‘ 813 
Iteration 12
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9832 - loss: 0.0671 - val_accuracy: 0.9842 - val_loss: 0.0557
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 10+577  T 587  โ˜‘ 587 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 960us/step
Q 257+3   T 260  โ˜‘ 260 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 938us/step
Q 83+53   T 136  โ˜‘ 136 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 977us/step
Q 17+898  T 915  โ˜‘ 915 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 12+6    T 18   โ˜‘ 18  
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 761+54  T 815  โ˜‘ 815 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 935us/step
Q 813+742 T 1555 โ˜‘ 1555
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 3+10    T 13   โ˜‘ 13  
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 22+49   T 71   โ˜‘ 71  
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 81+618  T 699  โ˜‘ 699 
Iteration 13
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9888 - loss: 0.0459 - val_accuracy: 0.9810 - val_loss: 0.0623
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 991+45  T 1036 โ˜‘ 1036
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 952us/step
Q 683+1   T 684  โ˜‘ 684 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 994us/step
Q 49+70   T 119  โ˜‘ 119 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 1+500   T 501  โ˜‘ 501 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 969us/step
Q 51+444  T 495  โ˜‘ 495 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 60+47   T 107  โ˜‘ 107 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 976us/step
Q 76+921  T 997  โ˜‘ 997 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 83+732  T 815  โ˜‘ 815 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 936+22  T 958  โ˜‘ 958 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 975us/step
Q 790+770 T 1560 โ˜‘ 1560
Iteration 14
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9827 - loss: 0.0592 - val_accuracy: 0.9970 - val_loss: 0.0188
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 2+715   T 717  โ˜‘ 717 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 928us/step
Q 767+7   T 774  โ˜‘ 774 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 967+27  T 994  โ˜‘ 994 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 976+23  T 999  โ˜‘ 999 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 695+77  T 772  โ˜‘ 772 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 989us/step
Q 7+963   T 970  โ˜‘ 970 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 949us/step
Q 91+461  T 552  โ˜‘ 552 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 901us/step
Q 41+657  T 698  โ˜‘ 698 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 948us/step
Q 796+14  T 810  โ˜‘ 810 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 956us/step
Q 321+11  T 332  โ˜‘ 332 
Iteration 15
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9970 - loss: 0.0177 - val_accuracy: 0.9902 - val_loss: 0.0339
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 50+859  T 909  โ˜‘ 909 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 733+351 T 1084 โ˜‘ 1084
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 373+25  T 398  โ˜‘ 398 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 771+1   T 772  โ˜‘ 772 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 144+799 T 943  โ˜‘ 943 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 994us/step
Q 7+897   T 904  โ˜‘ 904 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 58+50   T 108  โ˜‘ 108 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 986us/step
Q 731+12  T 743  โ˜‘ 743 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 641+58  T 699  โ˜‘ 699 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 978us/step
Q 577+97  T 674  โ˜‘ 674 
Iteration 16
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9945 - loss: 0.0238 - val_accuracy: 0.9921 - val_loss: 0.0332
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 987us/step
Q 37+501  T 538  โ˜‘ 538 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 989us/step
Q 188+44  T 232  โ˜‘ 232 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 935us/step
Q 2+292   T 294  โ˜‘ 294 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 938us/step
Q 620+206 T 826  โ˜‘ 826 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 882us/step
Q 417+20  T 437  โ˜‘ 437 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 914us/step
Q 59+590  T 649  โ˜‘ 649 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 885us/step
Q 486+38  T 524  โ˜‘ 524 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 934us/step
Q 521+307 T 828  โ˜‘ 828 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 963us/step
Q 777+825 T 1602 โ˜’ 1502
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 922us/step
Q 9+285   T 294  โ˜‘ 294 
Iteration 17
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9965 - loss: 0.0171 - val_accuracy: 0.9711 - val_loss: 0.0850
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 899+99  T 998  โ˜‘ 998 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 4+516   T 520  โ˜‘ 520 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 568+45  T 613  โ˜‘ 613 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 742+339 T 1081 โ˜‘ 1081
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 124+655 T 779  โ˜‘ 779 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 958us/step
Q 7+640   T 647  โ˜‘ 647 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 958us/step
Q 77+922  T 999  โ˜‘ 999 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 888us/step
Q 478+54  T 532  โ˜‘ 532 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 836us/step
Q 62+260  T 322  โ˜‘ 322 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 997us/step
Q 344+426 T 770  โ˜‘ 770 
Iteration 18
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9867 - loss: 0.0433 - val_accuracy: 0.9565 - val_loss: 0.1465
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 918+4   T 922  โ˜‘ 922 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 5+657   T 662  โ˜’ 672 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 988us/step
Q 76+40   T 116  โ˜’ 117 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 999us/step
Q 704+807 T 1511 โ˜’ 1512
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 21+326  T 347  โ˜’ 348 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 6+859   T 865  โ˜‘ 865 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 980us/step
Q 533+804 T 1337 โ˜’ 1327
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 998us/step
Q 70+495  T 565  โ˜’ 566 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 919us/step
Q 50+477  T 527  โ˜‘ 527 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 463+33  T 496  โ˜‘ 496 
Iteration 19
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9879 - loss: 0.0406 - val_accuracy: 0.9965 - val_loss: 0.0162
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 717+39  T 756  โ˜‘ 756 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 45+518  T 563  โ˜‘ 563 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 461+5   T 466  โ˜‘ 466 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 12+6    T 18   โ˜‘ 18  
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 584+5   T 589  โ˜‘ 589 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 154+133 T 287  โ˜‘ 287 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 242+25  T 267  โ˜‘ 267 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 937us/step
Q 36+824  T 860  โ˜‘ 860 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 901us/step
Q 894+339 T 1233 โ˜‘ 1233
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 983us/step
Q 820+625 T 1445 โ˜‘ 1445
Iteration 20
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9973 - loss: 0.0128 - val_accuracy: 0.9791 - val_loss: 0.0587
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 587+606 T 1193 โ˜‘ 1193
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 742+67  T 809  โ˜‘ 809 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 92+84   T 176  โ˜‘ 176 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 681+695 T 1376 โ˜‘ 1376
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 936+0   T 936  โ˜‘ 936 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 983+0   T 983  โ˜‘ 983 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 544+95  T 639  โ˜‘ 639 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 659+411 T 1070 โ˜‘ 1070
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 876+63  T 939  โ˜‘ 939 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 856us/step
Q 206+82  T 288  โ˜‘ 288 
Iteration 21
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9909 - loss: 0.0293 - val_accuracy: 0.9982 - val_loss: 0.0087
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 459+4   T 463  โ˜‘ 463 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 85+903  T 988  โ˜‘ 988 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 484+17  T 501  โ˜‘ 501 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 136+412 T 548  โ˜‘ 548 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 0+761   T 761  โ˜‘ 761 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 41+945  T 986  โ˜‘ 986 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 952us/step
Q 450+517 T 967  โ˜‘ 967 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 162+15  T 177  โ˜‘ 177 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 365+807 T 1172 โ˜‘ 1172
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 34+499  T 533  โ˜‘ 533 
Iteration 22
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9959 - loss: 0.0158 - val_accuracy: 0.9953 - val_loss: 0.0197
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 820+51  T 871  โ˜‘ 871 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 904us/step
Q 3+228   T 231  โ˜‘ 231 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 87+634  T 721  โ˜‘ 721 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 2+715   T 717  โ˜‘ 717 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 76+885  T 961  โ˜‘ 961 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 917us/step
Q 92+896  T 988  โ˜‘ 988 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 434+417 T 851  โ˜‘ 851 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 91+346  T 437  โ˜‘ 437 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 174+697 T 871  โ˜‘ 871 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 44+506  T 550  โ˜‘ 550 
Iteration 23
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9968 - loss: 0.0136 - val_accuracy: 0.9984 - val_loss: 0.0085
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 4+966   T 970  โ˜‘ 970 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 10+53   T 63   โ˜‘ 63  
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 701+841 T 1542 โ˜‘ 1542
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 10+45   T 55   โ˜‘ 55  
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 294+600 T 894  โ˜‘ 894 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 9+182   T 191  โ˜‘ 191 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 722+522 T 1244 โ˜‘ 1244
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 811+38  T 849  โ˜‘ 849 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 531+69  T 600  โ˜‘ 600 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 978us/step
Q 17+59   T 76   โ˜‘ 76  
Iteration 24
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9981 - loss: 0.0084 - val_accuracy: 0.9952 - val_loss: 0.0175
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 6+668   T 674  โ˜‘ 674 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 994us/step
Q 198+295 T 493  โ˜‘ 493 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 988us/step
Q 89+828  T 917  โ˜‘ 917 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 915us/step
Q 286+907 T 1193 โ˜‘ 1193
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 146+16  T 162  โ˜‘ 162 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 81+267  T 348  โ˜‘ 348 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 907us/step
Q 95+921  T 1016 โ˜‘ 1016
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 914us/step
Q 6+475   T 481  โ˜‘ 481 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 85+521  T 606  โ˜‘ 606 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 976us/step
Q 597+819 T 1416 โ˜‘ 1416
Iteration 25
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9977 - loss: 0.0101 - val_accuracy: 0.9752 - val_loss: 0.0939
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 998us/step
Q 84+194  T 278  โ˜‘ 278 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 6+543   T 549  โ˜‘ 549 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 962us/step
Q 455+99  T 554  โ˜‘ 554 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 74+232  T 306  โ˜‘ 306 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 958us/step
Q 27+48   T 75   โ˜‘ 75  
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 184+435 T 619  โ˜‘ 619 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 959us/step
Q 257+674 T 931  โ˜’ 1031
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 978us/step
Q 887+7   T 894  โ˜‘ 894 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 957us/step
Q 43+0    T 43   โ˜‘ 43  
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 829us/step
Q 629+542 T 1171 โ˜‘ 1171
Iteration 26
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9949 - loss: 0.0188 - val_accuracy: 0.9983 - val_loss: 0.0081
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 964+524 T 1488 โ˜‘ 1488
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 991us/step
Q 556+47  T 603  โ˜‘ 603 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 433+56  T 489  โ˜‘ 489 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 982+11  T 993  โ˜‘ 993 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 887+39  T 926  โ˜‘ 926 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 656+57  T 713  โ˜‘ 713 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 843+186 T 1029 โ˜‘ 1029
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 367+596 T 963  โ˜‘ 963 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 921us/step
Q 40+133  T 173  โ˜‘ 173 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 407+80  T 487  โ˜‘ 487 
Iteration 27
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9966 - loss: 0.0126 - val_accuracy: 0.9985 - val_loss: 0.0076
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 985us/step
Q 462+0   T 462  โ˜‘ 462 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 13+861  T 874  โ˜‘ 874 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 919us/step
Q 122+439 T 561  โ˜‘ 561 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 85+420  T 505  โ˜‘ 505 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 938us/step
Q 371+69  T 440  โ˜‘ 440 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 933us/step
Q 11+150  T 161  โ˜‘ 161 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 930us/step
Q 694+26  T 720  โ˜‘ 720 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 932us/step
Q 422+485 T 907  โ˜‘ 907 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 954us/step
Q 146+130 T 276  โ˜‘ 276 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 921us/step
Q 103+19  T 122  โ˜‘ 122 
Iteration 28
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9963 - loss: 0.0134 - val_accuracy: 0.9754 - val_loss: 0.0840
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 923+68  T 991  โ˜‘ 991 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 398+74  T 472  โ˜‘ 472 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 319+805 T 1124 โ˜‘ 1124
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 936+10  T 946  โ˜‘ 946 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 933+721 T 1654 โ˜‘ 1654
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 89+965  T 1054 โ˜‘ 1054
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 37+469  T 506  โ˜’ 516 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 84+194  T 278  โ˜‘ 278 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 973us/step
Q 965+5   T 970  โ˜‘ 970 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 120+480 T 600  โ˜‘ 600 
Iteration 29
 1407/1407 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 3s 2ms/step - accuracy: 0.9942 - loss: 0.0236 - val_accuracy: 0.9883 - val_loss: 0.0342
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 626+584 T 1210 โ˜‘ 1210
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 952us/step
Q 501+615 T 1116 โ˜‘ 1116
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 1+827   T 828  โ˜‘ 828 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 527+651 T 1178 โ˜‘ 1178
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 977us/step
Q 53+44   T 97   โ˜‘ 97  
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 899us/step
Q 79+474  T 553  โ˜‘ 553 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 941us/step
Q 34+949  T 983  โ˜‘ 983 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 954us/step
Q 66+807  T 873  โ˜‘ 873 
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 827us/step
Q 49+28   T 77   โ˜‘ 77  
 1/1 โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ” 0s 1ms/step  
Q 628+93  T 721  โ˜‘ 721 

You'll get to 99+% validation accuracy after ~30 epochs.
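
Once training is done, the model can be queried on new sums outside the training loop. Below is a minimal inference sketch that reuses the objects defined above (model, ctable, MAXLEN, REVERSE); the helper name add_with_model is made up for illustration:

# Inference sketch for the trained model.
def add_with_model(a, b):
    q = "{}+{}".format(a, b)
    query = q + " " * (MAXLEN - len(q))
    if REVERSE:
        query = query[::-1]
    x_new = np.expand_dims(ctable.encode(query, MAXLEN), axis=0)
    pred = np.argmax(model.predict(x_new, verbose=0), axis=-1)
    return ctable.decode(pred[0], calc_argmax=False).strip()

print(add_with_model(535, 61))  # a well-trained model should print '596'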

A trained model and an interactive demo for this example are available on HuggingFace.