Forecasting Air Passenger Traffic#

A Comparative Study on the International Airline Passengers Dataset

Introduction#

In this Jupyter Notebook, we explore time series forecasting using the “International Airline Passengers” dataset. This dataset is a classic example of a time series with a distinct trend and strong seasonal variation, making it ideal for benchmarking different forecasting models.

Objective#

Our primary objective is to assess the performance of various time series forecasting models in predicting future trends in passenger traffic. Through this comparison, we aim to discern the capabilities and limitations of each model, particularly in terms of handling complex patterns inherent in time series data.

Models in Comparison#

We will evaluate five distinct models:

  • ARIMA (AutoRegressive Integrated Moving Average): A classic statistical model renowned for capturing linear trends in time series data.

  • LSTM (Long Short-Term Memory): A type of recurrent neural network that excels in recognizing long-term dependencies, ideal for non-linear data sequences.

  • LSTM with Attention Mechanism: Enhancing the standard LSTM’s capabilities by incorporating an attention mechanism for better context understanding.

  • CNN (Convolutional Neural Network): Typically used in image processing, but also effective in identifying local patterns in sequential data.

  • Transformer: A recent innovation that employs attention mechanisms to focus on different parts of input data, proving effective in sequence-to-sequence modeling.
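
To make the attention idea behind the last two models concrete, the sketch below shows a generic Keras self-attention block (an illustrative example only, not the exact architectures built later in this notebook):

from tensorflow.keras import layers

def toy_self_attention_block(x, num_heads=2, key_dim=16):
    # Self-attention: every time step in the input window attends to every other time step.
    attn_out = layers.MultiHeadAttention(num_heads=num_heads, key_dim=key_dim)(x, x)
    # Residual connection followed by layer normalization, as in a Transformer encoder block.
    return layers.LayerNormalization()(x + attn_out)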

Approach#

In our study, we employ two distinct prediction methodologies to evaluate the performance of our models. These methodologies help us understand how well each model can forecast future values under different scenarios.

  • Non-Rolling Prediction: The trained model forecasts every test time step in a single pass, using the actual observed values as the input window for each step. It is a straightforward method for evaluating a model when the true preceding observations are available for every prediction.

  • Rolling Prediction: In contrast to Non-Rolling Prediction, each future time step is predicted one at a time, and every prediction is fed back as input for the next one. This mimics a real-world deployment, where the model has no access to future observations and errors can compound across steps. It is particularly useful for judging how well a model holds up in a continuously updating environment (a minimal sketch of both strategies follows this list).
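
To make the two strategies concrete, here is a minimal sketch, assuming a generic one-step-ahead Keras model named model and a scaled input window (illustrative only, not the exact code used later in this notebook):

import numpy as np

def non_rolling_predict(model, X_test):
    # Every forecast uses the true observed window taken from the test set.
    return model.predict(X_test)

def rolling_predict(model, last_window, n_steps):
    # Each forecast is appended to the window and reused as input for the next step.
    window = last_window.copy()  # shape: (look_back, 1)
    preds = []
    for _ in range(n_steps):
        yhat = model.predict(window[np.newaxis, ...])[0, 0]
        preds.append(yhat)
        window = np.vstack([window[1:], [[yhat]]])
    return np.array(preds)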

Importing Libraries and Configuration#

import pandas as pd
import numpy as np
import random
import os
from statsmodels.tsa.arima.model import ARIMA
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM, Dropout, LayerNormalization, Conv1D, MaxPooling1D, Flatten, MultiHeadAttention, Input, GlobalAveragePooling1D, Concatenate, Attention
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.regularizers import l2
from tensorflow.keras.callbacks import ReduceLROnPlateau, EarlyStopping
from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics import mean_squared_error
import matplotlib.pyplot as plt
%matplotlib inline
from matplotlib.pylab import rcParams
2025-04-08 14:10:12.877212: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.

Settings#

rcParams["figure.figsize"] = 15, 6
rcParams["axes.titlesize"] = "xx-large"
rcParams["axes.titleweight"] = "bold"
rcParams["legend.loc"] = "upper left"


# Setting random seeds for reproducibility
seed_value = 42
os.environ["PYTHONHASHSEED"] = str(seed_value)
random.seed(seed_value)
np.random.seed(seed_value)
tf.random.set_seed(seed_value)

# L2 regularization strength shared by the neural network layers defined below
l2_reg = 0.001

Loading and Handling Time Series in Pandas#

data = pd.read_csv(
    "https://raw.githubusercontent.com/GVSU-CIS635/Datasets/master/airline-passengers.csv",
    index_col="Month"
)
data.index = pd.to_datetime(data.index, format="%Y-%m")
data.head()
            Passengers
Month
1949-01-01         112
1949-02-01         118
1949-03-01         132
1949-04-01         129
1949-05-01         121
passengers = data["Passengers"]
plt.title(
    "International airline passengers: monthly totals in thousands. \n Jan 49 – Dec 60. Units: Thousands of passengers"
)
plt.plot(passengers)
[<matplotlib.lines.Line2D at 0x7f3c6090a990>]
[Figure: monthly international airline passenger totals in thousands, Jan 1949 – Dec 1960]

ARIMA#

Splitting Data into Training and Test Sets for ARIMA#

data_train, data_test = (
    data[:-24],
    data[-24:],
)
print(f"Training {len(data_train)}, Test {len(data_test)}")
print(type(data_train))
Training 120, Test 24
<class 'pandas.core.frame.DataFrame'>

ARIMA Model Experimentation#

model = ARIMA(data_train, order=(2, 2, 11))
results_ARIMA = model.fit()
yhat = results_ARIMA.forecast(steps=24)
plt.plot(passengers, label="Actual Data")
plt.plot(yhat, color="red", label="Forecasted Data")

# Calculating the RMSE (Root Mean Squared Error)
rmse = mean_squared_error(data_test["Passengers"], yhat, squared=False)
plt.title(f"RMSE: {rmse:.4f}")
plt.legend()
plt.show()
/opt/conda/lib/python3.11/site-packages/statsmodels/tsa/base/tsa_model.py:473: ValueWarning: No frequency information was provided, so inferred frequency MS will be used.
  self._init_dates(dates, freq)
/opt/conda/lib/python3.11/site-packages/statsmodels/tsa/base/tsa_model.py:473: ValueWarning: No frequency information was provided, so inferred frequency MS will be used.
  self._init_dates(dates, freq)
/opt/conda/lib/python3.11/site-packages/statsmodels/tsa/base/tsa_model.py:473: ValueWarning: No frequency information was provided, so inferred frequency MS will be used.
  self._init_dates(dates, freq)
/opt/conda/lib/python3.11/site-packages/statsmodels/tsa/statespace/sarimax.py:978: UserWarning: Non-invertible starting MA parameters found. Using zeros as starting parameters.
  warn('Non-invertible starting MA parameters found.'
/opt/conda/lib/python3.11/site-packages/statsmodels/base/model.py:607: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warnings.warn("Maximum Likelihood optimization failed to "
[Figure: actual passenger totals vs. ARIMA forecast for the 24-month test period, with the RMSE shown in the title]
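
The repeated ValueWarning above appears because the training DataFrame’s DatetimeIndex carries no explicit frequency, so statsmodels infers monthly-start (“MS”). An optional way to silence it (not part of the original notebook) is to declare the frequency before fitting:

# Declare the monthly-start frequency explicitly so statsmodels does not have to infer it.
data_train = data_train.asfreq("MS")
model = ARIMA(data_train, order=(2, 2, 11))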

Neural Networks#

Data Normalization for Neural Network Experiments#

scaler = MinMaxScaler(feature_range=(0, 1))
scaled_data = scaler.fit_transform(data)
print(scaled_data.shape)
plt.plot(scaled_data)
(144, 1)
[<matplotlib.lines.Line2D at 0x7f3c60663d90>]
[Figure: passenger series after MinMax scaling to the range [0, 1]]

Preparing Dataset for Neural Network Models#

def create_dataset(sequence, look_back=1):
    X, y = [], []
    for i in range(len(sequence) - look_back):
        end_ix = i + look_back
        # gather input and output parts of the pattern
        seq_x, seq_y = sequence[i:end_ix], sequence[end_ix]
        X.append(seq_x)
        y.append(seq_y)
    return np.array(X), np.array(y)
# Using past 12 time steps to predict the next step
look_back = 12

X, y = create_dataset(scaled_data, look_back)
print("(samples, timesteps, features): " + str(X.shape))
(samples, timesteps, features): (132, 12, 1)
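
As a quick sanity check, the first sample pairs the first twelve scaled values with the thirteenth value as its target:

# Inspect the first input window and its corresponding target.
print(X[0].flatten())
print(y[0])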

Splitting Data for Neural Network Models#

X_train, y_train = X[:-36], y[:-36]
X_valid, y_valid = X[-36:-24], y[-36:-24]
X_test, y_test = X[-24:], y[-24:]
print(f"training samples shape: {X_train.shape}")
print(f"validation samples shape: {X_valid.shape}")
print(f"test samples shape: {X_test.shape}")
training samples shape: (96, 12, 1)
validation samples shape: (12, 12, 1)
test samples shape: (24, 12, 1)

LSTM Model Building and Training#

lstm = Sequential()
lstm.add(
    LSTM(200, activation="relu", return_sequences=False, input_shape=(look_back, 1), kernel_regularizer=l2(l2_reg)))
lstm.add(LayerNormalization())
lstm.add(Dropout(0.2))
lstm.add(Dense(50, activation='relu', kernel_regularizer=l2(l2_reg)))
lstm.add(Dense(1, kernel_regularizer=l2(l2_reg)))
def network_training(model, model_name):
    learning_rate = 0.001

    # Configure the optimizer with the specified learning rate
    adam_optimizer = Adam(learning_rate=learning_rate)
    model.compile(loss="mean_squared_error", optimizer=adam_optimizer)

    # Early Stopping Callback
    early_stopping = EarlyStopping(
        monitor="val_loss", patience=20, verbose=1, mode="auto"
    )

    # Reduce learning rate when the validation loss plateaus
    reduce_lr = ReduceLROnPlateau(
        monitor="val_loss", factor=0.1, patience=10, verbose=1, min_lr=0.000000001
    )

    history = model.fit(
        X_train,
        y_train,
        validation_data=(X_valid, y_valid),
        epochs=500,
        verbose=1,
        callbacks=[early_stopping, reduce_lr],
    )

    # Plotting the validation loss
    plt.plot(history.history["val_loss"], label="Validation Loss")
    plt.title(f"{model_name} Validation Loss")
    plt.ylabel("Loss")
    plt.xlabel("Epoch")
    plt.legend()
    plt.show()
network_training(lstm, "LSTM")
Epoch 1/500
1/3 [=========>....................] - ETA: 1s - loss: 0.0937

3/3 [==============================] - 1s 114ms/step - loss: 0.2277 - val_loss: 0.1734 - lr: 0.0010
Epoch 2/500
1/3 [=========>....................] - ETA: 0s - loss: 0.1132

3/3 [==============================] - 0s 26ms/step - loss: 0.1660 - val_loss: 0.2898 - lr: 0.0010
Epoch 3/500
1/3 [=========>....................] - ETA: 0s - loss: 0.1469

3/3 [==============================] - 0s 24ms/step - loss: 0.1245 - val_loss: 0.0973 - lr: 0.0010
Epoch 4/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0921

3/3 [==============================] - 0s 24ms/step - loss: 0.1007 - val_loss: 0.1130 - lr: 0.0010
Epoch 5/500
1/3 [=========>....................] - ETA: 0s - loss: 0.1104

3/3 [==============================] - 0s 24ms/step - loss: 0.1062 - val_loss: 0.0937 - lr: 0.0010
Epoch 6/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0969

3/3 [==============================] - 0s 39ms/step - loss: 0.0932 - val_loss: 0.1339 - lr: 0.0010
Epoch 7/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0935

3/3 [==============================] - 0s 40ms/step - loss: 0.0948 - val_loss: 0.1122 - lr: 0.0010
Epoch 8/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0864

3/3 [==============================] - 0s 29ms/step - loss: 0.0877 - val_loss: 0.0931 - lr: 0.0010
Epoch 9/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0874

3/3 [==============================] - 0s 27ms/step - loss: 0.0893 - val_loss: 0.0910 - lr: 0.0010
Epoch 10/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0941

3/3 [==============================] - 0s 24ms/step - loss: 0.0871 - val_loss: 0.0982 - lr: 0.0010
Epoch 11/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0815

3/3 [==============================] - 0s 24ms/step - loss: 0.0832 - val_loss: 0.1047 - lr: 0.0010
Epoch 12/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0839

3/3 [==============================] - 0s 24ms/step - loss: 0.0848 - val_loss: 0.1036 - lr: 0.0010
Epoch 13/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0806

3/3 [==============================] - 0s 26ms/step - loss: 0.0807 - val_loss: 0.0971 - lr: 0.0010
Epoch 14/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0786

3/3 [==============================] - 0s 26ms/step - loss: 0.0798 - val_loss: 0.0924 - lr: 0.0010
Epoch 15/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0808

3/3 [==============================] - 0s 28ms/step - loss: 0.0795 - val_loss: 0.0888 - lr: 0.0010
Epoch 16/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0799

3/3 [==============================] - 0s 29ms/step - loss: 0.0786 - val_loss: 0.0883 - lr: 0.0010
Epoch 17/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0802

3/3 [==============================] - 0s 25ms/step - loss: 0.0785 - val_loss: 0.0891 - lr: 0.0010
Epoch 18/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0798

3/3 [==============================] - 0s 25ms/step - loss: 0.0779 - val_loss: 0.0927 - lr: 0.0010
Epoch 19/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0735

3/3 [==============================] - 0s 24ms/step - loss: 0.0771 - val_loss: 0.0927 - lr: 0.0010
Epoch 20/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0784

3/3 [==============================] - 0s 24ms/step - loss: 0.0777 - val_loss: 0.0855 - lr: 0.0010
Epoch 21/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0781

3/3 [==============================] - 0s 29ms/step - loss: 0.0760 - val_loss: 0.0832 - lr: 0.0010
Epoch 22/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0745

3/3 [==============================] - 0s 34ms/step - loss: 0.0765 - val_loss: 0.0844 - lr: 0.0010
Epoch 23/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0749

3/3 [==============================] - 0s 35ms/step - loss: 0.0745 - val_loss: 0.0859 - lr: 0.0010
Epoch 24/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0734

3/3 [==============================] - 0s 31ms/step - loss: 0.0743 - val_loss: 0.0839 - lr: 0.0010
Epoch 25/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0777

3/3 [==============================] - 0s 37ms/step - loss: 0.0748 - val_loss: 0.0855 - lr: 0.0010
Epoch 26/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0713

3/3 [==============================] - 0s 28ms/step - loss: 0.0724 - val_loss: 0.0872 - lr: 0.0010
Epoch 27/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0709

3/3 [==============================] - 0s 27ms/step - loss: 0.0702 - val_loss: 0.0855 - lr: 0.0010
Epoch 28/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0702

3/3 [==============================] - 0s 26ms/step - loss: 0.0706 - val_loss: 0.0835 - lr: 0.0010
Epoch 29/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0700

3/3 [==============================] - 0s 27ms/step - loss: 0.0692 - val_loss: 0.0809 - lr: 0.0010
Epoch 30/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0694

3/3 [==============================] - 0s 25ms/step - loss: 0.0698 - val_loss: 0.0804 - lr: 0.0010
Epoch 31/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0704

3/3 [==============================] - 0s 28ms/step - loss: 0.0697 - val_loss: 0.0791 - lr: 0.0010
Epoch 32/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0681

3/3 [==============================] - 0s 34ms/step - loss: 0.0687 - val_loss: 0.0806 - lr: 0.0010
Epoch 33/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0673

3/3 [==============================] - 0s 33ms/step - loss: 0.0673 - val_loss: 0.0808 - lr: 0.0010
Epoch 34/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0677

3/3 [==============================] - 0s 30ms/step - loss: 0.0667 - val_loss: 0.0762 - lr: 0.0010
Epoch 35/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0670

3/3 [==============================] - 0s 28ms/step - loss: 0.0675 - val_loss: 0.0746 - lr: 0.0010
Epoch 36/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0662

3/3 [==============================] - 0s 34ms/step - loss: 0.0667 - val_loss: 0.0749 - lr: 0.0010
Epoch 37/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0667

3/3 [==============================] - 0s 30ms/step - loss: 0.0665 - val_loss: 0.0770 - lr: 0.0010
Epoch 38/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0661

3/3 [==============================] - 0s 28ms/step - loss: 0.0657 - val_loss: 0.0784 - lr: 0.0010
Epoch 39/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0689

3/3 [==============================] - 0s 28ms/step - loss: 0.0673 - val_loss: 0.0768 - lr: 0.0010
Epoch 40/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0672

3/3 [==============================] - 0s 29ms/step - loss: 0.0666 - val_loss: 0.0714 - lr: 0.0010
Epoch 41/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0670

3/3 [==============================] - 0s 32ms/step - loss: 0.0644 - val_loss: 0.0705 - lr: 0.0010
Epoch 42/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0625

3/3 [==============================] - 0s 57ms/step - loss: 0.0637 - val_loss: 0.0728 - lr: 0.0010
Epoch 43/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0628

3/3 [==============================] - ETA: 0s - loss: 0.0638

3/3 [==============================] - 0s 37ms/step - loss: 0.0638 - val_loss: 0.0762 - lr: 0.0010
Epoch 44/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0650

3/3 [==============================] - 0s 38ms/step - loss: 0.0620 - val_loss: 0.0698 - lr: 0.0010
Epoch 45/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0631

3/3 [==============================] - 0s 26ms/step - loss: 0.0616 - val_loss: 0.0695 - lr: 0.0010
Epoch 46/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0608

3/3 [==============================] - 0s 26ms/step - loss: 0.0612 - val_loss: 0.0716 - lr: 0.0010
Epoch 47/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0605

3/3 [==============================] - 0s 24ms/step - loss: 0.0615 - val_loss: 0.0720 - lr: 0.0010
Epoch 48/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0587

3/3 [==============================] - 0s 25ms/step - loss: 0.0600 - val_loss: 0.0692 - lr: 0.0010
Epoch 49/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0601

3/3 [==============================] - 0s 28ms/step - loss: 0.0602 - val_loss: 0.0678 - lr: 0.0010
Epoch 50/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0596

3/3 [==============================] - 0s 29ms/step - loss: 0.0584 - val_loss: 0.0691 - lr: 0.0010
Epoch 51/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0564

3/3 [==============================] - 0s 30ms/step - loss: 0.0587 - val_loss: 0.0679 - lr: 0.0010
Epoch 52/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0578

3/3 [==============================] - 0s 30ms/step - loss: 0.0578 - val_loss: 0.0666 - lr: 0.0010
Epoch 53/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0578

3/3 [==============================] - 0s 27ms/step - loss: 0.0587 - val_loss: 0.0658 - lr: 0.0010
Epoch 54/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0567

3/3 [==============================] - 0s 25ms/step - loss: 0.0571 - val_loss: 0.0672 - lr: 0.0010
Epoch 55/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0553

3/3 [==============================] - 0s 23ms/step - loss: 0.0573 - val_loss: 0.0672 - lr: 0.0010
Epoch 56/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0536

3/3 [==============================] - 0s 25ms/step - loss: 0.0565 - val_loss: 0.0647 - lr: 0.0010
Epoch 57/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0578

3/3 [==============================] - 0s 23ms/step - loss: 0.0552 - val_loss: 0.0639 - lr: 0.0010
Epoch 58/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0560

3/3 [==============================] - 0s 24ms/step - loss: 0.0550 - val_loss: 0.0638 - lr: 0.0010
Epoch 59/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0539

3/3 [==============================] - 0s 24ms/step - loss: 0.0546 - val_loss: 0.0642 - lr: 0.0010
Epoch 60/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0550

3/3 [==============================] - 0s 26ms/step - loss: 0.0540 - val_loss: 0.0620 - lr: 0.0010
Epoch 61/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0531

3/3 [==============================] - 0s 33ms/step - loss: 0.0529 - val_loss: 0.0620 - lr: 0.0010
Epoch 62/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0545

3/3 [==============================] - 0s 27ms/step - loss: 0.0529 - val_loss: 0.0612 - lr: 0.0010
Epoch 63/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0538

3/3 [==============================] - 0s 25ms/step - loss: 0.0528 - val_loss: 0.0599 - lr: 0.0010
Epoch 64/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0523

3/3 [==============================] - 0s 25ms/step - loss: 0.0520 - val_loss: 0.0607 - lr: 0.0010
Epoch 65/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0524

3/3 [==============================] - 0s 40ms/step - loss: 0.0532 - val_loss: 0.0594 - lr: 0.0010
Epoch 66/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0514

3/3 [==============================] - 0s 33ms/step - loss: 0.0520 - val_loss: 0.0593 - lr: 0.0010
Epoch 67/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0508

3/3 [==============================] - 0s 27ms/step - loss: 0.0499 - val_loss: 0.0591 - lr: 0.0010
Epoch 68/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0511

3/3 [==============================] - 0s 28ms/step - loss: 0.0500 - val_loss: 0.0562 - lr: 0.0010
Epoch 69/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0497

3/3 [==============================] - 0s 25ms/step - loss: 0.0497 - val_loss: 0.0549 - lr: 0.0010
Epoch 70/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0512

3/3 [==============================] - 0s 25ms/step - loss: 0.0515 - val_loss: 0.0612 - lr: 0.0010
Epoch 71/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0518

3/3 [==============================] - 0s 23ms/step - loss: 0.0503 - val_loss: 0.0619 - lr: 0.0010
Epoch 72/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0512

3/3 [==============================] - 0s 24ms/step - loss: 0.0494 - val_loss: 0.0537 - lr: 0.0010
Epoch 73/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0510

3/3 [==============================] - 0s 23ms/step - loss: 0.0498 - val_loss: 0.0528 - lr: 0.0010
Epoch 74/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0466

3/3 [==============================] - 0s 24ms/step - loss: 0.0472 - val_loss: 0.0558 - lr: 0.0010
Epoch 75/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0476

3/3 [==============================] - 0s 29ms/step - loss: 0.0482 - val_loss: 0.0536 - lr: 0.0010
Epoch 76/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0455

3/3 [==============================] - 0s 27ms/step - loss: 0.0474 - val_loss: 0.0519 - lr: 0.0010
Epoch 77/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0457

3/3 [==============================] - 0s 30ms/step - loss: 0.0465 - val_loss: 0.0505 - lr: 0.0010
Epoch 78/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0468

3/3 [==============================] - 0s 25ms/step - loss: 0.0462 - val_loss: 0.0554 - lr: 0.0010
Epoch 79/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0475

3/3 [==============================] - 0s 24ms/step - loss: 0.0468 - val_loss: 0.0568 - lr: 0.0010
Epoch 80/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0437

3/3 [==============================] - 0s 23ms/step - loss: 0.0454 - val_loss: 0.0489 - lr: 0.0010
Epoch 81/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0444

3/3 [==============================] - 0s 22ms/step - loss: 0.0462 - val_loss: 0.0484 - lr: 0.0010
Epoch 82/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0445

3/3 [==============================] - 0s 22ms/step - loss: 0.0442 - val_loss: 0.0534 - lr: 0.0010
Epoch 83/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0442

3/3 [==============================] - 0s 23ms/step - loss: 0.0447 - val_loss: 0.0541 - lr: 0.0010
Epoch 84/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0440

3/3 [==============================] - 0s 23ms/step - loss: 0.0438 - val_loss: 0.0487 - lr: 0.0010
Epoch 85/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0435

3/3 [==============================] - 0s 23ms/step - loss: 0.0432 - val_loss: 0.0475 - lr: 0.0010
Epoch 86/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0428

3/3 [==============================] - 0s 22ms/step - loss: 0.0427 - val_loss: 0.0480 - lr: 0.0010
Epoch 87/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0413

3/3 [==============================] - 0s 23ms/step - loss: 0.0430 - val_loss: 0.0483 - lr: 0.0010
Epoch 88/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0423

3/3 [==============================] - 0s 22ms/step - loss: 0.0436 - val_loss: 0.0462 - lr: 0.0010
Epoch 89/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0414

3/3 [==============================] - 0s 22ms/step - loss: 0.0416 - val_loss: 0.0446 - lr: 0.0010
Epoch 90/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0420

3/3 [==============================] - 0s 23ms/step - loss: 0.0418 - val_loss: 0.0442 - lr: 0.0010
Epoch 91/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0416

3/3 [==============================] - 0s 23ms/step - loss: 0.0421 - val_loss: 0.0443 - lr: 0.0010
Epoch 92/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0419

3/3 [==============================] - 0s 22ms/step - loss: 0.0410 - val_loss: 0.0426 - lr: 0.0010
Epoch 93/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0415

3/3 [==============================] - 0s 22ms/step - loss: 0.0408 - val_loss: 0.0424 - lr: 0.0010
Epoch 94/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0419

3/3 [==============================] - 0s 22ms/step - loss: 0.0407 - val_loss: 0.0424 - lr: 0.0010
Epoch 95/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0390

3/3 [==============================] - 0s 22ms/step - loss: 0.0395 - val_loss: 0.0434 - lr: 0.0010
Epoch 96/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0384

3/3 [==============================] - 0s 22ms/step - loss: 0.0393 - val_loss: 0.0416 - lr: 0.0010
Epoch 97/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0399

3/3 [==============================] - 0s 26ms/step - loss: 0.0394 - val_loss: 0.0416 - lr: 0.0010
Epoch 98/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0389

3/3 [==============================] - 0s 24ms/step - loss: 0.0384 - val_loss: 0.0419 - lr: 0.0010
Epoch 99/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0397

3/3 [==============================] - 0s 39ms/step - loss: 0.0390 - val_loss: 0.0399 - lr: 0.0010
Epoch 100/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0373

3/3 [==============================] - 0s 31ms/step - loss: 0.0382 - val_loss: 0.0395 - lr: 0.0010
Epoch 101/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0371

3/3 [==============================] - 0s 30ms/step - loss: 0.0373 - val_loss: 0.0386 - lr: 0.0010
Epoch 102/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0368

3/3 [==============================] - 0s 29ms/step - loss: 0.0368 - val_loss: 0.0384 - lr: 0.0010
Epoch 103/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0361

3/3 [==============================] - 0s 28ms/step - loss: 0.0370 - val_loss: 0.0385 - lr: 0.0010
Epoch 104/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0378

3/3 [==============================] - 0s 27ms/step - loss: 0.0371 - val_loss: 0.0374 - lr: 0.0010
Epoch 105/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0369

3/3 [==============================] - 0s 30ms/step - loss: 0.0366 - val_loss: 0.0374 - lr: 0.0010
Epoch 106/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0362

3/3 [==============================] - 0s 27ms/step - loss: 0.0363 - val_loss: 0.0368 - lr: 0.0010
Epoch 107/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0365

3/3 [==============================] - 0s 26ms/step - loss: 0.0358 - val_loss: 0.0365 - lr: 0.0010
Epoch 108/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0355

3/3 [==============================] - 0s 27ms/step - loss: 0.0361 - val_loss: 0.0360 - lr: 0.0010
Epoch 109/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0345

3/3 [==============================] - 0s 27ms/step - loss: 0.0352 - val_loss: 0.0370 - lr: 0.0010
Epoch 110/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0356

3/3 [==============================] - 0s 29ms/step - loss: 0.0354 - val_loss: 0.0360 - lr: 0.0010
Epoch 111/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0342

3/3 [==============================] - 0s 30ms/step - loss: 0.0343 - val_loss: 0.0357 - lr: 0.0010
Epoch 112/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0353

3/3 [==============================] - 0s 31ms/step - loss: 0.0348 - val_loss: 0.0349 - lr: 0.0010
Epoch 113/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0336

3/3 [==============================] - 0s 40ms/step - loss: 0.0343 - val_loss: 0.0351 - lr: 0.0010
Epoch 114/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0335

3/3 [==============================] - 0s 45ms/step - loss: 0.0339 - val_loss: 0.0349 - lr: 0.0010
Epoch 115/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0328

3/3 [==============================] - 0s 52ms/step - loss: 0.0332 - val_loss: 0.0346 - lr: 0.0010
Epoch 116/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0334

3/3 [==============================] - 0s 64ms/step - loss: 0.0338 - val_loss: 0.0342 - lr: 0.0010
Epoch 117/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0328

3/3 [==============================] - ETA: 0s - loss: 0.0328

3/3 [==============================] - 0s 44ms/step - loss: 0.0328 - val_loss: 0.0340 - lr: 0.0010
Epoch 118/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0316

3/3 [==============================] - 0s 35ms/step - loss: 0.0329 - val_loss: 0.0336 - lr: 0.0010
Epoch 119/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0327

3/3 [==============================] - 0s 31ms/step - loss: 0.0326 - val_loss: 0.0332 - lr: 0.0010
Epoch 120/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0316

3/3 [==============================] - 0s 30ms/step - loss: 0.0319 - val_loss: 0.0338 - lr: 0.0010
Epoch 121/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0322

3/3 [==============================] - 0s 28ms/step - loss: 0.0322 - val_loss: 0.0326 - lr: 0.0010
Epoch 122/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0315

3/3 [==============================] - 0s 25ms/step - loss: 0.0315 - val_loss: 0.0328 - lr: 0.0010
Epoch 123/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0311

3/3 [==============================] - 0s 25ms/step - loss: 0.0311 - val_loss: 0.0323 - lr: 0.0010
Epoch 124/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0304

3/3 [==============================] - 0s 24ms/step - loss: 0.0306 - val_loss: 0.0319 - lr: 0.0010
Epoch 125/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0316

3/3 [==============================] - 0s 25ms/step - loss: 0.0310 - val_loss: 0.0317 - lr: 0.0010
Epoch 126/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0300

3/3 [==============================] - 0s 25ms/step - loss: 0.0309 - val_loss: 0.0312 - lr: 0.0010
Epoch 127/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0310

3/3 [==============================] - 0s 25ms/step - loss: 0.0305 - val_loss: 0.0308 - lr: 0.0010
Epoch 128/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0296

3/3 [==============================] - 0s 27ms/step - loss: 0.0300 - val_loss: 0.0304 - lr: 0.0010
Epoch 129/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0306

3/3 [==============================] - 0s 30ms/step - loss: 0.0301 - val_loss: 0.0306 - lr: 0.0010
Epoch 130/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0322

3/3 [==============================] - 0s 36ms/step - loss: 0.0307 - val_loss: 0.0311 - lr: 0.0010
Epoch 131/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0298

3/3 [==============================] - 0s 30ms/step - loss: 0.0296 - val_loss: 0.0301 - lr: 0.0010
Epoch 132/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0308

3/3 [==============================] - 0s 29ms/step - loss: 0.0300 - val_loss: 0.0311 - lr: 0.0010
Epoch 133/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0292

3/3 [==============================] - 0s 27ms/step - loss: 0.0289 - val_loss: 0.0299 - lr: 0.0010
Epoch 134/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0285

3/3 [==============================] - 0s 24ms/step - loss: 0.0287 - val_loss: 0.0298 - lr: 0.0010
Epoch 135/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0287

3/3 [==============================] - 0s 24ms/step - loss: 0.0289 - val_loss: 0.0296 - lr: 0.0010
Epoch 136/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0284

3/3 [==============================] - 0s 25ms/step - loss: 0.0284 - val_loss: 0.0292 - lr: 0.0010
Epoch 137/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0281

3/3 [==============================] - 0s 24ms/step - loss: 0.0282 - val_loss: 0.0289 - lr: 0.0010
Epoch 138/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0277

3/3 [==============================] - 0s 24ms/step - loss: 0.0281 - val_loss: 0.0289 - lr: 0.0010
Epoch 139/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0288

3/3 [==============================] - 0s 22ms/step - loss: 0.0280 - val_loss: 0.0285 - lr: 0.0010
Epoch 140/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0275

3/3 [==============================] - 0s 26ms/step - loss: 0.0273 - val_loss: 0.0282 - lr: 0.0010
Epoch 141/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0275

3/3 [==============================] - 0s 35ms/step - loss: 0.0275 - val_loss: 0.0276 - lr: 0.0010
Epoch 142/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0271

3/3 [==============================] - 0s 31ms/step - loss: 0.0270 - val_loss: 0.0276 - lr: 0.0010
Epoch 143/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0269

3/3 [==============================] - 0s 30ms/step - loss: 0.0265 - val_loss: 0.0273 - lr: 0.0010
Epoch 144/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0261

3/3 [==============================] - 0s 25ms/step - loss: 0.0266 - val_loss: 0.0271 - lr: 0.0010
Epoch 145/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0264

3/3 [==============================] - 0s 25ms/step - loss: 0.0267 - val_loss: 0.0269 - lr: 0.0010
Epoch 146/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0263

3/3 [==============================] - 0s 26ms/step - loss: 0.0262 - val_loss: 0.0268 - lr: 0.0010
Epoch 147/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0257

3/3 [==============================] - 0s 24ms/step - loss: 0.0263 - val_loss: 0.0266 - lr: 0.0010
Epoch 148/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0262

3/3 [==============================] - 0s 30ms/step - loss: 0.0258 - val_loss: 0.0267 - lr: 0.0010
Epoch 149/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0255

3/3 [==============================] - 0s 31ms/step - loss: 0.0257 - val_loss: 0.0266 - lr: 0.0010
Epoch 150/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0257

3/3 [==============================] - 0s 28ms/step - loss: 0.0253 - val_loss: 0.0264 - lr: 0.0010
Epoch 151/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0252

3/3 [==============================] - 0s 25ms/step - loss: 0.0252 - val_loss: 0.0262 - lr: 0.0010
Epoch 152/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0257

3/3 [==============================] - 0s 26ms/step - loss: 0.0250 - val_loss: 0.0265 - lr: 0.0010
Epoch 153/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0253

3/3 [==============================] - 0s 26ms/step - loss: 0.0254 - val_loss: 0.0259 - lr: 0.0010
Epoch 154/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0246

3/3 [==============================] - 0s 23ms/step - loss: 0.0251 - val_loss: 0.0257 - lr: 0.0010
Epoch 155/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0243

3/3 [==============================] - 0s 23ms/step - loss: 0.0248 - val_loss: 0.0257 - lr: 0.0010
Epoch 156/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0249

3/3 [==============================] - 0s 23ms/step - loss: 0.0246 - val_loss: 0.0257 - lr: 0.0010
Epoch 157/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0250

3/3 [==============================] - 0s 24ms/step - loss: 0.0243 - val_loss: 0.0252 - lr: 0.0010
Epoch 158/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0238

3/3 [==============================] - 0s 23ms/step - loss: 0.0238 - val_loss: 0.0251 - lr: 0.0010
Epoch 159/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0241

3/3 [==============================] - 0s 25ms/step - loss: 0.0240 - val_loss: 0.0248 - lr: 0.0010
Epoch 160/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0242

3/3 [==============================] - 0s 27ms/step - loss: 0.0239 - val_loss: 0.0245 - lr: 0.0010
Epoch 161/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0238

3/3 [==============================] - 0s 29ms/step - loss: 0.0240 - val_loss: 0.0250 - lr: 0.0010
Epoch 162/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0237

3/3 [==============================] - 0s 33ms/step - loss: 0.0236 - val_loss: 0.0241 - lr: 0.0010
Epoch 163/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0233

3/3 [==============================] - 0s 30ms/step - loss: 0.0231 - val_loss: 0.0239 - lr: 0.0010
Epoch 164/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0237

3/3 [==============================] - 0s 32ms/step - loss: 0.0231 - val_loss: 0.0238 - lr: 0.0010
Epoch 165/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0228

3/3 [==============================] - 0s 33ms/step - loss: 0.0229 - val_loss: 0.0238 - lr: 0.0010
Epoch 166/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0231

3/3 [==============================] - 0s 28ms/step - loss: 0.0229 - val_loss: 0.0238 - lr: 0.0010
Epoch 167/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0221

3/3 [==============================] - 0s 26ms/step - loss: 0.0225 - val_loss: 0.0235 - lr: 0.0010
Epoch 168/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0221

3/3 [==============================] - 0s 25ms/step - loss: 0.0222 - val_loss: 0.0228 - lr: 0.0010
Epoch 169/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0224

3/3 [==============================] - 0s 25ms/step - loss: 0.0221 - val_loss: 0.0227 - lr: 0.0010
Epoch 170/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0227

3/3 [==============================] - 0s 23ms/step - loss: 0.0221 - val_loss: 0.0227 - lr: 0.0010
Epoch 171/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0224

3/3 [==============================] - 0s 23ms/step - loss: 0.0221 - val_loss: 0.0239 - lr: 0.0010
Epoch 172/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0219

3/3 [==============================] - 0s 24ms/step - loss: 0.0219 - val_loss: 0.0222 - lr: 0.0010
Epoch 173/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0216

3/3 [==============================] - 0s 25ms/step - loss: 0.0218 - val_loss: 0.0225 - lr: 0.0010
Epoch 174/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0211

3/3 [==============================] - 0s 25ms/step - loss: 0.0212 - val_loss: 0.0220 - lr: 0.0010
Epoch 175/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0213

3/3 [==============================] - 0s 25ms/step - loss: 0.0214 - val_loss: 0.0219 - lr: 0.0010
Epoch 176/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0210

3/3 [==============================] - 0s 23ms/step - loss: 0.0215 - val_loss: 0.0226 - lr: 0.0010
Epoch 177/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0211

3/3 [==============================] - 0s 25ms/step - loss: 0.0209 - val_loss: 0.0216 - lr: 0.0010
Epoch 178/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0212

3/3 [==============================] - 0s 32ms/step - loss: 0.0212 - val_loss: 0.0222 - lr: 0.0010
Epoch 179/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0211

3/3 [==============================] - 0s 32ms/step - loss: 0.0208 - val_loss: 0.0218 - lr: 0.0010
Epoch 180/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0212

3/3 [==============================] - 0s 26ms/step - loss: 0.0209 - val_loss: 0.0216 - lr: 0.0010
Epoch 181/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0205

3/3 [==============================] - 0s 27ms/step - loss: 0.0204 - val_loss: 0.0212 - lr: 0.0010
Epoch 182/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0205

3/3 [==============================] - 0s 30ms/step - loss: 0.0208 - val_loss: 0.0236 - lr: 0.0010
Epoch 183/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0210

3/3 [==============================] - 0s 27ms/step - loss: 0.0208 - val_loss: 0.0208 - lr: 0.0010
Epoch 184/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0203

3/3 [==============================] - 0s 28ms/step - loss: 0.0204 - val_loss: 0.0210 - lr: 0.0010
Epoch 185/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0197

3/3 [==============================] - 0s 31ms/step - loss: 0.0202 - val_loss: 0.0210 - lr: 0.0010
Epoch 186/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0201

3/3 [==============================] - 0s 32ms/step - loss: 0.0201 - val_loss: 0.0206 - lr: 0.0010
Epoch 187/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0200

3/3 [==============================] - 0s 30ms/step - loss: 0.0199 - val_loss: 0.0220 - lr: 0.0010
Epoch 188/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0200

3/3 [==============================] - 0s 40ms/step - loss: 0.0196 - val_loss: 0.0206 - lr: 0.0010
Epoch 189/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0197

3/3 [==============================] - 0s 32ms/step - loss: 0.0195 - val_loss: 0.0211 - lr: 0.0010
Epoch 190/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0192

3/3 [==============================] - 0s 29ms/step - loss: 0.0191 - val_loss: 0.0209 - lr: 0.0010
Epoch 191/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0193

3/3 [==============================] - 0s 28ms/step - loss: 0.0191 - val_loss: 0.0199 - lr: 0.0010
Epoch 192/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0186

3/3 [==============================] - 0s 33ms/step - loss: 0.0190 - val_loss: 0.0203 - lr: 0.0010
Epoch 193/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0189

3/3 [==============================] - 0s 35ms/step - loss: 0.0188 - val_loss: 0.0199 - lr: 0.0010
Epoch 194/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0188

3/3 [==============================] - 0s 35ms/step - loss: 0.0187 - val_loss: 0.0196 - lr: 0.0010
Epoch 195/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0194

3/3 [==============================] - 0s 38ms/step - loss: 0.0189 - val_loss: 0.0202 - lr: 0.0010
Epoch 196/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0183

3/3 [==============================] - 0s 30ms/step - loss: 0.0183 - val_loss: 0.0196 - lr: 0.0010
Epoch 197/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0188

3/3 [==============================] - 0s 29ms/step - loss: 0.0187 - val_loss: 0.0194 - lr: 0.0010
Epoch 198/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0183

3/3 [==============================] - 0s 32ms/step - loss: 0.0184 - val_loss: 0.0191 - lr: 0.0010
Epoch 199/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0185

3/3 [==============================] - 0s 30ms/step - loss: 0.0184 - val_loss: 0.0187 - lr: 0.0010
Epoch 200/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0179

3/3 [==============================] - 0s 26ms/step - loss: 0.0180 - val_loss: 0.0186 - lr: 0.0010
Epoch 201/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0178

3/3 [==============================] - 0s 24ms/step - loss: 0.0178 - val_loss: 0.0187 - lr: 0.0010
Epoch 202/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0175

3/3 [==============================] - 0s 25ms/step - loss: 0.0178 - val_loss: 0.0183 - lr: 0.0010
Epoch 203/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0176

3/3 [==============================] - 0s 24ms/step - loss: 0.0179 - val_loss: 0.0182 - lr: 0.0010
Epoch 204/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0172

3/3 [==============================] - 0s 24ms/step - loss: 0.0176 - val_loss: 0.0181 - lr: 0.0010
Epoch 205/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0174

3/3 [==============================] - 0s 26ms/step - loss: 0.0173 - val_loss: 0.0186 - lr: 0.0010
Epoch 206/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0170

3/3 [==============================] - 0s 28ms/step - loss: 0.0172 - val_loss: 0.0179 - lr: 0.0010
Epoch 207/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0174

3/3 [==============================] - 0s 27ms/step - loss: 0.0174 - val_loss: 0.0184 - lr: 0.0010
Epoch 208/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0170

3/3 [==============================] - 0s 25ms/step - loss: 0.0169 - val_loss: 0.0180 - lr: 0.0010
Epoch 209/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0175

3/3 [==============================] - 0s 24ms/step - loss: 0.0172 - val_loss: 0.0187 - lr: 0.0010
Epoch 210/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0170

3/3 [==============================] - 0s 23ms/step - loss: 0.0170 - val_loss: 0.0175 - lr: 0.0010
Epoch 211/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0167

3/3 [==============================] - 0s 25ms/step - loss: 0.0167 - val_loss: 0.0177 - lr: 0.0010
Epoch 212/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0170

3/3 [==============================] - 0s 24ms/step - loss: 0.0167 - val_loss: 0.0179 - lr: 0.0010
Epoch 213/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0164

3/3 [==============================] - 0s 37ms/step - loss: 0.0162 - val_loss: 0.0173 - lr: 0.0010
Epoch 214/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0167

3/3 [==============================] - 0s 33ms/step - loss: 0.0164 - val_loss: 0.0180 - lr: 0.0010
Epoch 215/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0166

3/3 [==============================] - 0s 29ms/step - loss: 0.0163 - val_loss: 0.0170 - lr: 0.0010
Epoch 216/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0163

3/3 [==============================] - 0s 25ms/step - loss: 0.0162 - val_loss: 0.0175 - lr: 0.0010
Epoch 217/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0162

3/3 [==============================] - 0s 26ms/step - loss: 0.0159 - val_loss: 0.0167 - lr: 0.0010
Epoch 218/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0160

3/3 [==============================] - 0s 25ms/step - loss: 0.0158 - val_loss: 0.0173 - lr: 0.0010
Epoch 219/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0162

3/3 [==============================] - 0s 24ms/step - loss: 0.0157 - val_loss: 0.0165 - lr: 0.0010
Epoch 220/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0155

3/3 [==============================] - 0s 24ms/step - loss: 0.0154 - val_loss: 0.0176 - lr: 0.0010
Epoch 221/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0156

3/3 [==============================] - 0s 24ms/step - loss: 0.0154 - val_loss: 0.0163 - lr: 0.0010
Epoch 222/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0157

3/3 [==============================] - 0s 23ms/step - loss: 0.0154 - val_loss: 0.0170 - lr: 0.0010
Epoch 223/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0150

3/3 [==============================] - 0s 23ms/step - loss: 0.0151 - val_loss: 0.0161 - lr: 0.0010
Epoch 224/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0155

3/3 [==============================] - 0s 23ms/step - loss: 0.0153 - val_loss: 0.0168 - lr: 0.0010
Epoch 225/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0154

3/3 [==============================] - 0s 23ms/step - loss: 0.0152 - val_loss: 0.0159 - lr: 0.0010
Epoch 226/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0148

3/3 [==============================] - 0s 23ms/step - loss: 0.0150 - val_loss: 0.0165 - lr: 0.0010
Epoch 227/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0151

3/3 [==============================] - 0s 23ms/step - loss: 0.0149 - val_loss: 0.0161 - lr: 0.0010
Epoch 228/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0144

3/3 [==============================] - 0s 25ms/step - loss: 0.0147 - val_loss: 0.0162 - lr: 0.0010
Epoch 229/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0144

3/3 [==============================] - 0s 27ms/step - loss: 0.0146 - val_loss: 0.0166 - lr: 0.0010
Epoch 230/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0147

3/3 [==============================] - 0s 53ms/step - loss: 0.0148 - val_loss: 0.0154 - lr: 0.0010
Epoch 231/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0146

3/3 [==============================] - 0s 30ms/step - loss: 0.0147 - val_loss: 0.0166 - lr: 0.0010
Epoch 232/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0146

3/3 [==============================] - 0s 32ms/step - loss: 0.0147 - val_loss: 0.0152 - lr: 0.0010
Epoch 233/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0145

3/3 [==============================] - 0s 25ms/step - loss: 0.0147 - val_loss: 0.0175 - lr: 0.0010
Epoch 234/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0144

3/3 [==============================] - 0s 26ms/step - loss: 0.0143 - val_loss: 0.0150 - lr: 0.0010
Epoch 235/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0145

3/3 [==============================] - 0s 24ms/step - loss: 0.0144 - val_loss: 0.0159 - lr: 0.0010
Epoch 236/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0143

3/3 [==============================] - 0s 22ms/step - loss: 0.0143 - val_loss: 0.0151 - lr: 0.0010
Epoch 237/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0139

3/3 [==============================] - 0s 22ms/step - loss: 0.0139 - val_loss: 0.0153 - lr: 0.0010
Epoch 238/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0139

3/3 [==============================] - 0s 23ms/step - loss: 0.0138 - val_loss: 0.0157 - lr: 0.0010
Epoch 239/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0138

3/3 [==============================] - 0s 22ms/step - loss: 0.0138 - val_loss: 0.0154 - lr: 0.0010
Epoch 240/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0135

3/3 [==============================] - 0s 25ms/step - loss: 0.0137 - val_loss: 0.0149 - lr: 0.0010
Epoch 241/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0137

3/3 [==============================] - 0s 26ms/step - loss: 0.0136 - val_loss: 0.0146 - lr: 0.0010
Epoch 242/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0137

3/3 [==============================] - 0s 29ms/step - loss: 0.0135 - val_loss: 0.0151 - lr: 0.0010
Epoch 243/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0136

3/3 [==============================] - 0s 26ms/step - loss: 0.0135 - val_loss: 0.0152 - lr: 0.0010
Epoch 244/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0137

3/3 [==============================] - 0s 27ms/step - loss: 0.0135 - val_loss: 0.0146 - lr: 0.0010
Epoch 245/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0132

3/3 [==============================] - 0s 32ms/step - loss: 0.0132 - val_loss: 0.0148 - lr: 0.0010
Epoch 246/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0129

3/3 [==============================] - 0s 30ms/step - loss: 0.0132 - val_loss: 0.0142 - lr: 0.0010
Epoch 247/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0134

3/3 [==============================] - 0s 28ms/step - loss: 0.0133 - val_loss: 0.0142 - lr: 0.0010
Epoch 248/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0130

3/3 [==============================] - 0s 26ms/step - loss: 0.0132 - val_loss: 0.0138 - lr: 0.0010
Epoch 249/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0127

3/3 [==============================] - 0s 26ms/step - loss: 0.0129 - val_loss: 0.0145 - lr: 0.0010
Epoch 250/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0128

3/3 [==============================] - 0s 26ms/step - loss: 0.0129 - val_loss: 0.0145 - lr: 0.0010
[... per-epoch output for epochs 251-315 omitted: training loss drifts down from 0.0130 to 0.0097 and val_loss bottoms out near 0.0113; ReduceLROnPlateau cuts the learning rate to 1e-04 at epoch 290, 1e-05 at epoch 300, and 1e-06 at epoch 310 ...]
Epoch 316/500
3/3 [==============================] - 0s 23ms/step - loss: 0.0097 - val_loss: 0.0117 - lr: 1.0000e-06
Epoch 316: early stopping
../_images/3c35e5f4e1faf26a6012ef84169f0ba389c7e8166a55e6e70fdfd5d38c420d55.png
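
The factor-of-ten learning-rate drops at epochs 290, 300 and 310 and the stop at epoch 316 come from the ReduceLROnPlateau and EarlyStopping callbacks wired into network_training earlier in the notebook. A minimal sketch of such a callback setup is shown below; the patience and restore_best_weights values are hypothetical placeholders, and only the factor of 0.1 is confirmed by the log above.

# Sketch only: illustrates the callbacks behind the lr drops and the early stop.
# patience / restore_best_weights are placeholders, not the notebook's actual values.
callbacks = [
    ReduceLROnPlateau(monitor="val_loss", factor=0.1, patience=10, verbose=1),
    EarlyStopping(monitor="val_loss", patience=25, verbose=1, restore_best_weights=True),
]
# model.fit(..., epochs=500, validation_split=0.2, callbacks=callbacks)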

Non-Rolling Prediction#

def non_rolling_prediction(model, scaler):
    """Predict the 24-month test window in one shot and plot it against the actual series."""
    # Predict all test windows at once and bring the values back to passenger counts
    yhat = model.predict(X_test)
    yhat = scaler.inverse_transform(yhat)

    plt.plot(data)
    plt.plot(
        pd.DataFrame(yhat, index=data.index[-24:], columns=["Predictions"]), color="red"
    )

    rmse = mean_squared_error(data[-24:].values, yhat, squared=False)
    plt.title(f"RMSE: {rmse:.4f}")
non_rolling_prediction(lstm, scaler)
1/1 [==============================] - 0s 137ms/step
../_images/17839ec92c86747b6f6b0229b7570fa281706524ee83a98c35deda9905d0ce11.png
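
One note on the metric: mean_squared_error(..., squared=False) returns the root of the MSE, i.e. the RMSE shown in the plot titles. Recent scikit-learn releases deprecate the squared argument (and offer root_mean_squared_error instead), so a small helper like the sketch below gives the same number without relying on it.

import numpy as np
from sklearn.metrics import mean_squared_error

def rmse(y_true, y_pred):
    # Same value as mean_squared_error(y_true, y_pred, squared=False),
    # computed without the deprecated `squared` keyword.
    return float(np.sqrt(mean_squared_error(y_true, y_pred)))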

Rolling Prediction#

def make_rolling_predictions(model, starting_input, steps, scaler):
    """
    Make rolling predictions using the trained model.

    :param model: The trained model.
    :param starting_input: The initial input data (last known data points).
    :param steps: Number of future time steps to predict.
    :param scaler: The scaler used for data normalization.
    :return: Predictions in original scale.
    """
    input_seq = starting_input.copy()
    predictions = []

    for _ in range(steps):
        # Reshape the input to the format the model expects
        reshaped_input = np.reshape(input_seq, (1, len(input_seq), 1))

        # Make a prediction
        prediction = model.predict(reshaped_input)

        # Append the prediction
        predictions.append(prediction[0, 0])

        # Update the input sequence for the next prediction
        input_seq = np.append(input_seq[1:], prediction)

    return scaler.inverse_transform(np.array(predictions).reshape(-1, 1))
def rolling_prediction(model, scaler):
    """Predict the 24-month test window one step at a time, feeding each prediction back in, and plot it."""
    yhat = make_rolling_predictions(
        model, scaled_data[-24 - look_back : -24], 24, scaler
    )
    plt.plot(data, label="Actual")
    plt.plot(
        pd.DataFrame(yhat, index=data.index[-24:], columns=["Predictions"]),
        color="red",
        label="Predictions",
    )
    plt.legend()
    rmse = mean_squared_error(data[-24:].values, yhat, squared=False)
    plt.title(f"RMSE: {rmse:.4f}")

    plt.show()
rolling_prediction(lstm, scaler)
1/1 [==============================] - 0s 15ms/step
[... output from the remaining 23 single-step predict calls omitted ...]
../_images/55d555931edd0247fdd86ff62e843ffe22214ee7575ca9f275116822500fdf28.png

LSTM with Attention Mechanism: Model Building and Training#

# Model parameters
input_shape = (look_back, 1)
lstm_units = 200
dropout_rate = 0.2

# Define the model: an LSTM encoder with dot-product self-attention over its outputs
inputs = Input(shape=input_shape)
lstm_out, state_h, state_c = LSTM(
    lstm_units, return_sequences=True, return_state=True, kernel_regularizer=l2(l2_reg)
)(inputs)  # the final states are returned but not used downstream
attention = Attention()([lstm_out, lstm_out])  # self-attention: query = value = LSTM outputs
context_vector = Concatenate(axis=-1)([attention, lstm_out])  # attended context alongside the raw LSTM features
flat = Flatten()(context_vector)
# flat = LayerNormalization()(flat)
dense = Dense(50, activation="relu", kernel_regularizer=l2(l2_reg))(flat)
drop = Dropout(dropout_rate)(dense)
outputs = Dense(1, kernel_regularizer=l2(l2_reg))(drop)

rnn_attention = tf.keras.Model(inputs=inputs, outputs=outputs)
network_training(rnn_attention, "RNN-Attention")
Epoch 1/500
3/3 [==============================] - 1s 155ms/step - loss: 0.1359 - val_loss: 0.2688 - lr: 0.0010
Epoch 2/500
3/3 [==============================] - 0s 21ms/step - loss: 0.1155 - val_loss: 0.0958 - lr: 0.0010
[... per-epoch output for epochs 3-124 omitted, along with repeated TensorFlow op_level_cost_estimator warnings: loss falls from 0.0904 to about 0.004 and val_loss from 0.1024 to a best of 0.0051 at epoch 105; ReduceLROnPlateau cuts the learning rate to 1e-04 at epoch 115 and 1e-05 at epoch 125 ...]
Epoch 125/500
3/3 [==============================] - 0s 28ms/step - loss: 0.0042 - val_loss: 0.0060 - lr: 1.0000e-04
Epoch 125: early stopping
../_images/a20d9d5697166e468e0c3d2d0b3bf044b4b7af1363cc090ccabfb0a1e8846a2f.png
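
For reference, the Attention()([lstm_out, lstm_out]) call in the model above is plain dot-product self-attention: with query and value both set to the LSTM outputs, the layer computes softmax(QVᵀ)V, so each time step becomes a weighted mix of all time steps and the output keeps the (batch, timesteps, features) shape of its input. A minimal sketch with illustrative shapes (not the ones used in the model):

import tensorflow as tf

seq = tf.random.normal((1, 4, 8))                      # 1 sample, 4 time steps, 8 features
self_attn = tf.keras.layers.Attention()([seq, seq])    # scores = softmax(Q @ V^T); output = scores @ V
print(self_attn.shape)                                 # (1, 4, 8) - same shape as the input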

Non-Rolling Prediction#

non_rolling_prediction(rnn_attention, scaler)
1/1 [==============================] - 0s 376ms/step
../_images/8ff01190f587f97dff48a06d9eb21dc4c08cfa11d684bfba50c63f279b84e47c.png

Rolling Prediction#

rolling_prediction(rnn_attention, scaler)
1/1 [==============================] - 0s 19ms/step
[... output from the remaining 23 single-step predict calls omitted ...]
../_images/cea3955ff5007c7aa2cb7fed87ffbf816a7b94da3a45ce3fa54d0e96f7d233b4.png

CNN Model Building and Training#

cnn = Sequential()

# Input layer - Convolutional layer
cnn.add(
    Conv1D(filters=64, kernel_size=5, activation="relu", input_shape=(look_back, 1))
)

# Additional convolutional layer
cnn.add(Conv1D(filters=64, kernel_size=5, activation="relu"))

# Max pooling layer
cnn.add(MaxPooling1D(pool_size=2))

# Flatten layer to prepare data for Dense layer
cnn.add(Flatten())

# Dense layer for prediction
cnn.add(Dense(50, activation="relu", kernel_regularizer=l2(l2_reg)))
cnn.add(Dense(1, kernel_regularizer=l2(l2_reg)))
network_training(cnn, "CNN")
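
As a quick sanity check on the architecture above: each kernel_size=5 convolution uses the default 'valid' padding and so trims four time steps, and the pooling layer halves what is left, which bounds how small look_back can be. The arithmetic below is only a sketch with a placeholder window length, not the notebook's actual look_back value.

# Rough shape bookkeeping for the CNN above (sketch; needs look_back >= 10 so
# both valid convolutions and the pool_size=2 pooling step fit).
look_back_example = 12                   # placeholder window length
after_conv1 = look_back_example - 4      # Conv1D(kernel_size=5, padding='valid')
after_conv2 = after_conv1 - 4            # second Conv1D(kernel_size=5)
after_pool = after_conv2 // 2            # MaxPooling1D(pool_size=2)
flattened = after_pool * 64              # 64 filters per step feed the Dense(50) head
print(after_conv1, after_conv2, after_pool, flattened)   # 8 4 2 128
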
Epoch 1/500
3/3 [==============================] - 1s 104ms/step - loss: 0.1824 - val_loss: 0.2944 - lr: 0.0010
Epoch 2/500
3/3 [==============================] - 0s 17ms/step - loss: 0.1279 - val_loss: 0.1975 - lr: 0.0010
[... per-epoch output for epochs 3-106 omitted: loss falls steadily from 0.0987 to 0.0050 and val_loss from 0.1225 to 0.0062, with the learning rate still at 0.0010 ...]
Epoch 107/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0049

3/3 [==============================] - 0s 12ms/step - loss: 0.0049 - val_loss: 0.0074 - lr: 0.0010
Epoch 108/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0048

3/3 [==============================] - 0s 16ms/step - loss: 0.0048 - val_loss: 0.0073 - lr: 0.0010
Epoch 109/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0046

3/3 [==============================] - 0s 23ms/step - loss: 0.0047 - val_loss: 0.0064 - lr: 0.0010
Epoch 110/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0047

3/3 [==============================] - 0s 11ms/step - loss: 0.0047 - val_loss: 0.0069 - lr: 0.0010
Epoch 111/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0047

3/3 [==============================] - 0s 10ms/step - loss: 0.0046 - val_loss: 0.0067 - lr: 0.0010
Epoch 112/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0045

3/3 [==============================] - 0s 11ms/step - loss: 0.0045 - val_loss: 0.0066 - lr: 0.0010
Epoch 113/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0044

3/3 [==============================] - 0s 11ms/step - loss: 0.0045 - val_loss: 0.0066 - lr: 0.0010
Epoch 114/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0044

3/3 [==============================] - 0s 11ms/step - loss: 0.0044 - val_loss: 0.0061 - lr: 0.0010
Epoch 115/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0043

3/3 [==============================] - 0s 10ms/step - loss: 0.0044 - val_loss: 0.0067 - lr: 0.0010
Epoch 116/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0044

3/3 [==============================] - 0s 11ms/step - loss: 0.0043 - val_loss: 0.0062 - lr: 0.0010
Epoch 117/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0043

3/3 [==============================] - 0s 10ms/step - loss: 0.0042 - val_loss: 0.0063 - lr: 0.0010
Epoch 118/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0042

3/3 [==============================] - 0s 11ms/step - loss: 0.0042 - val_loss: 0.0064 - lr: 0.0010
Epoch 119/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0040

3/3 [==============================] - 0s 12ms/step - loss: 0.0041 - val_loss: 0.0059 - lr: 0.0010
Epoch 120/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0040

3/3 [==============================] - 0s 10ms/step - loss: 0.0041 - val_loss: 0.0067 - lr: 0.0010
Epoch 121/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0041

3/3 [==============================] - 0s 10ms/step - loss: 0.0040 - val_loss: 0.0059 - lr: 0.0010
Epoch 122/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0038

3/3 [==============================] - 0s 10ms/step - loss: 0.0040 - val_loss: 0.0057 - lr: 0.0010
Epoch 123/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0039

3/3 [==============================] - 0s 9ms/step - loss: 0.0039 - val_loss: 0.0064 - lr: 0.0010
Epoch 124/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0037

3/3 [==============================] - 0s 9ms/step - loss: 0.0039 - val_loss: 0.0061 - lr: 0.0010
Epoch 125/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0041

3/3 [==============================] - 0s 9ms/step - loss: 0.0038 - val_loss: 0.0055 - lr: 0.0010
Epoch 126/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0037

3/3 [==============================] - 0s 10ms/step - loss: 0.0038 - val_loss: 0.0062 - lr: 0.0010
Epoch 127/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0036

3/3 [==============================] - 0s 9ms/step - loss: 0.0037 - val_loss: 0.0054 - lr: 0.0010
Epoch 128/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0037

3/3 [==============================] - 0s 10ms/step - loss: 0.0037 - val_loss: 0.0054 - lr: 0.0010
Epoch 129/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0036

3/3 [==============================] - 0s 9ms/step - loss: 0.0036 - val_loss: 0.0061 - lr: 0.0010
Epoch 130/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0036

3/3 [==============================] - 0s 9ms/step - loss: 0.0036 - val_loss: 0.0055 - lr: 0.0010
Epoch 131/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0034

3/3 [==============================] - 0s 11ms/step - loss: 0.0035 - val_loss: 0.0053 - lr: 0.0010
Epoch 132/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0036

3/3 [==============================] - 0s 13ms/step - loss: 0.0035 - val_loss: 0.0057 - lr: 0.0010
Epoch 133/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0033

3/3 [==============================] - 0s 11ms/step - loss: 0.0035 - val_loss: 0.0055 - lr: 0.0010
Epoch 134/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0035

3/3 [==============================] - 0s 11ms/step - loss: 0.0034 - val_loss: 0.0053 - lr: 0.0010
Epoch 135/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0034

3/3 [==============================] - 0s 12ms/step - loss: 0.0034 - val_loss: 0.0057 - lr: 0.0010
Epoch 136/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0035

3/3 [==============================] - 0s 16ms/step - loss: 0.0034 - val_loss: 0.0053 - lr: 0.0010
Epoch 137/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0034

3/3 [==============================] - 0s 20ms/step - loss: 0.0033 - val_loss: 0.0061 - lr: 0.0010
Epoch 138/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0033
Epoch 138: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.

3/3 [==============================] - 0s 24ms/step - loss: 0.0033 - val_loss: 0.0055 - lr: 0.0010
Epoch 139/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0032

3/3 [==============================] - 0s 21ms/step - loss: 0.0032 - val_loss: 0.0054 - lr: 1.0000e-04
Epoch 140/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0031

3/3 [==============================] - 0s 17ms/step - loss: 0.0032 - val_loss: 0.0053 - lr: 1.0000e-04
Epoch 141/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0031

3/3 [==============================] - 0s 18ms/step - loss: 0.0032 - val_loss: 0.0052 - lr: 1.0000e-04
Epoch 142/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0031

3/3 [==============================] - 0s 16ms/step - loss: 0.0032 - val_loss: 0.0053 - lr: 1.0000e-04
Epoch 143/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0033

3/3 [==============================] - 0s 17ms/step - loss: 0.0032 - val_loss: 0.0053 - lr: 1.0000e-04
Epoch 144/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0031

3/3 [==============================] - 0s 14ms/step - loss: 0.0032 - val_loss: 0.0053 - lr: 1.0000e-04
Epoch 145/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0032

3/3 [==============================] - 0s 14ms/step - loss: 0.0032 - val_loss: 0.0054 - lr: 1.0000e-04
Epoch 146/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0033

3/3 [==============================] - 0s 15ms/step - loss: 0.0032 - val_loss: 0.0054 - lr: 1.0000e-04
Epoch 147/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0031

3/3 [==============================] - 0s 13ms/step - loss: 0.0032 - val_loss: 0.0054 - lr: 1.0000e-04
Epoch 148/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0032

3/3 [==============================] - 0s 13ms/step - loss: 0.0032 - val_loss: 0.0054 - lr: 1.0000e-04
Epoch 149/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0031

3/3 [==============================] - 0s 13ms/step - loss: 0.0032 - val_loss: 0.0053 - lr: 1.0000e-04
Epoch 150/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0033

3/3 [==============================] - 0s 14ms/step - loss: 0.0032 - val_loss: 0.0053 - lr: 1.0000e-04
Epoch 151/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0031
Epoch 151: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.

3/3 [==============================] - 0s 14ms/step - loss: 0.0032 - val_loss: 0.0052 - lr: 1.0000e-04
Epoch 152/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0033

3/3 [==============================] - 0s 14ms/step - loss: 0.0032 - val_loss: 0.0052 - lr: 1.0000e-05
Epoch 153/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0031

3/3 [==============================] - 0s 13ms/step - loss: 0.0032 - val_loss: 0.0052 - lr: 1.0000e-05
Epoch 154/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0031

3/3 [==============================] - 0s 13ms/step - loss: 0.0032 - val_loss: 0.0052 - lr: 1.0000e-05
Epoch 155/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0031

3/3 [==============================] - 0s 13ms/step - loss: 0.0032 - val_loss: 0.0053 - lr: 1.0000e-05
Epoch 156/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0031

3/3 [==============================] - 0s 12ms/step - loss: 0.0032 - val_loss: 0.0053 - lr: 1.0000e-05
Epoch 157/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0032

3/3 [==============================] - 0s 12ms/step - loss: 0.0032 - val_loss: 0.0053 - lr: 1.0000e-05
Epoch 158/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0031

3/3 [==============================] - 0s 11ms/step - loss: 0.0032 - val_loss: 0.0053 - lr: 1.0000e-05
Epoch 159/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0033

3/3 [==============================] - 0s 11ms/step - loss: 0.0032 - val_loss: 0.0053 - lr: 1.0000e-05
Epoch 160/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0031

3/3 [==============================] - 0s 11ms/step - loss: 0.0032 - val_loss: 0.0053 - lr: 1.0000e-05
Epoch 161/500
1/3 [=========>....................] - ETA: 0s - loss: 0.0033
Epoch 161: ReduceLROnPlateau reducing learning rate to 1.0000000656873453e-06.

3/3 [==============================] - 0s 13ms/step - loss: 0.0032 - val_loss: 0.0053 - lr: 1.0000e-05
Epoch 161: early stopping
../_images/cab57a41b5a5653d54ff80c2f5192d6e699727f2ac20226eca468e271933e69a.png
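
The log above reflects the two training callbacks imported at the top of the notebook: ReduceLROnPlateau cuts the learning rate by a factor of ten whenever the validation loss stops improving (here at epochs 138, 151 and 161), and EarlyStopping ends training once further reductions no longer help. The exact configuration is defined earlier in the notebook; the snippet below is only an illustration consistent with the logged behaviour, and the patience values in it are assumptions.

from tensorflow.keras.callbacks import ReduceLROnPlateau, EarlyStopping

# Illustrative callback setup (not the notebook's actual values): factor=0.1
# matches the 1e-3 -> 1e-4 -> 1e-5 steps visible in the log; the patience
# values are assumed for the sake of the example.
reduce_lr = ReduceLROnPlateau(monitor="val_loss", factor=0.1, patience=10, verbose=1)
early_stop = EarlyStopping(monitor="val_loss", patience=20, verbose=1, restore_best_weights=True)
# These would then be passed to model.fit(..., callbacks=[reduce_lr, early_stop]).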

Non-Rolling Prediction#

non_rolling_prediction(cnn, scaler)
1/1 [==============================] - 0s 48ms/step
../_images/860950aa038b2b6e18b9bb5017677e8c2e72258f3c85855193c407c748fad03b.png
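
For reference, a non-rolling evaluation reduces to a single batched predict call over the windowed test set, because every input window is built from observed values. The helper used above is defined earlier in the notebook; the code below is only a minimal sketch, and X_test / y_test are placeholder names for the scaled test windows and targets.

import numpy as np

def non_rolling_sketch(model, scaler, X_test, y_test):
    # All test windows come from real observations, so the whole horizon can
    # be predicted in one call.
    preds = model.predict(X_test, verbose=0)            # shape (n_samples, 1), scaled
    preds = scaler.inverse_transform(preds)             # back to original units
    truth = scaler.inverse_transform(y_test.reshape(-1, 1))
    return preds, truth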

Rolling Prediction#

rolling_prediction(cnn, scaler)
1/1 [==============================] - 0s 15ms/step
[... one predict call per rolling forecast step; 24 steps total, 11-21 ms each ...]
No artists with labels found to put in legend.  Note that artists whose label start with an underscore are ignored when legend() is called with no argument.
../_images/eefe9a0a31f1943f4966731158adea640bca69cfa447f5da94ab41dfe39513ae.png
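
The rolling forecast, by contrast, calls the model once per step and feeds each prediction back into the input window, which is why the cell output above shows a separate predict call for every forecast step. A minimal sketch of that loop, with last_window as a placeholder for the final observed window of scaled values:

import numpy as np

def rolling_sketch(model, scaler, last_window, n_steps):
    window = last_window.copy()                      # shape (look_back, 1), scaled
    forecasts = []
    for _ in range(n_steps):
        pred = model.predict(window[np.newaxis, ...], verbose=0)[0, 0]
        forecasts.append(pred)
        # Slide the window: drop the oldest value, append the new prediction.
        window = np.vstack([window[1:], [[pred]]])
    return scaler.inverse_transform(np.array(forecasts).reshape(-1, 1))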

Transformer Model Building and Training#

# Transformer Encoder Block
def transformer_encoder(inputs, head_size, num_heads, ff_dim, dropout=0):
    # Self-attention sub-block with a residual connection
    # (LayerNormalization is left disabled in this configuration)
    # x = LayerNormalization()(inputs)
    x = MultiHeadAttention(key_dim=head_size, num_heads=num_heads, dropout=dropout)(
        inputs, inputs
    )
    x = Dropout(dropout)(x)
    res = x + inputs
    # res = LayerNormalization()(res)

    # Position-wise feed-forward sub-block; the output is projected back to the
    # input dimension and added onto the residual branch
    # x = LayerNormalization()(res)
    x = Dense(ff_dim, activation="relu")(x)
    x = Dropout(dropout)(x)
    x = Dense(inputs.shape[-1])(x)
    res = x + res
    # res = LayerNormalization()(res)
    return res
# Model parameters
ff_dim = 64  # Hidden layer size in feed forward network inside transformer
num_heads = 4  # Number of attention heads
dropout_rate = 0.2  # Dropout rate
head_size = 64  # Attention head size

# Model building
inputs = Input(shape=(look_back, 1))
x = inputs

# Transformer layers
x = transformer_encoder(x, head_size, num_heads, ff_dim, dropout_rate)

# "Global pooling" and output head: with data_format="channels_first" and a
# single feature channel, the pooling layer simply squeezes the encoder output
# to shape (batch, look_back) before the dense regression layers
x = GlobalAveragePooling1D(data_format="channels_first")(x)
x = Dropout(dropout_rate)(x)
x = Dense(50, activation='relu', kernel_regularizer=l2(l2_reg))(x)
x = Dense(1, activation='relu', kernel_regularizer=l2(l2_reg))(x)



# Build the model
transformer = tf.keras.Model(inputs=inputs, outputs=x)
network_training(transformer, "Transformer")
Epoch 1/500
3/3 [==============================] - 1s 66ms/step - loss: 0.1194 - val_loss: 0.3215 - lr: 0.0010
Epoch 2/500
3/3 [==============================] - 0s 16ms/step - loss: 0.1187 - val_loss: 0.3209 - lr: 0.0010
[... epochs 3-44 omitted: loss falls from 0.1181 to roughly 0.02 and val_loss from 0.3203 to roughly 0.024 at lr 0.0010 ...]
Epoch 45/500
Epoch 45: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.
3/3 [==============================] - 0s 14ms/step - loss: 0.0203 - val_loss: 0.0263 - lr: 0.0010
[... epochs 46-54 omitted: loss and val_loss plateau around 0.021 and 0.025 ...]
Epoch 55/500
Epoch 55: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.
3/3 [==============================] - 0s 15ms/step - loss: 0.0218 - val_loss: 0.0245 - lr: 1.0000e-04
Epoch 55: early stopping
../_images/fc8fb18d8ca8be473f61fe8950511bdd42973f4c3a8a020edce478a09afcd7cf.png
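
One property of the transformer_encoder block above is worth making explicit: because both the attention and the feed-forward sub-blocks end in a residual addition, the block must return a tensor with the same shape as its input. A quick, purely illustrative check (not part of the original notebook):

import tensorflow as tf

dummy = tf.zeros((1, look_back, 1))        # (batch, time steps, features)
out = transformer_encoder(dummy, head_size=64, num_heads=4, ff_dim=64, dropout=0.2)
print(dummy.shape, out.shape)              # both should be (1, look_back, 1)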

Non-Rolling Prediction#

non_rolling_prediction(transformer, scaler)
1/1 [==============================] - 0s 75ms/step
../_images/8d82e19f3d0751082d61dda4160506be069b2f10434796a946448849f342d5f2.png

Rolling Prediction#

rolling_prediction(transformer, scaler)
1/1 [==============================] - 0s 12ms/step
[... one predict call per rolling forecast step; 24 steps total, 11-16 ms each ...]
No artists with labels found to put in legend.  Note that artists whose label start with an underscore are ignored when legend() is called with no argument.
../_images/59612c8236498e16dc91c5a2b751ffa3cdbca84bd058818eb88d7eed790d52d9.png
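
The plots above support a visual comparison of the two prediction strategies across models; to make the comparison quantitative, the same scaled test windows can be scored with an RMSE per model. A minimal sketch, reusing the mean_squared_error import from the top of the notebook and the placeholder X_test / y_test names from the non-rolling sketch earlier:

import numpy as np
from sklearn.metrics import mean_squared_error

def rmse_on_test(model, scaler, X_test, y_test):
    preds = scaler.inverse_transform(model.predict(X_test, verbose=0))
    truth = scaler.inverse_transform(y_test.reshape(-1, 1))
    return np.sqrt(mean_squared_error(truth, preds))

# The CNN and Transformer built above; the other models trained earlier in the
# notebook can be added to this list in the same way.
for name, model in [("CNN", cnn), ("Transformer", transformer)]:
    print(f"{name}: test RMSE = {rmse_on_test(model, scaler, X_test, y_test):.1f}")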