I decided to try an RNN for time-series analysis, so I implemented a simple one and applied it to time-series data. This post is largely an imitation of the following articles, so please refer to them as well: "Sine wave prediction using RNN in deep learning library Keras" and "Predicting sine waves with LSTM".
```python
from keras.models import Sequential
from keras.layers.core import Dense, Activation
from keras.layers.recurrent import LSTM
from keras.optimizers import Adam
from keras.callbacks import EarlyStopping
import numpy as np
import matplotlib.pyplot as plt
```
This time we build an LSTM, so LSTM is imported from recurrent. EarlyStopping is also imported to save resources such as training time.
First, a sine wave is generated.
```python
def sin(x, T=100):
    return np.sin(2.0 * np.pi * x / T)

# Add noise to the sine wave
def toy_problem(T=100, ampl=0.05):
    x = np.arange(0, 2 * T + 1)
    noise = ampl * np.random.uniform(low=-1.0, high=1.0, size=len(x))
    return sin(x) + noise

f = toy_problem()
```
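To sanity-check the data before training, it helps to plot it first (a quick sketch; this plotting step is not part of the original flow):

```python
# Quick look at the noisy sine wave: 201 points spanning two periods
plt.plot(f)
plt.show()
```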
As shown below, each training sample is a window of 25 consecutive steps of the wave, and its label is the value at the step immediately after the window.
```python
def make_dataset(low_data, n_prev=100):
    data, target = [], []
    maxlen = 25

    for i in range(len(low_data) - maxlen):
        data.append(low_data[i:i + maxlen])
        target.append(low_data[i + maxlen])

    re_data = np.array(data).reshape(len(data), maxlen, 1)
    re_target = np.array(target).reshape(len(data), 1)

    return re_data, re_target

# g -> training data, h -> training labels
g, h = make_dataset(f)
```
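A quick shape check (these numbers follow from the 201-point wave generated above) confirms the layout Keras expects, (samples, timesteps, features):

```python
print(g.shape)  # (176, 25, 1) -> 176 windows of 25 steps, 1 feature each
print(h.shape)  # (176, 1)     -> one label per window
```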
Create a simple LSTM model. For an accessible explanation of the LSTM concept, see [Overview of LSTM network](http://qiita.com/KojiOhki/items/89cd7b69a8a6239d67ca#lstm network).
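For reference, the standard LSTM cell described in that overview (the textbook formulation, not anything specific to this code) updates its state as:

```math
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
```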
```python
# Model building
# Number of steps in one training sample (here, 25)
length_of_sequence = g.shape[1]
in_out_neurons = 1
n_hidden = 300

model = Sequential()
model.add(LSTM(n_hidden, batch_input_shape=(None, length_of_sequence, in_out_neurons), return_sequences=False))
model.add(Dense(in_out_neurons))
model.add(Activation("linear"))
optimizer = Adam(lr=0.001)
model.compile(loss="mean_squared_error", optimizer=optimizer)
```
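To verify the architecture, `model.summary()` prints the layer shapes and parameter counts; the counts below follow from the standard LSTM parameter formula, 4·n_hidden·(n_hidden + input_dim + 1):

```python
model.summary()
# LSTM:  4 * 300 * (300 + 1 + 1) = 362,400 parameters
# Dense: 300 + 1                 = 301 parameters
```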
Training is performed using the generated data and the model defined above. This time, 10% of the training data is held out for validation and training runs for up to 100 epochs. Passing early_stopping (defined in the first line) via callbacks terminates training automatically once the validation loss (val_loss) is judged to have converged; setting mode to auto lets Keras infer the direction of improvement. patience is the number of extra epochs to wait after the monitored value stops improving before stopping, so with patience = 0, training would end the moment val_loss rises.
```python
early_stopping = EarlyStopping(monitor='val_loss', mode='auto', patience=20)

model.fit(g, h,
          batch_size=300,
          epochs=100,
          validation_split=0.1,
          callbacks=[early_stopping]
          )
```
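If you want to inspect the convergence yourself, fit() returns a History object (a minimal sketch; capturing the return value is an addition to the code above):

```python
history = model.fit(g, h, batch_size=300, epochs=100,
                    validation_split=0.1, callbacks=[early_stopping])

# Plot training vs. validation loss per epoch
plt.plot(history.history['loss'], label='loss')
plt.plot(history.history['val_loss'], label='val_loss')
plt.legend()
plt.show()
```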
Predict on the training data and check whether the sine wave is reproduced.
```python
# Forecast
predicted = model.predict(g)
```
predicted now holds the model's one-step predictions of the sine wave from t = 25 onward (each 25-step window predicts the next point). Let's actually plot it.
```python
plt.figure()
plt.plot(range(25, len(predicted) + 25), predicted, color="r", label="predict_data")
plt.plot(range(0, len(f)), f, color="b", label="raw_data")
plt.legend()
plt.show()
```
As the plot shows, the sine wave is predicted with almost no influence from the noise.
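To put a number on "almost no influence", you can compare the one-step predictions with their labels (a small addition; the original post judges this visually):

```python
# Mean squared error between one-step predictions and their labels
mse = np.mean((predicted - h) ** 2)
print(mse)
```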
Next, using the trained model, predict the coordinates of the sine wave at times beyond the training data by feeding the model its own outputs.
```python
# Length of one training window -> 25
# (assumption: the original seed for future_test is not shown here,
#  so we seed it with the last 25 observed points of f)
future_test = f[-25:].reshape(1, 25)
time_length = future_test.shape[1]

# Variable that accumulates the future predictions
future_result = np.empty(0)

# Future forecast: repeatedly predict one step, then slide the window
for step2 in range(400):
    test_data = np.reshape(future_test, (1, time_length, 1))
    batch_predict = model.predict(test_data)

    # Drop the oldest point and append the new prediction to the window
    future_test = np.delete(future_test, 0)
    future_test = np.append(future_test, batch_predict)

    future_result = np.append(future_result, batch_predict)
```
```python
# Plot the observed wave, the one-step predictions, and the future forecast
plt.figure()
plt.plot(range(25, len(predicted) + 25), predicted, color="r", label="predict_data")
plt.plot(range(0, len(f)), f, color="b", label="raw_data")
plt.plot(range(0 + len(f), len(future_result) + len(f)), future_result, color="g", label="future_predict")
plt.legend()
plt.show()
```
The amplitude shrinks little by little... Perhaps a window of 25 steps per training sample is too short? Next time, I will try a window of about 40 steps.
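As a sketch of that experiment, make_dataset can be parameterized by window length (this variant is my suggestion, not code from the original post; 40 is just the value floated above):

```python
def make_dataset(low_data, maxlen=40):
    data, target = [], []
    for i in range(len(low_data) - maxlen):
        data.append(low_data[i:i + maxlen])
        target.append(low_data[i + maxlen])
    re_data = np.array(data).reshape(len(data), maxlen, 1)
    re_target = np.array(target).reshape(len(data), 1)
    return re_data, re_target

g, h = make_dataset(f)  # now 40-step windows
```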
Source Code https://github.com/sasayabaku/Machine-Learning/blob/master/Example_RNN/SineWave_Prediction.ipynb