Learning elliptical orbits with a feedforward neural network (FNN) using Chainer. The test is done in open loop. I will work through this problem while explaining how to use Chainer.
[problem] Learn the elliptical orbit x = 0.8cos(θ), y = 0.8sin(θ).
[approach] With x(θn) = (0.8cos(θn), 0.8sin(θn)), 0 ≤ θn ≤ 2π, design an FNN that predicts x(θn+1) from x(θn). Train this FNN on training data, then test and confirm the results on test data.
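Since both radii are 0.8, the curve is in fact a circle, so the map from x(θn) to x(θn+1) that the FNN must approximate is just a rotation by the angle step. A quick numpy check of this (not part of the original post), using the training-data spacing Δθ = 2π/20:

```python
import numpy as np

dtheta = 2.0 * np.pi / 20  # step between consecutive training angles
# Rotation matrix R(dtheta): x(theta + dtheta) = R @ x(theta)
R = np.array([[np.cos(dtheta), -np.sin(dtheta)],
              [np.sin(dtheta),  np.cos(dtheta)]])

theta = 0.7  # any angle on the orbit
x_now  = 0.8 * np.array([np.cos(theta), np.sin(theta)])
x_next = 0.8 * np.array([np.cos(theta + dtheta), np.sin(theta + dtheta)])

print(np.allclose(R @ x_now, x_next))  # True
```

So the target function is linear, which is why even a tiny network can eventually fit it.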
The starting point may be any point on the ellipse.
[Training data] Use xtrain(θn) = (0.8cos(θn), 0.8sin(θn)) with θn = 2πn/20, 0 ≤ n ≤ 20 (the code samples n at sample_no evenly spaced values). Write it as follows.
a=np.linspace(0,20,sample_no)
xtrain=np.zeros((sample_no,input_no),dtype=np.float32)
xtrain[:,0]=0.8*np.cos(2.0*np.pi*a/20)
xtrain[:,1]=0.8*np.sin(2.0*np.pi*a/20)
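As a quick sanity check (not in the original post), every training point should lie on the circle of radius 0.8:

```python
import numpy as np

sample_no, input_no = 100, 2
a = np.linspace(0, 20, sample_no)
xtrain = np.zeros((sample_no, input_no), dtype=np.float32)
xtrain[:, 0] = 0.8 * np.cos(2.0 * np.pi * a / 20)
xtrain[:, 1] = 0.8 * np.sin(2.0 * np.pi * a / 20)

# Every point should satisfy x^2 + y^2 = 0.8^2
radii = np.sqrt(xtrain[:, 0]**2 + xtrain[:, 1]**2)
print(np.allclose(radii, 0.8, atol=1e-5))  # True
```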
[Test data] Similarly, use xtest(θn) = (0.8cos(θn), 0.8sin(θn)) with θn = 2πn/27, 0 ≤ n ≤ 27 (again sampled at sample_no evenly spaced values). Write it as follows.
a=np.linspace(0,27,sample_no)
xtest=np.zeros((sample_no,input_no),dtype=np.float32)
xtest[:,0]=0.8*np.cos(2.0*np.pi*a/27)
xtest[:,1]=0.8*np.sin(2.0*np.pi*a/27)
Set the parameters as follows.
Number of teacher data: sample_no
Number of training epochs: epoch
Layers: input layer input_no = 2 (fixed), middle layer hidden_no, output layer output_no = 2 (fixed)
Batch size: bs
Register Linear links under the names l1 and l2.
class FNN(Chain):
    def __init__(self): # prepare the connections
        super(FNN,self).__init__(
            l1=L.Linear(input_no,hidden_no),
            l2=L.Linear(hidden_no,output_no),
        )
Note that writing l1=L.Linear(input_no,hidden_no), l2=L.Linear(hidden_no,output_no) is equivalent to writing self.add_link("l1",F.Linear(input_no,hidden_no)) and self.add_link("l2",F.Linear(hidden_no,output_no)).
A registered Link is called like a function. Its argument is, in principle, a Variable.
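For intuition, a Linear link applies the affine map y = xWᵀ + b to whatever it is called with. A minimal numpy sketch of that map, with hypothetical random weights standing in for the link's learned parameters:

```python
import numpy as np

input_no, hidden_no = 2, 2
rng = np.random.default_rng(0)
# Hypothetical weights; in Chainer these live inside the registered link
W = rng.standard_normal((hidden_no, input_no)).astype(np.float32)
b = np.zeros(hidden_no, dtype=np.float32)

def linear(x, W, b):
    # The affine map a Linear link applies: y = x @ W.T + b
    return x @ W.T + b

x = np.array([[0.8, 0.0]], dtype=np.float32)  # one point on the orbit
print(linear(x, W, b).shape)  # (1, 2)
```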
# inside class FNN
    def fwd(self,x):
        h1=F.tanh(self.l1(x))
        h2=F.tanh(self.l2(h1))
        return h2
    def get_predata(self,x):
        return self.fwd(Variable(x.astype(np.float32).reshape(sample_no,input_no))).data
The loss function is called from chainer.functions. This time we use the mean squared error.
# inside class FNN
    def __call__(self,x,y): # loss function
        return F.mean_squared_error(self.fwd(x),y)
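F.mean_squared_error averages the squared element-wise differences between prediction and target. A numpy sketch of the same quantity:

```python
import numpy as np

def mse(pred, target):
    # Mean of the squared element-wise differences,
    # the quantity F.mean_squared_error computes
    return np.mean((pred - target) ** 2)

pred   = np.array([[0.8, 0.0], [0.0, 0.8]])
target = np.array([[0.8, 0.5], [0.0, 0.8]])
print(mse(pred, target))  # 0.0625
```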
Create an optimizer and set it up with the FNN model. There are various types, such as SGD, Adam, and RMSprop.
model=FNN()
optimizer=optimizers.SGD()
optimizer.setup(model)
Train with the next data point (xtrain_n+1) as the correct answer for the current data point (xtrain_n). Repeat gradient initialization, backpropagation, and parameter update: model.zerograds(), loss.backward(), optimizer.update().
for i in range(epoch):
    for j in range(sample_no): # target is one step ahead
        if (j+1<sample_no):
            ytrain[j]=np.array(xtrain[j+1])
        else:
            ytrain[j]=np.array(xtrain[0])
    model.zerograds()
    loss=model(xtrain,ytrain)
    loss.backward()
    optimizer.update()
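Since xtrain never changes, the inner j-loop just pairs each point with its successor, wrapping the last point around to the first. The same targets can be built once before the loop with np.roll (a possible simplification, not the original code):

```python
import numpy as np

sample_no, input_no = 100, 2
a = np.linspace(0, 20, sample_no)
xtrain = np.zeros((sample_no, input_no), dtype=np.float32)
xtrain[:, 0] = 0.8 * np.cos(2.0 * np.pi * a / 20)
xtrain[:, 1] = 0.8 * np.sin(2.0 * np.pi * a / 20)

# Shift every row up by one; the last row wraps around to xtrain[0]
ytrain = np.roll(xtrain, -1, axis=0)

print(np.array_equal(ytrain[0], xtrain[1]))    # True
print(np.array_equal(ytrain[-1], xtrain[0]))   # True
```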
Since the test is done in open loop, the network reads a test data point each time and predicts the next point. The test data is simply fed forward to produce the output, so just write:
yout=model.get_predata(xtest)
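For contrast, a closed-loop test would feed each prediction back in as the next input instead of reading fresh test data. A sketch of that loop, with a hypothetical stand-in predictor (a pure rotation by 2π/27, the ideal one-step map at the stated test spacing, used here so the sketch runs without the trained model):

```python
import numpy as np

# Stand-in for model.get_predata: the ideal one-step map,
# a rotation by dtheta = 2*pi/27 (hypothetical substitute for the FNN)
dtheta = 2.0 * np.pi / 27
R = np.array([[np.cos(dtheta), -np.sin(dtheta)],
              [np.sin(dtheta),  np.cos(dtheta)]])

def predict(x):
    return R @ x

x = np.array([0.8, 0.0])   # start anywhere on the orbit
trajectory = [x]
for _ in range(27):        # closed loop: feed each prediction back in
    x = predict(x)
    trajectory.append(x)
trajectory = np.array(trajectory)

# 27 steps of 2*pi/27 bring the point back to the start
print(np.allclose(trajectory[-1], trajectory[0]))  # True
```

With the real FNN in place of predict, prediction errors would accumulate around the loop, which is exactly what the open-loop test avoids.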
The teacher data (the target ellipse) is drawn in blue, and the test results in red.
After 100 epochs: still far off.
After 5,000 epochs: getting closer to an ellipse.
After 5,000,000 epochs: almost the correct answer.
ellipse.py
#-*- coding:utf-8 -*-
import numpy as np
import chainer
from chainer import cuda,Function,gradient_check,Variable,optimizers,serializers,utils
from chainer import Link,Chain,ChainList
import chainer.functions as F
import chainer.links as L
import matplotlib.pyplot as plt

# Number of teacher data
sample_no=100
# Number of training epochs
epoch=500000
# Layer sizes
input_no=2
hidden_no=2
output_no=2

# Teacher data creation
a=np.linspace(0,20,sample_no)
xtrain=np.zeros((sample_no,input_no),dtype=np.float32)
xtrain[:,0]=0.8*np.cos(2.0*np.pi*a/20)
xtrain[:,1]=0.8*np.sin(2.0*np.pi*a/20)

# Model definition
class FNN(Chain):
    def __init__(self): # prepare the connections
        super(FNN,self).__init__(
            l1=L.Linear(input_no,hidden_no),
            l2=L.Linear(hidden_no,output_no),
        )
    def __call__(self,x,y): # loss function
        return F.mean_squared_error(self.fwd(x),y)
    def fwd(self,x):
        h1=F.tanh(self.l1(x))
        h2=F.tanh(self.l2(h1))
        return h2
    def get_predata(self,x):
        return self.fwd(Variable(x.astype(np.float32).reshape(sample_no,input_no))).data

# Optimizer setup
model=FNN()
optimizer=optimizers.SGD()
optimizer.setup(model)

# Stores the correct answers (targets) for training
ytrain=np.zeros((sample_no,input_no),dtype=np.float32)
# Batch size
bs=25

# Training
for i in range(epoch):
    for j in range(sample_no): # target is one step ahead
        if (j+1<sample_no):
            ytrain[j]=np.array(xtrain[j+1])
        else:
            ytrain[j]=np.array(xtrain[0])
    model.zerograds()
    loss=model(xtrain,ytrain)
    loss.backward()
    optimizer.update()

# Test (open loop)
a=np.linspace(0,27,sample_no)
xtest=np.zeros((sample_no,input_no),dtype=np.float32)
xtest[:,0]=0.8*np.cos(2.0*np.pi*a/27)
xtest[:,1]=0.8*np.sin(2.0*np.pi*a/27)
yout=model.get_predata(xtest)
print(yout)

# Drawing
plt.plot(yout[:,0],yout[:,1],"r",label="test output")      # test results in red
plt.plot(xtrain[:,0],xtrain[:,1],"b",label="teacher data") # teacher data in blue
plt.legend()
plt.show()