I don't have a GPU, but I'll try deep learning

Preface

The idea is that you can try a deep learning tutorial without building a complicated environment: just copy the code in this article (or rather, just clone it from GitHub and run it).

Environment: macOS Mojave 10.14.6, Python 3.6.5

--Target readers As the title suggests, this article is intended for people who do not have a GPU but want to try deep learning. If you want to do deep learning in earnest you will still need a GPU, but I think the best way to get an intuitive feel for what it can do and how it works is to actually run it. I assume that your Python environment is already set up.

--Definition of deep learning The definition differs slightly depending on the source, but models with roughly three or more hidden layers are often what people call deep learning. (Sorry for being vague.) This article also deals with models that have fewer hidden layers, but I won't make the distinction; I will simply call the algorithm a neural network and the training of its weights deep learning.

--Evaluation function and loss function The R^2 score is used as the evaluation function, and MSE is used as the loss function.

Simply put, the R^2 score expresses, on a scale from 0 to 1, how close the regressed (predicted) curve is to the correct curve. MSE stands for mean squared error: the average of the squared differences between the correct values and the predicted values.
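As a quick illustration, here is a minimal sketch computing both metrics with scikit-learn (the toy values are made up for this example and are not taken from the article's dataset):

# Toy example of the two metrics used in this article (made-up values)
from sklearn.metrics import mean_squared_error, r2_score

y_true = [0.0, 0.5, 1.0, 0.5]   # correct values
y_pred = [0.1, 0.4, 0.9, 0.6]   # predicted values

print(mean_squared_error(y_true, y_pred))  # 0.01: average of the squared differences
print(r2_score(y_true, y_pred))            # 0.92: the closer to 1, the better the fit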

Problem setting

Predict the Sin and Cos curves from the two input values (x0, x1)

The image below shows the idea. We will check how the accuracy changes with the number of hidden layers and the number of units in each layer.

(Figure: overview of the problem setting, Screen Shot 2019-12-10 at 11.31.12.png)

Environment

#For simplicity, please follow the steps below to clone the repository I worked on.

$ git clone https://github.com/keroido/DNN-learning-Sin-Cos-wave.git
$ cd DNN-learning-Sin-Cos-wave

#Create a virtual environment and enter it. (optional)
$ pip install virtualenv
$ virtualenv venv
$ . venv/bin/activate

#Install all the libraries required for the virtual environment.
(venv)$ pip install -r requirements.txt

#When leaving the virtual environment
(venv)$ deactivate

Create training data

Follow the steps below to create the dataset. The script generates 1000 rows of data with four columns: the two input values x0 and x1, and the Sin and Cos of their sum (x0 + x1).
We will predict Sin and Cos from x0 and x1.

An example of the generated data:

index x0 x1 Sin Cos
0 50.199163279521 17.5983756102216 0.925854354002364 0.377880556756848
1 127.726947420807 116.093208916234 -0.897413633456196 -0.441190174966475
2 54.2208002632216 116.589734921833 0.159699676625697 -0.987165646325705
3 156.256738791155 8.64049515860479 0.260551118156132 -0.965460053460312
: ... ... ... ...
: ... ... ... ...
999 23.2978504439148 109.826906405408 0.72986697370653 -0.683589204634239

(0 <= x0, x1 <= 180)
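As a quick sanity check (this snippet is not part of the repository), the first row of the table above can be reproduced by hand:

# Reproduce row 0 of the table above: Sin and Cos of (x0 + x1), treated as degrees
import math

x0, x1 = 50.199163279521, 17.5983756102216
print(math.sin(math.radians(x0 + x1)))  # approx. 0.9259, matching the Sin column
print(math.cos(math.radians(x0 + x1)))  # approx. 0.3779, matching the Cos column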


Run the dataset-generation program from the following directory. Also create an input directory here to store the training data and an output directory to store the results.

#Check the current directory.
$ pwd
 [out]: .../DNN-learning-Sin-Cos-wave/code

#Create a place for input data and a place for output data.
$ mkdir ../input ../output

#Run the program that generates the dataset
$ python make_dataset.py
# make_dataset.py

import numpy as np
import pandas as pd
import math

#Generate 1000 random samples for x0 and x1 in the range [0, 180).
x0 = np.random.rand(1000) * 180
x1 = np.random.rand(1000) * 180
#Sin and Cos of the sum (x0 + x1), treated as degrees.
s = [math.sin(math.radians(a + b)) for a, b in zip(x0, x1)]
c = [math.cos(math.radians(a + b)) for a, b in zip(x0, x1)]

df = pd.DataFrame({'x0': x0, 'x1': x1, 'sin': s, 'cos': c})
df.to_csv('../input/data.csv')

Then data.csv will be generated in the input directory.
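If you want to confirm that the file was written correctly, a small check like the following (a hypothetical snippet, not included in the repository) will do:

# Quick sanity check of the generated dataset
import pandas as pd

df = pd.read_csv('../input/data.csv')
print(df.shape)   # (1000, 5): the saved index column plus x0, x1, sin, cos
print(df.head())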

Try deep learning

Now let's try deep learning. Since the theme of this article is deep learning without a GPU, we will implement it with scikit-learn.
train.py not only trains the models but also evaluates each of them at the same time.

$ pwd
 [out]: .../DNN-learning-Sin-Cos-wave/code

$ python train.py
# train.py
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split #Splits the data into training and test sets.
from sklearn.neural_network import MLPRegressor #A neural network (multi-layer perceptron) regressor that runs on scikit-learn.
from sklearn.metrics import mean_squared_error # MSE (mean squared error)

#Reads the data in the input directory.
df = pd.read_csv('../input/data.csv')
df = df.drop('Unnamed: 0', axis=1)

#X holds x0 and x1; y holds Sin and Cos
X = df.iloc[:, :2]
y = df.iloc[:, 2:]

#Divide into a training set and a test set.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

#Try every combination of 2, 3, or 4 hidden layers with 10, 50, 100, 150, or 200 units per layer.
hidden_layer_sizes = [(10, 10,), (50, 50,), (100, 100,), (150, 150,), (200, 200,),
                      (10, 10, 10,), (50, 50, 50,), (100, 100, 100,), (150, 150, 150,), (200, 200, 200,),
                      (10, 10, 10, 10,), (50, 50, 50, 50,), (100, 100, 100, 100,), (150, 150, 150, 150,), (200, 200, 200, 200,)]
ln = list(range(len(hidden_layer_sizes))) 

#Create a data frame to record the MSE for Sin and Cos and the R^2 score.
score_df = pd.DataFrame(columns=['sin_mse', 'cos_mse', 'r^2_score'])

for i, hidden_layer_size in zip(ln, hidden_layer_sizes):
    #Model details: https://scikit-learn.org/stable/modules/generated/sklearn.neural_network.MLPRegressor.html
    #Set verbose=True to watch the training progress.
    model = MLPRegressor(activation='relu', alpha=0, batch_size=100,
                         hidden_layer_sizes=hidden_layer_size, learning_rate_init=0.03,
                         random_state=0, verbose=False, solver='adam')
    #Feed the model a training dataset.
    model.fit(X_train, y_train)
    #Predict Sin and Cos from x0 and x1 of the test set.
    pred = model.predict(X_test)
    #Below, format the results into a data frame and write it to the output directory.
    pred = pd.DataFrame(pred)
    x = pd.DataFrame({'x':(X_test['x0'] + X_test['x1']).tolist()})
    tes = y_test.rename(columns={'sin': 'sin(label)', 'cos': 'cos(label)'}).reset_index(drop=True)
    pre = pred.rename(columns={0: 'sin(prediction)', 1: 'cos(prediction)'}).reset_index(drop=True)
    ans_df = pd.concat([x, tes, pre], axis=1)
    ans_df = ans_df[['x', 'sin(label)', 'sin(prediction)', 'cos(label)', 'cos(prediction)']]
    ans_df.to_csv('../output/result_{}_{}_{}.csv'.format(str(i).zfill(2), len(hidden_layer_size), hidden_layer_size[0]))
    sin_mse = mean_squared_error(tes['sin(label)'].tolist(), pre['sin(prediction)'].tolist())
    cos_mse = mean_squared_error(tes['cos(label)'].tolist(), pre['cos(prediction)'].tolist())
    r2 = model.score(X_test, y_test)
    score_df.loc['{}'.format(i), 'sin_mse'] = sin_mse
    score_df.loc['{}'.format(i), 'cos_mse'] = cos_mse
    score_df.loc['{}'.format(i), 'r^2_score'] = r2

col = ['sin_mse', 'cos_mse', 'r^2_score']
score_df = score_df[col]
#Output to the output directory
score_df.to_csv('../output/score.csv')

Commentary

(Sorry for the dirty code.) Look at hidden_layer_sizes. Here the number of hidden layers and the number of units per layer are varied as follows, so that we can experiment with which combination gives good accuracy (see also the sketch after the table below).

Number of layers: 2, 3, 4
Number of units per layer: 10, 50, 100, 150, 200
(all 15 combinations are tried)
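As a side note, the hardcoded hidden_layer_sizes list in train.py could also be generated programmatically. Here is a minimal sketch using itertools.product (not part of the repository) that produces the same 15 tuples in the same order:

# Build the 15 hidden_layer_sizes tuples instead of hardcoding them
from itertools import product

layers = [2, 3, 4]
units = [10, 50, 100, 150, 200]
hidden_layer_sizes = [(u,) * n for n, u in product(layers, units)]
print(hidden_layer_sizes[:3])  # [(10, 10), (50, 50), (100, 100)]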

Evaluate and visualize

Jupyter notebook is convenient for visualization, so let's use it.

$ pip install jupyter

$ pwd
 [out]: .../DNN-learning-Sin-Cos-wave/code

$ ls
 [out]: make_dataset.py train.py viewer.ipynb

$ jupyter notebook

When Jupyter Notebook opens in your browser, open viewer.ipynb. This notebook lets you evaluate and visualize the data simply by running its cells from the top. https://github.com/keroido/DNN-learning-Sin-Cos-wave/blob/master/code/viewer.ipynb The following is executed on Jupyter. (Code that needs no explanation is omitted.)

Looking at the output directory, there are 15 csv files with names like 'result_00_2_10.csv'. Taking 'result_00_2_10.csv' as an example, 00 is the index, 2 is the number of hidden layers, and 10 is the number of units per layer. So this csv file holds the results of the 0th neural network, which has 2 hidden layers of 10 units each. (A small helper for unpacking these names is sketched after the listing below.)

!ls ../output

[out]:
result_00_2_10.csv  result_04_2_200.csv result_08_3_150.csv result_12_4_100.csv
result_01_2_50.csv  result_05_3_10.csv  result_09_3_200.csv result_13_4_150.csv
result_02_2_100.csv result_06_3_50.csv  result_10_4_10.csv  result_14_4_200.csv
result_03_2_150.csv result_07_3_100.csv result_11_4_50.csv  score.csv
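As an illustration of this naming convention, a small hypothetical helper (not in the repository) could unpack a file name like this:

# Hypothetical helper: 'result_<index>_<layers>_<units>.csv' -> (index, layers, units)
def parse_result_name(filename):
    stem = filename.replace('result_', '').replace('.csv', '')
    index, layers, units = stem.split('_')
    return int(index), int(layers), int(units)

print(parse_result_name('result_09_3_200.csv'))  # (9, 3, 200)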

1. Check the score of each neural network

import pandas as pd

score_df = pd.read_csv('../output/score.csv')
score_df = score_df.drop('Unnamed: 0', axis=1)
score_df

Let's check under which conditions the R^2 score is good. Looking at the results, the 9th model, the 3-layer, 200-unit network of result_09_3_200.csv, gives the best result (this may change depending on the settings). You can see that simply making the network deeper is not enough. (A sketch for picking out the best row programmatically follows the table.)

index sin_mse cos_mse r^2_score
0 0.118307 0.272191 0.551913
1 0.071344 0.174416 0.717997
2 0.101467 0.269444 0.574389
3 0.053282 0.022353 0.913211
4 0.374317 0.242327 0.292416
5 0.127534 0.274327 0.538875
6 0.061558 0.163282 0.742001
7 0.195692 0.262261 0.474512
8 0.034099 0.010542 0.948776
9 0.006197 0.004922 0.987241
10 0.512035 0.361053 -0.001846
11 0.116843 0.099484 0.751770
12 0.013951 0.029560 0.950072
13 0.009213 0.009595 0.978419
14 0.005862 0.006255 0.986096
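Rather than scanning the table by eye, a short sketch (assuming the score_df loaded above) can pick out the best configuration:

# Find the row with the highest R^2 score in score_df loaded above
best = score_df['r^2_score'].astype(float).idxmax()
print(best)                # index of the best configuration (9 in this run)
print(score_df.loc[best])  # its sin_mse, cos_mse and r^2_score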

2. Check the csv file that gave the best score

tmp = pd.read_csv('../output/result_09_3_200.csv')
tmp = tmp.drop('Unnamed: 0', axis=1)
tmp

(label) is the correct label and (prediction) is the value predicted by the neural network. You can see that the predictions are relatively close to the labels. (A quick way to put a number on this follows the table.)

x sin(label) sin(prediction) cos(label) cos(prediction)
0 271.800382 -0.999506 -0.912688 0.031417
1 133.334658 0.727358 0.722477 -0.686258
2 136.451163 0.688973 0.656727 -0.724787
3 187.429195 -0.129301 -0.182335 -0.991605
4 229.748855 -0.763220 -0.801409 -0.646139
... ... ... ... ...
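To put a rough number on "relatively close", a quick sketch using the tmp data frame loaded above is to look at the mean absolute difference between label and prediction:

# Mean absolute difference between label and prediction for the best model
print((tmp['sin(label)'] - tmp['sin(prediction)']).abs().mean())
print((tmp['cos(label)'] - tmp['cos(prediction)']).abs().mean())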

3. Actually visualize it

import glob

#Collect the 15 result csv files and group them into lists of five (one list per layer count).
files = glob.glob('../output/result*.csv')
files.sort()
csvs = []
t = []
for i in range(1, 16):
    t.append(files[i-1])
    if i%5 == 0:
        csvs.append(t)
        t = []

Sin

import matplotlib.pyplot as plt

fig, axes = plt.subplots(3, 5, figsize=(15, 10))
fig.subplots_adjust(hspace=0.3, wspace=0.3)
for i in range(3):
    for j in range(5):
        tmp = pd.read_csv(csvs[i][j])
        axes[i, j].scatter(tmp.loc[:, 'x'], tmp.loc[:, 'sin(label)'], c='b')
        axes[i, j].scatter(tmp.loc[:, 'x'], tmp.loc[:, 'sin(prediction)'], c='r', alpha=0.5)
        axes[i, j].set_title('layer:{}, unit:{}'.format(csvs[i][j][20], csvs[i][j][22:-4]))
        axes[i, j].set_xlim(-5, 365)
(Figure: Sin labels vs. predictions for each model, Screen Shot 2019-12-10 at 10.49.03.png)

Cos

fig, axes = plt.subplots(3, 5, figsize=(15, 10))
fig.subplots_adjust(hspace=0.3, wspace=0.3)
for i in range(3):
    for j in range(5):
        tmp = pd.read_csv(csvs[i][j])
        axes[i, j].scatter(tmp.loc[:, 'x'], tmp.loc[:, 'cos(label)'], c='b')
        axes[i, j].scatter(tmp.loc[:, 'x'], tmp.loc[:, 'cos(prediction)'], c='r', alpha=0.5)
        axes[i, j].set_title('layer:{}, unit:{}'.format(csvs[i][j][20], csvs[i][j][22:-4]))
        axes[i, j].set_xlim(-5, 365)
(Figure: Cos labels vs. predictions for each model, Screen Shot 2019-12-10 at 10.49.20.png)

Visualizing the results makes it fun to see at a glance how each neural network made its predictions. That's all for "I don't have a GPU, but I'll try deep learning." Thanks for reading.
