[Part 4] Use Deep Learning to forecast the weather from weather images

Last time

[Part 1] Use Deep Learning to forecast the weather from weather images
[Part 2] Use Deep Learning to forecast the weather from weather images
[Part 3] Use Deep Learning to forecast the weather from weather images

To summarize only the final result:

Increasing the number of epochs from 50 to 100 gave an average AUC of 0.724, and the table below was the best result at that point.

weather   precision  recall  f1-score
rain      0.53       0.77    0.63
fine      0.86       0.68    0.76
average   0.76       0.71    0.72

At the very least, the number of epochs had been too small. However, changing the images, increasing the training data, and switching to three-class classification did not improve the results.

Trial and error

After a lot of trial and error, trying this and that, I could not find anything that exceeded this AUC...

To summarize only the results:

- Raising the number of epochs to 1000 lowered accuracy. This is overfitting: accuracy on the training data rises to 0.95, but the results on the test data are poor.
- Increasing the number of hidden layers did not raise accuracy; if anything, it lowered it.
- The hidden layer currently has 500 nodes, but widening it to 1000 nodes did not improve accuracy either.
- Building three binary classifiers, one per class (a fine model, a rain model, and a cloudy model), also made accuracy drop.
- Increasing the number of convolution layers did not help. It is often said that deeper is better, but that did not work here either.

...and so on.
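For reference, one standard remedy for the overfitting seen at 1000 epochs is early stopping. The sketch below is illustrative only, not something used in these experiments: it scans a hypothetical validation-AUC history and stops once the score has not improved for `patience` epochs.

```python
def early_stopping(val_scores, patience=5):
    """Return the epoch index at which training should stop:
    stop as soon as the best validation score is `patience`
    or more epochs old."""
    best, best_epoch = float("-inf"), 0
    for epoch, score in enumerate(val_scores):
        if score > best:
            best, best_epoch = score, epoch
        elif epoch - best_epoch >= patience:
            return epoch  # no improvement for `patience` epochs
    return len(val_scores) - 1

# Example: validation AUC peaks early, then degrades (overfitting),
# so training stops at epoch 5 instead of running all 9 epochs.
scores = [0.60, 0.68, 0.72, 0.71, 0.70, 0.69, 0.68, 0.67, 0.66]
print(early_stopping(scores, patience=3))  # -> 5
```

With a rule like this, the epoch count can be set large and the stopping point is decided by the validation curve instead.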

Candidate improvements

After all that, the candidate improvements I arrived at are the following.

1. Change the scaling method.

Until now, the image data had been scaled with nothing more than

img_mat /= 255.

a very simple normalization. Scaling matters even in conventional machine learning, and training often fails to go well without proper scaling.

I changed this to standardize each RGB channel, using statistics computed on the training data:

for i in range(0, 3):  # per-channel standardization
	m = x_train_mat[:, i, :, :].mean()
	s = x_train_mat[:, i, :, :].std()
	x_train_mat[:, i, :, :] = (x_train_mat[:, i, :, :] - m) / s
	# the test data is standardized with the *training* mean and std
	x_test_mat[:, i, :, :] = (x_test_mat[:, i, :, :] - m) / s

As a result, the training curves changed dramatically: learning is much faster, and accuracy now improves within a small number of epochs.

2. Addition of batch processing

According to the papers I read, CNNs should use something like a pre-trained model, so I introduced a simple version of that idea. In normal training, the batch size splits the training data within one epoch; here, the training data fed into the model is itself split into chunks.

In other words:

training data A → tune model → training data B → tune model → ...

After tuning the model on chunk A, the resulting parameters are the starting point for tuning on chunk B.

The idea is to suppress overfitting, in the manner of fine-tuning.

In code, it looks like this:

N = len(x_train_mat)
wbatch = N // 3  # split the training data into three chunks
perm = np.random.permutation(N)
for i in six.moves.range(0, N, wbatch):
	t_x_data = x_train_mat[perm[i:i + wbatch]]
	t_y_data = y_train[perm[i:i + wbatch]]
	if len(t_y_data) > wbatch / 2:
		# skip a leftover chunk that contains too little training data
		cnn.fit_test(t_x_data, t_y_data, x_test_mat, y_test)

3. Processing of teacher data

The labels in the training data are imbalanced, roughly rain < fine. As a result, the model overfits toward the fine class, and it became clear that accuracy on rain was not improving. To increase the amount of rain data in the training set, I did the following.

  1. Use all of the 2015 data as training data.
  2. Add the "rain" data from 2013 and 2014 to the training data.
  3. Leave the test data unchanged: all of the 2016 data.
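As a rough sketch of step 2, the extra years' "rain" samples can be appended to the training arrays like this. The array names and the assumption that rain has label 0 are hypothetical, not taken from the original code:

```python
import numpy as np

def add_rain_samples(x_train, y_train, x_extra, y_extra, rain_label=0):
    """Append only the 'rain' samples from an extra year's data
    (e.g. 2013/2014) to the training set. `rain_label` is the
    class index assumed for rain here."""
    mask = (y_extra == rain_label)
    x_aug = np.concatenate([x_train, x_extra[mask]], axis=0)
    y_aug = np.concatenate([y_train, y_extra[mask]], axis=0)
    return x_aug, y_aug

# Toy example: 4 training samples; the extra data holds 2 rain samples.
x_train = np.zeros((4, 3, 8, 8)); y_train = np.array([0, 1, 1, 1])
x_extra = np.ones((3, 3, 8, 8));  y_extra = np.array([0, 1, 0])
x_aug, y_aug = add_rain_samples(x_train, y_train, x_extra, y_extra)
print(x_aug.shape, int((y_aug == 0).sum()))  # -> (6, 3, 8, 8) 3
```

This only changes the class balance of the training set; the test set stays untouched, as in step 3.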

4. Modest model change

To prevent overfitting, I added dropout to the output of the CNN's max-pooling layers.
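The original model code is not shown here, but the effect of dropout on a pooled feature map can be illustrated with a small NumPy sketch (inverted dropout; the shapes and ratio below are hypothetical examples):

```python
import numpy as np

def dropout(h, ratio=0.5, train=True, rng=None):
    """Inverted dropout: during training, zero out each activation
    with probability `ratio` and rescale survivors by 1/(1-ratio);
    at test time, return the input unchanged."""
    if not train or ratio == 0:
        return h
    rng = rng or np.random.default_rng(0)
    mask = rng.random(h.shape) >= ratio
    return h * mask / (1.0 - ratio)

# Applied to a (batch, channels, H, W) max-pooling output.
pooled = np.ones((2, 4, 6, 6))
h = dropout(pooled, ratio=0.5)
print(h.shape)  # shape is unchanged; surviving units are scaled up
```

Randomly dropping pooled features like this forces the classifier not to rely on any single feature, which is the anti-overfitting effect being aimed for.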

5. Adjust filter size and pooling size

Until now I had simply fixed both of these to 3 without any tuning; this time I vary them.
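To see how these two sizes interact, the spatial size of the feature map after one convolution plus non-overlapping max pooling can be computed as below. The 32x32 input is an assumed example (the actual image size is not restated in this part), and the pooling stride is assumed equal to the pooling size:

```python
def output_size(in_size, filter_size, pool_size, pad=0, stride=1):
    """Spatial size after one convolution (given filter size, padding,
    stride) followed by max pooling with a non-overlapping window."""
    conv = (in_size + 2 * pad - filter_size) // stride + 1
    return conv // pool_size  # pooling stride assumed = pool_size

# How a 32x32 feature map shrinks for various settings:
for f, p in [(3, 2), (3, 4), (3, 6), (3, 8), (5, 6), (1, 6)]:
    print(f"filter={f} pool={p} -> {output_size(32, f, p)}x{output_size(32, f, p)}")
```

A larger pooling size shrinks the map aggressively, so fewer but more dominant features survive into the fully connected layers.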

Progress

Under the conditions above, I varied the filter size and the pooling size.

For filter size 3 and pooling size 2

Average AUC: 0.711 Sunny AUC: 0.790

weather   precision  recall  f1-score
rain      0.6        0.61    0.6
fine      0.82       0.82    0.82
average   0.75       0.75    0.75

For filter size 3 and pooling size 4

Average AUC: 0.719 Sunny AUC: 0.783

weather   precision  recall  f1-score
rain      0.66       0.58    0.61
fine      0.82       0.86    0.84
average   0.77       0.77    0.77

For filter size 3 and pooling size 6

Average AUC: **0.735** Sunny AUC: 0.780

weather   precision  recall  f1-score
rain      0.67       0.61    0.63
fine      0.83       0.86    0.85
average   0.78       0.78    0.78

For filter size 3 and pooling size 8

Average AUC: 0.688 Sunny AUC: 0.790

weather   precision  recall  f1-score
rain      0.67       0.48    0.56
fine      0.79       0.89    0.84
average   0.75       0.76    0.75

For filter size 5 and pooling size 6

Average AUC: **0.741** Sunny AUC: 0.784

weather   precision  recall  f1-score
rain      0.53       0.8     0.64
fine      0.88       0.68    0.77
average   0.77       0.72    0.73

For filter size 1 and pooling size 6

Average AUC: **0.750** Sunny AUC: 0.790

weather   precision  recall  f1-score
rain      0.61       0.7     0.65
fine      0.85       0.8     0.83
average   0.78       0.77    0.77

Apparently, the results are better with a filter size of 1 and a somewhat larger pooling size. A larger pooling size keeps only the most characteristic features, which may have suited this problem. There is probably a good combination of filter size and pooling size for each problem.

The result of trial and error

The final result of the completed model is as follows.

Average AUC: **0.767** Sunny AUC: **0.806**

weather   precision  recall  f1-score
rain      0.59       0.77    0.67
fine      0.88       0.76    0.82
average   0.79       0.76    0.77

As an AUC, it has risen considerably.

Conclusion

I am continuing trial and error under various conditions, but I have not found anything like a clear pattern, so I keep trying combinations. However, since I have evaluated against the test data so many times, the concern has come up that the test data has effectively become part of the training data. When the final model is done, it may be better to evaluate it on separate, untouched test data.
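One way to avoid that leakage, sketched here as an assumption rather than the procedure actually used, is a three-way split: a validation set for all the model selection, and a test set evaluated only once at the end. The sizes and seed below are arbitrary:

```python
import numpy as np

def three_way_split(n, val_ratio=0.15, test_ratio=0.15, seed=0):
    """Shuffle indices 0..n-1 and split them into train/validation/test.
    Tuning decisions use the validation set; the test set is touched
    only once, for the final model."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    n_test = int(n * test_ratio)
    n_val = int(n * val_ratio)
    return idx[n_test + n_val:], idx[n_test:n_test + n_val], idx[:n_test]

train_idx, val_idx, test_idx = three_way_split(1000)
print(len(train_idx), len(val_idx), len(test_idx))  # -> 700 150 150
```

All of the filter-size and pooling-size comparisons above would then be scored on the validation indices, keeping the test indices genuinely unseen.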
