How I tried to improve the accuracy of my own neural network

I want to learn XOR with a neural network!

I built a network with just three layers: an input layer, one hidden layer, and an output layer. Sometimes this was enough to learn XOR, and sometimes it wasn't, so I tried several ways to improve the accuracy.

By the way, here is the output when it doesn't work:

#input[ (0,0)  , (0,1)   , (1,0)   , (1,1)  ]
#output[0.01... , 0.63... , 0.62... , 0.66...]
#answer[0       , 1       , 1       , 0      ]

As you can see, the output for input (1,1) deviates greatly from the correct answer. The goal is to do something about this.
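The article doesn't show its code, so here is a minimal sketch of the kind of setup it describes: a 2-2-1 network with sigmoid activations trained on XOR by plain batch gradient descent. All names and hyperparameters here are assumptions; with an unlucky random seed it can get stuck in exactly the failure mode shown above.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights: input -> hidden (2x2) and hidden -> output (2x1), plus biases.
W1 = rng.normal(0, 1, (2, 2)); b1 = np.zeros(2)
W2 = rng.normal(0, 1, (2, 1)); b2 = np.zeros(1)
lr = 0.5

for epoch in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass for mean squared error with sigmoid outputs.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.ravel())  # ideally close to [0, 1, 1, 0], but not guaranteed
```

Whether this converges to XOR or to a bad local minimum depends entirely on the random initialization, which is the problem the rest of the article tries to address.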

Table of contents

- Increase the number of hidden-layer nodes from 2 to 3
- Lower the learning rate gradually as training proceeds
- Increase the number of networks being trained

Increase the number of hidden-layer nodes

This only required a small change to the code. I ran about 10 trials, but the results didn't change much. With many more trials the outcome might differ, but since the way the numbers fluctuated barely changed, I stopped after about 10 runs.
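In a NumPy-style implementation, this change really is small: only the shapes of the weight matrices depend on the hidden-layer width. A hypothetical sketch (the function name and layout are assumptions, not the author's code):

```python
import numpy as np

def init_params(n_hidden, rng):
    """Weight shapes for a 2-input, 1-output network with n_hidden hidden units."""
    W1 = rng.normal(0, 1, (2, n_hidden))   # input -> hidden
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 1, (n_hidden, 1))   # hidden -> output
    b2 = np.zeros(1)
    return W1, b1, W2, b2

rng = np.random.default_rng(0)
W1, b1, W2, b2 = init_params(3, rng)       # 3 hidden nodes instead of 2
print(W1.shape, W2.shape)                  # (2, 3) (3, 1)
```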

Gradually lower the learning rate

I later realized that this only affects the final fine-tuning of precision, so of course it didn't change much. When the network goes wrong, it goes wrong from the very beginning of training, so adjusting the learning rate later makes little sense unless you can fix how the initial phase behaves.
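The article doesn't say which decay schedule it used; one common choice (an assumption here) is to divide the initial rate by (1 + decay * epoch), so updates shrink as training proceeds:

```python
def decayed_lr(lr0, decay, epoch):
    """Inverse-time decay: the rate halves every 1/decay epochs."""
    return lr0 / (1.0 + decay * epoch)

rates = [decayed_lr(0.5, 0.01, e) for e in (0, 100, 1000)]
print(rates)  # [0.5, 0.25, ~0.045]
```

As the article notes, a schedule like this refines a network that is already converging toward the right answer; it cannot rescue one whose initialization sent it to the wrong basin.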

Increase the number of networks being trained

I wasn't sure what to call this, so I've phrased it like that, but the idea is: train several networks for a small number of epochs, discard the ones that start off badly, and continue training only the best performers. In other words, create a pool of candidates, keep the good ones, and then train further among those good ones.
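The steps above can be sketched as a select-then-continue skeleton. `make_net` and `train` are stand-ins (assumptions, not the author's code) so the skeleton runs on its own; in practice they would build and train a real network and return its loss.

```python
import random

def make_net(rng):
    # Stand-in for building a freshly initialized network.
    return {"w": rng.random()}

def train(net, epochs, rng):
    # Stand-in: pretend the loss depends on the initial weights plus noise.
    # A real version would run `epochs` of gradient descent and return the loss.
    return abs(net["w"] - 0.5) + rng.random() * 0.01

rng = random.Random(0)
population = [make_net(rng) for _ in range(20)]

# Short "audition" phase: train everyone briefly and rank by loss.
scored = sorted(population, key=lambda n: train(n, epochs=100, rng=rng))
survivors = scored[:5]                   # keep only the best starters

# Long phase: spend the real epoch budget only on the survivors.
best = min(survivors, key=lambda n: train(n, epochs=10000, rng=rng))
print(len(survivors))  # 5
```

The design choice is to spend the epoch budget where it matters: bad initializations are identified and dropped early instead of being trained to completion.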

This made a big difference.

Until now, the output got close to the correct answer in maybe 1 in 5 runs, but after this change it was close in about 1 in 2 runs! When I increased the number of trials, the gap widened even further.

Conclusion

I don't know what to call this technique, but training a pool of networks and continuing only with the best performers worked best! ~~So in the end, is it a world where only the excellent survive... scary...~~ AI and humans are the same!
