[Deep Learning from scratch] Initial values of neural network weights when using the ReLU function

Introduction

This article is my attempt at an easy-to-understand explanation of ** Deep Learning from scratch, Chapter 7, "Learning Techniques" **. Coming from a humanities background, I was able to understand this material myself, so I hope you will find it comfortable to read. I would also be delighted if it serves as a reference while you study the book.

About the initial values of neural network weights

Until now, we initialized the neural network's weights with plain random numbers, but doing so makes the success of learning vary widely from run to run.

The initial weights and the learning of a neural network are very closely related: if the initial values are appropriate, learning goes well, and if they are inappropriate, it goes poorly.

Therefore, this time I would like to implement a method for setting appropriate initial weights in a neural network that uses the ReLU function.

The He initial value

The initial weight value said to be best suited to neural networks that use the ReLU function is the He initial value.

# He initialization: scale is sqrt(2 / n), where n is the number of
# nodes in the previous layer (all_size_list[idx - 1])
scale = np.sqrt(2.0 / all_size_list[idx - 1])
self.params['W' + str(idx)] = scale * np.random.randn(all_size_list[idx - 1], all_size_list[idx])

In other words, the He initial value is created by taking the square root of 2 divided by the number of nodes in the previous layer, and multiplying that scale by a Gaussian random number from np.random.randn.
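As a minimal, self-contained sketch (the function and variable names here are my own, not from the book), the following builds the weights of a small all-ReLU network with the He initial value and prints the standard deviation of the activations at each layer, which stays at a reasonable spread instead of shrinking toward zero:

```python
import numpy as np

def he_init(n_in, n_out, rng):
    # He initialization: Gaussian random numbers scaled by sqrt(2 / n_in),
    # where n_in is the number of nodes in the previous layer
    return rng.standard_normal((n_in, n_out)) * np.sqrt(2.0 / n_in)

rng = np.random.default_rng(0)
layer_sizes = [100, 100, 100, 100, 100]  # 100 nodes per layer

# 1000 input samples drawn from a standard normal distribution
a = rng.standard_normal((1000, layer_sizes[0]))
for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
    W = he_init(n_in, n_out, rng)
    a = np.maximum(0, a @ W)  # affine transform followed by ReLU
    print(f"activation std: {a.std():.3f}")
```

If you replace `np.sqrt(2.0 / n_in)` with a small constant such as 0.01, the printed standard deviations collapse toward zero layer by layer, which is exactly the problem the He initial value avoids.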
