Deep learning / error back propagation of sigmoid function

1. Introduction

This article summarizes error back propagation through the sigmoid function.

2. Differentiation of sigmoid function

The derivative of the sigmoid function has a beautifully simple form.

**Sigmoid function:**

sigmoid(x) = \frac{1}{1+e^{-x}}

**Derivative of the sigmoid function:**

sigmoid'(x) = \frac{1}{1+e^{-x}} \left( 1 - \frac{1}{1+e^{-x}} \right) = sigmoid(x) \, (1 - sigmoid(x))
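For completeness, this form follows directly from the chain rule applied to $(1+e^{-x})^{-1}$:

\begin{aligned}
sigmoid'(x) &= -\left(1+e^{-x}\right)^{-2} \cdot \left(-e^{-x}\right) \\
&= \frac{1}{1+e^{-x}} \cdot \frac{e^{-x}}{1+e^{-x}} \\
&= \frac{1}{1+e^{-x}} \cdot \left( 1 - \frac{1}{1+e^{-x}} \right) \\
&= sigmoid(x) \, (1 - sigmoid(x))
\end{aligned}

The key practical point: the derivative is expressed entirely in terms of the forward output, so the backward pass needs no extra computation of $e^{-x}$.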

3. Code of sigmoid function

Because the derivative can be written in terms of the forward output y, the error back propagation code of the sigmoid function is also simple:

import numpy as np

class Sigmoid(object):
    def __init__(self, x):
        self.x = x

    def forward(self):
        # y = sigmoid(x); cache the output for use in backward()
        y = 1.0 / (1.0 + np.exp(-self.x))
        self.y = y
        return y

    def backward(self, grad_to_y):
        # dL/dx = dL/dy * y * (1 - y), using the cached forward output
        grad_to_x = grad_to_y * self.y * (1.0 - self.y)
        return grad_to_x
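As a quick sanity check (not part of the original article), the analytic gradient `y * (1 - y)` used in `backward()` can be compared against a central-difference estimate; the two should agree to within numerical precision:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-2.0, 0.0, 3.0])
y = sigmoid(x)

# Analytic gradient: the same expression backward() computes with grad_to_y = 1
analytic = y * (1.0 - y)

# Numerical gradient via central differences
eps = 1e-6
numeric = (sigmoid(x + eps) - sigmoid(x - eps)) / (2 * eps)

print(np.max(np.abs(analytic - numeric)))  # very small, close to zero
```

If the two disagree by more than roughly 1e-8, the backward implementation has a bug.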
