[Deep Learning from scratch] About the layers required to implement backpropagation in a neural network

Introduction

This article is my easy-to-understand summary of **Deep Learning from scratch, Chapter 5: Error Backpropagation**. I was able to understand it myself, so I hope you can read it comfortably. I would also be very happy if you could refer to it when studying this book.

What is a layer?

To implement backpropagation in a neural network, neurons need to be organized into layers. The easiest way to think of a layer is as a unit that implements both forward propagation and backpropagation.

The minimum set of layers required to implement a neural network is **a Sigmoid layer, a ReLU layer, an Affine layer, and an output layer leading to the loss function**.

By implementing layers as classes and treating them like interchangeable parts, it becomes much easier to rearrange a neural network into various structures.
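
For example, once every layer exposes the same `forward`/`backward` pair, an entire network can be driven by a generic loop. The sketch below is my own illustration (not code from the book) and assumes single-input layers such as the Sigmoid and ReLU layers mentioned above; the two-input addition and multiplication layers below take a pair of arguments instead.

```python
def forward_all(layers, x):
    # Forward pass: run an ordered list of layer objects front to back
    for layer in layers:
        x = layer.forward(x)
    return x

def backward_all(layers, dout):
    # Backward pass: run the same layers in reverse order
    for layer in reversed(layers):
        dout = layer.backward(dout)
    return dout
```

Rearranging the network into a different structure then amounts to nothing more than editing the list of layer objects.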

So, this time, I will turn the addition node and the multiplication node covered in the previous article into layers.

Addition layer

```python
class AddLayer:  # Addition layer
    def __init__(self):
        pass  # Nothing to store: addition keeps no state

    def forward(self, x, y):
        out = x + y
        return out

    def backward(self, dout):
        dx = dout * 1
        dy = dout * 1  # For addition, both inputs inherit the upstream derivative as-is
        return dx, dy
```

In the addition layer, forward propagation returns the sum of the two inputs, while backpropagation simply passes the upstream derivative through to both inputs unchanged, since the partial derivative of x + y with respect to either input is 1.
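
As a quick check (the inputs 2 and 3 and the upstream derivative 1.5 are arbitrary values chosen for illustration):

```python
add_layer = AddLayer()
out = add_layer.forward(2, 3)     # forward: 2 + 3 = 5
dx, dy = add_layer.backward(1.5)  # backward: both inputs receive 1.5 unchanged
print(out, dx, dy)  # 5 1.5 1.5
```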

Multiplication layer

```python
class MulLayer:  # Multiplication layer
    def __init__(self):
        self.x = None
        self.y = None  # Hold the inputs x and y as instance variables

    def forward(self, x, y):
        self.x = x
        self.y = y  # Save the inputs, because backpropagation will need them
        out = x * y

        return out

    def backward(self, dout):
        dx = dout * self.y  # Derivative w.r.t. x: upstream derivative times the other input
        dy = dout * self.x  # Derivative w.r.t. y: likewise, with the inputs swapped

        return dx, dy
```

The multiplication layer returns the product of its two inputs x and y during forward propagation. During backpropagation, it computes each derivative by multiplying the upstream derivative by the value of the *other* input, which is why the inputs must be saved during the forward pass.
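
To see the two layers working together, here is a small computational graph for (x + y) * z, loosely in the spirit of the book's shopping example; the concrete values are arbitrary ones chosen for illustration:

```python
add_layer = AddLayer()
mul_layer = MulLayer()

x, y, z = 2, 3, 4

# Forward pass: (x + y) * z
s = add_layer.forward(x, y)    # s = 5
out = mul_layer.forward(s, z)  # out = 20

# Backward pass: call backward in the reverse order of forward
dout = 1
ds, dz = mul_layer.backward(dout)  # ds = z = 4, dz = s = 5
dx, dy = add_layer.backward(ds)    # dx = dy = 4

print(out, dx, dy, dz)  # 20 4 4 5
```

Each result matches the analytic derivative: ∂out/∂x = ∂out/∂y = z = 4, and ∂out/∂z = x + y = 5.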
