Deep learning / matrix product error backpropagation

1. Introduction

Backpropagation through a matrix product was hard for me to understand, so I have summarized it here.

2. Scalar product error backpropagation

Let's review backpropagation for a scalar product. ![Screenshot 2020-03-29 15.41.12.png](https://qiita-image-store.s3.ap-northeast-1.amazonaws.com/0/209705/aaff755d-2340-a6ac-ad21-e590c81d87db.png) Let $L$ be the quantity whose gradient we want, and assume $\frac{\partial L}{\partial y}$ is already known. Then the chain rule gives the result below. (image: Screenshot 2020-03-29 15.51.39.png) This should pose no problem.
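The second screenshot of the equations is not embedded here, so the following is a minimal sketch of what it expresses, assuming the scalar product is $y = xw$:

```math
\frac{\partial L}{\partial x} = \frac{\partial L}{\partial y}\frac{\partial y}{\partial x} = \frac{\partial L}{\partial y}\,w,\qquad
\frac{\partial L}{\partial w} = \frac{\partial L}{\partial y}\frac{\partial y}{\partial w} = \frac{\partial L}{\partial y}\,x
```

In other words, the gradient flowing back to each input is the upstream gradient multiplied by the other input.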

3. Matrix product error backpropagation

However, when it comes to a matrix product, the result differs from what intuition might suggest. (image: Screenshot 2020-03-29 16.04.49.png)
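The screenshot is not embedded here, but what it shows are the standard backpropagation formulas for a matrix product. As a sketch, with $Y = XW$ and $^{T}$ denoting the transpose:

```math
\frac{\partial L}{\partial X} = \frac{\partial L}{\partial Y}\,W^{T},\qquad
\frac{\partial L}{\partial W} = X^{T}\,\frac{\partial L}{\partial Y}
```

The transposes appear so that each gradient has the same shape as $X$ and $W$ respectively; the concrete calculation below confirms this.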

Somehow, it does not quite click. So let's confirm it concretely. The setting is that two neurons $X$ are connected to two neurons $Y$ through four weights $W$, i.e. $Y$ is the matrix product of $X$ and $W$. (image: Screenshot 2020-03-29 16.20.37.png) (image: Screenshot 2020-03-29 16.24.32.png)

**1) First, find $\frac{\partial L}{\partial X}$.** Calculate the necessary partial derivatives in advance. (image: Screenshot 2020-03-29 16.30.57.png) Using this calculation along the way, (image: Screenshot 2020-03-29 16.35.53.png) we arrive at the result sketched below.
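The screenshots of this calculation are not embedded here; the following is a minimal sketch of the derivation, assuming the notation $X = (x_1\ x_2)$, a $2 \times 2$ weight matrix $W$ with elements $w_{ij}$, and $Y = (y_1\ y_2) = XW$:

```math
y_1 = x_1 w_{11} + x_2 w_{21},\qquad y_2 = x_1 w_{12} + x_2 w_{22}
```

```math
\frac{\partial L}{\partial x_1} = \frac{\partial L}{\partial y_1} w_{11} + \frac{\partial L}{\partial y_2} w_{12},\qquad
\frac{\partial L}{\partial x_2} = \frac{\partial L}{\partial y_1} w_{21} + \frac{\partial L}{\partial y_2} w_{22}
```

Collecting these into a row vector gives $\frac{\partial L}{\partial X} = \frac{\partial L}{\partial Y}\,W^{T}$.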

**2) Next, find $\frac{\partial L}{\partial W}$.** Again, calculate the necessary partial derivatives in advance. (image: Screenshot 2020-03-29 16.44.41.png) Using this calculation along the way, (image: Screenshot 2020-03-29 16.46.43.png) we arrive at the result sketched below.
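Again the screenshots are not embedded; a minimal sketch of the derivation, in the same element notation as above:

```math
\frac{\partial L}{\partial w_{11}} = \frac{\partial L}{\partial y_1} x_1,\quad
\frac{\partial L}{\partial w_{12}} = \frac{\partial L}{\partial y_2} x_1,\quad
\frac{\partial L}{\partial w_{21}} = \frac{\partial L}{\partial y_1} x_2,\quad
\frac{\partial L}{\partial w_{22}} = \frac{\partial L}{\partial y_2} x_2
```

Arranged as a $2 \times 2$ matrix, this is exactly $\frac{\partial L}{\partial W} = X^{T}\,\frac{\partial L}{\partial Y}$.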

4. Matrix product forward and backward propagation code

With x1 = X, x2 = W, and grad = $\frac{\partial L}{\partial Y}$, the layer can be written as follows:

import numpy as np

class MatMul(object):
    def __init__(self, x1, x2):
        # x1 corresponds to X, x2 corresponds to W
        self.x1 = x1
        self.x2 = x2

    def forward(self):
        # Y = XW
        y = np.dot(self.x1, self.x2)
        self.y = y
        return y

    def backward(self, grad):
        # dL/dX = dL/dY . W^T
        grad_x1 = np.dot(grad, self.x2.T)
        # dL/dW = X^T . dL/dY
        grad_x2 = np.dot(self.x1.T, grad)
        return (grad_x1, grad_x2)
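As a quick shape check (the concrete numbers here are only an illustration, not from the original article), a $1 \times 2$ input and a $2 \times 2$ weight matrix can be pushed through the layer above like this:

x = np.array([[1.0, 2.0]])           # X, shape (1, 2)
w = np.array([[0.1, 0.2],
              [0.3, 0.4]])           # W, shape (2, 2)

layer = MatMul(x, w)
y = layer.forward()                  # Y = XW, shape (1, 2)

grad_y = np.ones_like(y)             # stand-in for dL/dY
grad_x, grad_w = layer.backward(grad_y)

print(y.shape, grad_x.shape, grad_w.shape)  # (1, 2) (1, 2) (2, 2)

The gradients come back with the same shapes as X and W, which is exactly what the transposes in the formulas guarantee.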
