Deep Learning from Mathematical Basics (currently taking the course)

Taking classes at MPS Yokohama

I am struggling because I am a beginner in both programming and mathematics. (I will look up Markdown and TeX notation and rewrite this later.)

Motivation for attending:

Ultimately, I would like to customize a suitable deep learning library (TensorFlow) and use it for work (an ad-operations tool) and for fun (controlling home appliances from sensor data?).

Textbooks (learning sources):

- MPS lecture slides and lecture videos
- @hi_saito's excellent summary notes (apologies if there are typos or misunderstandings on my part below)

2016/5/25: list of things I don't yet understand

1.1.1 (2nd lecture slides p. 36 / summary note p. 14)

The weight W is written in two ways: as a matrix W and as the transpose (the superscript T) of a bold vector w. I think both end up giving the same numbers, so I wondered whether there is a particular reason for the distinction. → This is explained from 1h23min to 1h42min of the 2nd lecture video. My suspicion was: the weights v00 and v01 needed to derive z0 come as a pair matching y0 and y1, so together they form a vector, don't they? If you regard that as a 2-by-1 matrix, such a matrix is just a special kind of vector, so now I understand. I suspect the notation is a device to put the weights into a form that numpy.dot(w, x) can process (a minimal sketch below).
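
A minimal sketch of the numpy.dot point (v00, v01, y0, y1 follow the slide's names; the values are just mine):

```
import numpy as np

# The two weights as a 1-by-2 matrix (a row vector, i.e. the transpose of a column vector).
v = np.array([[2.0, 0.5]])      # shape (1, 2): [[v00, v01]]
y = np.array([[1.0], [2.0]])    # shape (2, 1): the two inputs y0, y1

# numpy.dot multiplies the 1x2 and 2x1 arrays directly and returns a 1x1 result for z0.
z = np.dot(v, y)
print(z)                        # [[3.]]  i.e. v00*y0 + v01*y1 = 2.0*1.0 + 0.5*2.0
```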

1. How to draw this with Axes3D from the mpl_toolkits.mplot3d module:

```
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D

# w and b are 1-D arrays of candidate weight and bias values (from the lecture code);
# sigmoid() and se() (presumably the squared error) are also defined there.
W, B = np.meshgrid(w, b)            # expand w and b into 2-D grids of (w, b) pairs
Z = se(0, sigmoid(W, 1, B))         # error surface over the grid, for input x = 1
fig = plt.figure()
ax = Axes3D(fig)
ax.plot_wireframe(W, B, Z)          # draw the error surface as a 3D wireframe
plt.show()
```

2. What is numpy.meshgrid trying to achieve?
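
What I think meshgrid does, tried with arbitrary values of my own: it expands 1-D arrays of w and b candidates into 2-D grids so that every (w, b) combination can be evaluated at once.

```
import numpy as np

w = np.array([0.0, 1.0, 2.0])        # candidate weights
b = np.array([-1.0, 1.0])            # candidate biases

W, B = np.meshgrid(w, b)
print(W)   # [[0. 1. 2.]
           #  [0. 1. 2.]]   each row repeats the w values
print(B)   # [[-1. -1. -1.]
           #  [ 1.  1.  1.]]  each row holds one b value
# W[i, j] and B[i, j] together enumerate every (w, b) pair,
# so something like sigmoid(W, 1, B) evaluates all pairs in one call.
```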

3. `if __name__ == '__main__':`

I'm embarrassed that I don't understand this. Is it a function being called? A class? Does it mean "run this if it is main"? What exactly does it mean?
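
How I understand it now (a minimal example of my own, assuming a file named example.py):

```
# example.py
def greet():
    return "hello"

# __name__ is set by Python automatically: it is the string '__main__' when this
# file is run directly (python example.py), and 'example' when it is imported.
if __name__ == '__main__':
    # So this block runs only for direct execution, not on `import example`.
    print(greet())
```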

4. numpy.vectorize(): why use vectorize to create an array?
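
A small sketch of what I think vectorize is for (my own example): it wraps a plain scalar function so that it can be applied element-wise to a numpy array, even if the function was not written with arrays in mind.

```
import numpy as np

def step(u):
    # A plain scalar function: returns 1 if u >= 0, else 0.
    return 1 if u >= 0 else 0

step_v = np.vectorize(step)          # now it accepts arrays element-wise
u = np.array([-2.0, -0.5, 0.0, 3.0])
print(step_v(u))                     # [0 0 1 1]
```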

5. `self.w[0][0]`: what are the elements of this matrix w?

→ I think I understand it now.

```
import numpy as np

w = [[3.0,],]                        # initial weight, as a 1x1 nested list
b = [[1.0,],]                        # bias, also 1x1
alpha = 0.1                          # learning rate
w = np.concatenate((w, b), axis=1)   # now a 1x2 array: [[3.0, 1.0]]
w[0][0] += alpha                     # nudge the weight element by alpha
```
Since numpy arrays are indexed as array[row][column], w[0][0] is the element in the 1st row, 1st column of the array w.
What kind of array is w? Because b is concatenated along the column direction, it becomes a 1-by-2 matrix, [3.0 1.0].
So what is this 3.0? In the bit-flip problem, the input is one-dimensional (1 or 0) and the output is also one-dimensional (1 or 0),
so there is a single weight w for the input, and that weight is 3.0.
This time we want to increase or decrease this w by alpha depending on the result, so we pull the 3.0 in the 1st row, 1st column out with w[0][0]
and add or subtract alpha.
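
A quick check of what the snippet above actually produces (my own addition):

```
print(w.shape)     # (1, 2)  -> one row, two columns
print(w)           # [[3.1 1. ]]  after adding alpha = 0.1 to w[0][0]
print(w[0][0])     # 3.1  -> the weight; w[0][1] is 1.0, which came from b
```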


 7. About the sigmoid function
Since 1 is divided by 1 plus a "positive number (or 0)", I understand that the maximum is 1 and the minimum gets arbitrarily close to 0.
But what kind of number is this "positive number (or 0)", e to the power -αu (with α > 0)? It is the same as 1 / e^(αu),
and I wonder what that comes to... (Since u is Wx + b, u may be negative. In that case 1 / e^(αu) turns into e^(α|u|), and the larger α is, the more rapidly does the result of the sigmoid function approach 0?)
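
Writing this out in TeX (my own restatement of the above, not taken from the slides):

```
\sigma(u) = \frac{1}{1 + e^{-\alpha u}}, \qquad \alpha > 0, \quad u = Wx + b

% e^{-\alpha u} is always > 0, so 0 < \sigma(u) < 1.
% u \to +\infty : e^{-\alpha u} \to 0      \Rightarrow \sigma(u) \to 1
% u \to -\infty : e^{-\alpha u} \to \infty \Rightarrow \sigma(u) \to 0
% A larger \alpha makes e^{-\alpha u} change faster, so \sigma approaches 0 (or 1) more steeply.
```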

8. How to use f(u) in Python
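
My guess at what this looks like, assuming f(u) is the sigmoid applied to u = w*x + b (the numbers below are just mine):

```
import numpy as np

def f(u, alpha=1.0):
    # Sigmoid activation applied to the pre-activation u.
    return 1.0 / (1.0 + np.exp(-alpha * u))

w, b = 3.0, 1.0
x = 0.0                      # a bit-flip input, 0 or 1
u = w * x + b
print(f(u))                  # sigmoid of 1.0 -> about 0.731
```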

 9. How to calculate the derivative by hand
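
My attempt, assuming the derivative in question is that of the sigmoid from item 7 (just the chain rule):

```
\sigma(u) = (1 + e^{-\alpha u})^{-1}

\frac{d\sigma}{du}
  = -(1 + e^{-\alpha u})^{-2} \cdot \frac{d}{du}\bigl(1 + e^{-\alpha u}\bigr)
  = -(1 + e^{-\alpha u})^{-2} \cdot (-\alpha e^{-\alpha u})
  = \alpha \, \frac{e^{-\alpha u}}{(1 + e^{-\alpha u})^{2}}
  = \alpha \, \sigma(u) \bigl(1 - \sigma(u)\bigr)
```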

10.



