This is material and notes written for a study session. We cannot guarantee the accuracy of the content, so please refrain from quoting it.
- [Perceptron - Wikipedia](https://ja.wikipedia.org/wiki/%E3%83%91%E3%83%BC%E3%82%BB%E3%83%97%E3%83%88%E3%83%AD%E3%83%B3)
- Machine learning that even high school graduates can understand (2): The simple perceptron | When you think about it in your head
- Machine learning that even high school graduates can understand (3): The multilayer perceptron | When you think about it in your head
- The 15th: Getting started with classification problems: Let's start machine learning | gihyo.jp ... Gijutsu-Hyohron Co.
- Machine learning learned with Yaruo: Perceptron | Kengo's mansion
_Two-input perceptron (from "Deep Learning from scratch", p. 22)_
- A model that receives multiple signals as inputs and outputs a single signal
- Signals take binary values: 0 (do not flow) or 1 (flow)
- Each input is multiplied by its own weight $w_1, w_2, \dots$ (e.g. $w_1 x_1$)
- The larger the weight, the more important the corresponding signal
- Outputs 1 when the weighted sum of the inputs exceeds a threshold $\theta$ ("the neuron fires")
Letting the output be $y$, the above can be expressed mathematically as follows.
```math
y = \left\{
\begin{array}{ll}
0 & (w_1 x_1 + w_2 x_2 \leqq \theta) \\
1 & (w_1 x_1 + w_2 x_2 \gt \theta)
\end{array}
\right.
```
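The piecewise definition above translates directly into code. The following is a minimal sketch (not from the book; the sample parameter values are placeholders chosen only for illustration):

```python
def perceptron(x1, x2, w1, w2, theta):
    # Fires (outputs 1) only when the weighted sum exceeds the threshold
    if w1 * x1 + w2 * x2 <= theta:
        return 0
    else:
        return 1

print(perceptron(1, 1, 0.5, 0.5, 0.7))  # 1: the sum 1.0 exceeds the threshold 0.7
print(perceptron(0, 1, 0.5, 0.5, 0.7))  # 0: the sum 0.5 does not
```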
- The AND / NAND / OR logic circuits can all be expressed with a perceptron
- The behavior is changed by changing the weights and threshold, without changing the structure of the perceptron
_See "Fig. 2-2 AND Gate Truth Table", p. 23_
- Outputs 1 only when both inputs are 1; otherwise outputs 0
- A perceptron that outputs 1 when the weighted sum of the two inputs exceeds the threshold $\theta$
_See "Fig. 2-3 NAND Gate Truth Table", p. 24_
- NAND = Not AND: the opposite behavior of the AND gate
- Outputs 0 only when both inputs are 1; otherwise outputs 1
- Inverting the signs of the parameters that realize the AND gate yields a NAND gate
- The OR gate outputs 1 if at least one input is 1
Let's define and execute the AND circuit as follows.
```python
def AND(x1, x2):
    w1, w2, theta = 0.5, 0.5, 0.7
    tmp = x1 * w1 + x2 * w2
    if tmp <= theta:
        return 0
    elif tmp > theta:
        return 1
```
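As a quick check, calling `AND` with all four input combinations reproduces the truth table (the loop is my addition, not from the book):

```python
def AND(x1, x2):
    w1, w2, theta = 0.5, 0.5, 0.7
    tmp = x1 * w1 + x2 * w2
    if tmp <= theta:
        return 0
    elif tmp > theta:
        return 1

# All four rows of the AND truth table: only (1, 1) fires
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"AND({x1}, {x2}) = {AND(x1, x2)}")
```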
- Introduce a bias into the AND circuit of 2.3.1
- Replace $\theta$ in Equation (2.1) with $-b$, where $b$ is the bias
```math
y = \left\{
\begin{array}{ll}
0 & (w_1 x_1 + w_2 x_2 \leqq \theta) \\
1 & (w_1 x_1 + w_2 x_2 \gt \theta)
\end{array}
\right.
```
```math
y = \left\{
\begin{array}{ll}
0 & (b + w_1 x_1 + w_2 x_2 \leqq 0) \\
1 & (b + w_1 x_1 + w_2 x_2 \gt 0)
\end{array}
\right.
```
Let's check with the interpreter.
```python
>>> import numpy as np
>>> x = np.array([0, 1])      # input
>>> w = np.array([0.5, 0.5])  # weight
>>> b = -0.7                  # bias
>>> w * x
>>> np.sum(w * x)
>>> np.sum(w * x) + b
```
Let's implement each circuit of AND / NAND / OR based on the previous section.
```python
import numpy as np

def AND(x1, x2):
    x = np.array([x1, x2])
    w = np.array([0.5, 0.5])
    b = -0.7
    tmp = np.sum(w * x) + b
    if tmp <= 0:
        return 0
    else:
        return 1

def NAND(x1, x2):
    x = np.array([x1, x2])
    # Only the weights and bias differ from AND
    w = np.array([-0.5, -0.5])
    b = 0.7
    tmp = np.sum(w * x) + b
    if tmp <= 0:
        return 0
    else:
        return 1

def OR(x1, x2):
    x = np.array([x1, x2])
    # Only the bias differs from AND
    w = np.array([0.5, 0.5])
    b = -0.2
    tmp = np.sum(w * x) + b
    if tmp <= 0:
        return 0
    else:
        return 1
```
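A quick sanity check over all input combinations confirms each gate's truth table. This compact restatement is my own (the `gate` helper is not from the book), but it uses the same weights and biases as above:

```python
import numpy as np

def gate(x1, x2, w, b):
    # Generic perceptron gate: fires when the weighted sum plus bias is positive
    return int(np.sum(np.array(w) * np.array([x1, x2])) + b > 0)

def AND(x1, x2):  return gate(x1, x2, (0.5, 0.5), -0.7)
def NAND(x1, x2): return gate(x1, x2, (-0.5, -0.5), 0.7)
def OR(x1, x2):   return gate(x1, x2, (0.5, 0.5), -0.2)

# Columns: x1, x2, AND, NAND, OR
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, AND(x1, x2), NAND(x1, x2), OR(x1, x2))
```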
- The weights $w_1, w_2, \dots$ control the importance of each input signal
- The bias adjusts how easily the neuron fires (outputs 1)
The term "bias" originally means something like "wearing geta" (raising the baseline). It expresses how much is added to the output when there is no input (when the inputs are 0). In fact, the computation $b + w_1 x_1 + w_2 x_2$ in Equation (2.2) yields only the bias value when the inputs $x_1$ and $x_2$ are 0.
(Excerpt from P.27)
- A single-layer perceptron can separate linearly separable regions, but not non-linear ones such as XOR (exclusive OR)
- XOR can be expressed by "stacking layers"
- A perceptron with multiple layers is called a **multi-layer perceptron**, written as an "$n$-layer perceptron"
An XOR circuit, which outputs 1 when exactly one of its inputs is 1, can be represented by combining the existing AND / NAND / OR gates. Let's review the behavior of each circuit.
- AND ... outputs 1 if both inputs are 1; otherwise outputs 0
- NAND ... outputs 0 if both inputs are 1; otherwise outputs 1
- OR ... outputs 1 if at least one of the two inputs is 1; otherwise outputs 0
- XOR ... outputs 1 if exactly one of the two inputs is 1; otherwise outputs 0
In fact, XOR can be realized by the following wiring: the inputs are $x_1, x_2$, the NAND output is $s_1$, and the OR output is $s_2$; feeding $s_1$ and $s_2$ into an AND gate gives the XOR output.
_See Figure 2-11, p. 32_ Let's check the truth table.

_See "Figure 2-12 XOR Gate Truth Table", p. 33_ Let's implement it in code based on the previous sections.
```python
def XOR(x1, x2):
    s1 = NAND(x1, x2)
    s2 = OR(x1, x2)
    y = AND(s1, s2)
    return y
```
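Combined with the gate definitions from the previous section, XOR reproduces its truth table. The compact one-line gate bodies below are my own condensation of the earlier code, with the same weights and biases:

```python
import numpy as np

def AND(x1, x2):
    return int(np.sum(np.array([0.5, 0.5]) * np.array([x1, x2])) - 0.7 > 0)

def NAND(x1, x2):
    return int(np.sum(np.array([-0.5, -0.5]) * np.array([x1, x2])) + 0.7 > 0)

def OR(x1, x2):
    return int(np.sum(np.array([0.5, 0.5]) * np.array([x1, x2])) - 0.2 > 0)

def XOR(x1, x2):
    # First layer: NAND and OR; second layer: AND
    s1 = NAND(x1, x2)
    s2 = OR(x1, x2)
    return AND(s1, s2)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"XOR({x1}, {x2}) = {XOR(x1, x2)}")  # 0, 1, 1, 0
```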
- A perceptron is an algorithm with inputs and outputs. Given certain inputs, it outputs a fixed value.
- A perceptron has "weights" and a "bias" as parameters.
- Using perceptrons, we can express logic circuits such as AND and OR gates.
- An XOR gate cannot be represented by a single-layer perceptron.
- An XOR gate can be represented by a two-layer perceptron.
- A single-layer perceptron can only represent linear regions, whereas a multi-layer perceptron can represent non-linear regions.
- A multi-layer perceptron can (in theory) represent a computer.