The perceptron is an old algorithm, and one you should know when learning machine learning.
As an example, let's try to judge whether a user is "popular." Assume our two indicators are the number of YouTube channel subscribers and the number of Twitter followers.

For example:
User name | YouTube subscribers | Popular (1 = yes) |
---|---|---|
aaa | 50 | 0 |
bbb | 10000 | 0 |
ccc | 100000 | 1 |
ddd | 90000 | 0 |
eee | 120000 | 1 |
User name | Twitter followers | Popular (1 = yes) |
---|---|---|
aaa | 100 | 0 |
bbb | 10 | 0 |
ccc | 7000 | 1 |
ddd | 150000 | 1 |
eee | 90000 | 1 |
We treat a user as popular only when both indicators say so; in the tables above, that is ccc and eee. If you plot the users with the popular ones as ◯ and the rest as ×, you can see that a boundary separates the two groups. By drawing that boundary, you can tell whether any user is popular or unpopular.
To make this division in an actual program, we adjust w (the weights) and theta (the bias, which shifts the boundary) so that each user is assigned 0 or 1 (× or ◯ in the figure). Let's set:

- Channel subscribers: w1 = 1.6
- Twitter followers: w2 = 0.11
- Threshold for deciding 0 or 1: theta = 160500

Hmm, how were these values found? There is no way around it: you guess. Compute tmp = w1\*x1 + w2\*x2 for each user and check whether it falls on the 0 side or the 1 side of the threshold. Set a provisional threshold, and if the results are wrong, fine-tune w1, w2, and theta by hand. This manual work is what makes the bare perceptron tedious.
```python
def AND_famous(x1, x2):
    w1 = 1.6
    w2 = 0.11
    theta = 160500
    tmp = w1 * x1 + w2 * x2
    if tmp <= theta:
        y = 0
    else:
        y = 1
    return y

print(AND_famous(50, 100))
print(AND_famous(10000, 10))
print(AND_famous(100000, 7000))
print(AND_famous(90000, 150000))
print(AND_famous(120000, 90000))
```
Console output:

```
0
0
1
0
1
```
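The hand-tuning described above is exactly what the classic perceptron learning rule automates: whenever the output is wrong, nudge the weights toward the correct answer. Below is a minimal sketch on the table data. The input scaling (units of 10,000), the learning rate, and the epoch cap are my own choices for illustration, not from the original article:

```python
# Perceptron learning rule on the popularity data above.
# Inputs are scaled to units of 10,000 so the loop converges quickly.
data = [
    (0.005, 0.01,  0),   # aaa: 50 subscribers, 100 followers
    (1.0,   0.001, 0),   # bbb
    (10.0,  0.7,   1),   # ccc
    (9.0,   15.0,  0),   # ddd
    (12.0,  9.0,   1),   # eee
]

w1, w2, b = 0.0, 0.0, 0.0   # start from zero weights and bias
lr = 0.1                    # learning rate (arbitrary choice)

for epoch in range(200000):
    errors = 0
    for x1, x2, t in data:
        y = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
        if y != t:
            # wrong answer: move the boundary toward the correct side
            w1 += lr * (t - y) * x1
            w2 += lr * (t - y) * x2
            b  += lr * (t - y)
            errors += 1
    if errors == 0:          # a full pass with no mistakes: done
        break

print(w1, w2, b)
```

Because this data is linearly separable (for example, a boundary near 95,000 subscribers already works), the loop is guaranteed to stop making updates after finitely many passes.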
A simple perceptron can only separate classes that can be divided by a single straight line.
There are cases where that is impossible. For example, what would you do with the following data?
This time we judge people who excel in exactly one art (not both). The criterion is XOR: a person counts as a genius if one indicator is true but not both.

References:
https://wwws.kobe-c.ac.jp/deguchi/sc180/logic/gate.html
https://the01.jp/p0004619/
User name | Test score (30-40 is average) | Talented (1 = yes) |
---|---|---|
aaa | 3 | 0 |
bbb | 70 | 0 |
ccc | 92 | 1 |
ddd | 6 | 0 |
eee | 97 | 1 |
User name | Painting score (out of 100) | Talented (1 = yes) |
---|---|---|
aaa | 100 | 1 |
bbb | 40 | 0 |
ccc | 98 | 1 |
ddd | 92 | 1 |
eee | 70 | 0 |
In this case, the users who excel in exactly one art are aaa, ddd, and eee. Plotted, the data looks like this:
Looking at the figure above, we want to classify the ○ points and the × points, but when you actually try to draw a grouping line, no single straight line separates them. I won't program it fully this time, but it's good to just remember that this case can be handled by connecting simple perceptrons together.
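As a sketch of what "connecting simple perceptrons" means, here is one standard way to build XOR from three simple perceptrons: NAND and OR in a first layer, feeding an AND in a second layer. The specific weights and thresholds below are just one working choice, not the only one:

```python
def step(x1, x2, w1, w2, theta):
    # one simple perceptron: weighted sum compared against a threshold
    return 1 if w1 * x1 + w2 * x2 > theta else 0

def NAND(x1, x2):
    return step(x1, x2, -0.5, -0.5, -0.7)

def OR(x1, x2):
    return step(x1, x2, 0.5, 0.5, 0.2)

def AND(x1, x2):
    return step(x1, x2, 0.5, 0.5, 0.7)

def XOR(x1, x2):
    # second layer: combine the outputs of the first-layer perceptrons
    return AND(NAND(x1, x2), OR(x1, x2))

for pair in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(pair, XOR(*pair))
```

No single perceptron can compute XOR, but this two-layer arrangement can, which is the whole point of the multilayer perceptron.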
To summarize: a simple perceptron can draw a 0/1 boundary as a single straight line, while a multilayer perceptron can draw curved 0/1 boundaries, though only ones that can be built by combining simple perceptrons.
So can a perceptron do everything? No. For example:

- closed boundaries
- multiple separate boundaries
- bent boundaries that cannot be expressed with XOR

Because such cases exist, methods that could handle them evolved out of the perceptron; that is how the field's history unfolded.
The shape looks like this (figure: a diagram of multiple connected perceptrons). The code below plots the data points from the tables above:
```python
import matplotlib.pyplot as plt

# ○ = talented (excels in exactly one art), × = not talented
plt.plot(3, 100, marker='o')   # aaa
plt.plot(70, 40, marker='x')   # bbb
plt.plot(92, 98, marker='x')   # ccc
plt.plot(6, 92, marker='o')    # ddd
plt.plot(97, 70, marker='o')   # eee
plt.show()
```