Last time: University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (13)
https://github.com/legacyworld/sklearn-basic
This time, for the first time in this series, I didn't use scikit-learn. The commentary on YouTube is in the 7th lecture (1), at around 25 minutes 50 seconds. The problem itself is not difficult, but there is one caveat.
```math
w^T = (w_0, w_1, w_2),\quad
X = \begin{pmatrix}
x_{1,1} & x_{1,2} & \cdots & x_{1,10} \\
x_{2,1} & x_{2,2} & \cdots & x_{2,10} \\
1 & 1 & \cdots & 1
\end{pmatrix}
```

Then $w^T X = 0$ is expressed on the $(x_1, x_2)$ plane as $w_0 x_1 + w_1 x_2 + w_2 = 0$. Rearranging:

```math
x_2 = -\frac{w_2}{w_1} - \frac{w_0}{w_1}x_1
```
Normally $w_0$ is the intercept term, but in this problem it is $w_2$. That is the one caveat.
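Before the full script, here is a quick sanity check of the rearranged boundary equation: a minimal sketch (not part of the assignment code) that plugs points on the line back into $w^T x$, using the first weight vector from the problem, $w = (6, 3, -2)$.

```python
import numpy as np

w = np.array([6, 3, -2])            # (w_0, w_1, w_2), intercept last
x1 = np.linspace(-1.0, 2.0, 5)
x2 = -w[2]/w[1] - w[0]/w[1]*x1      # x_2 = -w_2/w_1 - (w_0/w_1) x_1
pts = np.vstack([x1, x2, np.ones_like(x1)])
print(np.dot(w, pts))               # ~0 everywhere: the points lie on the boundary
```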
Here is the source code.

```python:Homework_6.4.py
import numpy as np
import matplotlib.pyplot as plt

# Sigmoid function
def sigmoid(w, x):
    return 1 / (1 + np.exp(-np.dot(w, x)))

# Classify at the 0.5 threshold
def classification(a):
    return 1 if a > 0.5 else 0

X = np.array([[1.5,-0.5],[-0.5,-1.0],[1.0,-2.5],[1.5,-1.0],[0.5,0.0],
              [1.5,-2.0],[-0.5,-0.5],[1.0,-1.0],[0.0,-1.0],[0.0,0.5]])
# Since the intercept comes last, append a 1 to the end of each sample
X = np.concatenate([X, np.ones(10).reshape(-1, 1)], 1)
y = np.array([1,0,0,1,1,1,0,1,0,0])
w = np.array([[6,3,-2],[4.6,1,-2.2],[1,-1,-2]])
# Create logit contour plots matching the explanation
fig = plt.figure(figsize=(20, 10))
ax = [fig.add_subplot(2, 2, i+1) for i in range(4)]
ax[0].scatter(X[:, 0], X[:, 1])
x_plot = np.linspace(-1.0, 2.0, 100)
ax[0].set_ylim(-3, 1)
for i in range(3):
    # Decision boundary: x_2 = -w_2/w_1 - (w_0/w_1) * x_1
    y_plot = -w[i][2]/w[i][1] - w[i][0]/w[i][1]*x_plot
    ax[0].plot(x_plot, y_plot, label=f"w{i+1}")
ax[0].set_title("Sample Distribution")
ax[0].legend()
ax[0].grid(True)
# Mesh data
xlim = [-2.0, 2.0]
ylim = [-3.0, 3.0]
n = 100
xx = np.linspace(xlim[0], xlim[1], n)
yy = np.linspace(ylim[0], ylim[1], n)
YY, XX = np.meshgrid(yy, xx)
xy = np.vstack([XX.ravel(), YY.ravel(), np.ones(n**2)])
for i in range(3):
    Z = sigmoid(w[i], xy).reshape(XX.shape)
    interval = np.arange(0, 1, 0.01)
    # 0 is purple, 1 is red, with a gradient in between
    m = ax[i+1].contourf(XX, YY, Z, interval, cmap="rainbow", extend="both")
    ax[i+1].scatter(X[:, 0], X[:, 1], c=y)   # overlay the samples, colored by label
    ax[i+1].set_title(f"w{i+1} Logit Contour")
    fig.colorbar(mappable=m, ax=ax[i+1])     # colorbar for the contour (0 to 1)
plt.savefig("6.4.png")
# Calculation of w^T x
for index, w_i in enumerate(w):
    print(f"w{index+1} {np.dot(w_i, X.T)}")
# Calculation of sigmoid(w^T x)
np.set_printoptions(formatter={'float': '{:.2e}'.format})
for index, w_i in enumerate(w):
    print(f"w{index+1} {sigmoid(w_i, X.T)}")
# Classification
for index, w_i in enumerate(w):
    print(f"w{index+1} {np.vectorize(classification)(sigmoid(w_i, X.T))}")
# Correct answer rate
for index, w_i in enumerate(w):
    print(f"w{index+1} {np.count_nonzero(np.vectorize(classification)(sigmoid(w_i, X.T)) == y)*10}%")
```
These are the logit contours shown in the commentary (saved as 6.4.png).
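One aside on the sigmoid implementation above: `np.exp(-np.dot(w, x))` can overflow for activations with large magnitude. That is harmless here, where $|w^T x|$ never exceeds 8, but a numerically stable variant is common practice. A minimal sketch (not part of the assignment):

```python
import numpy as np

def sigmoid_stable(z):
    # Evaluate exp only on the side where its argument is non-positive,
    # so the exponential can never overflow
    z = np.asarray(z, dtype=float)
    out = np.empty_like(z)
    pos = z >= 0
    out[pos] = 1 / (1 + np.exp(-z[pos]))
    ez = np.exp(z[~pos])
    out[~pos] = ez / (1 + ez)
    return out
```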
Here are the execution results.
$w_i^T x_j$:

```
w1 [ 5.5 -8.  -3.5  4.   1.   1.  -6.5  1.  -5.  -0.5]
w2 [ 4.2 -5.5 -0.1  3.7  0.1  2.7 -5.   1.4 -3.2 -1.7]
w3 [ 0.  -1.5  1.5  0.5 -1.5  1.5 -2.   0.  -1.  -2.5]
```
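As a quick check of the first entry in the w1 row: the first sample is $x_1 = (1.5, -0.5, 1)$ (with the appended 1), so

```math
w_1^T x_1 = 6 \cdot 1.5 + 3 \cdot (-0.5) + (-2) \cdot 1 = 9 - 1.5 - 2 = 5.5
```

which matches the output above.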
$\sigma(w_i^T x_j)$:

```
w1 [9.96e-01 3.35e-04 2.93e-02 9.82e-01 7.31e-01 7.31e-01 1.50e-03 7.31e-01 6.69e-03 3.78e-01]
w2 [9.85e-01 4.07e-03 4.75e-01 9.76e-01 5.25e-01 9.37e-01 6.69e-03 8.02e-01 3.92e-02 1.54e-01]
w3 [5.00e-01 1.82e-01 8.18e-01 6.22e-01 1.82e-01 8.18e-01 1.19e-01 5.00e-01 2.69e-01 7.59e-02]
```
Classification result of $x_j$ by the model $\sigma(w_i^T x_j)$:
```
w1 [1 0 0 1 1 1 0 1 0 0]
w2 [1 0 0 1 1 1 0 1 0 0]
w3 [0 0 1 1 0 1 0 0 0 0]
```
Correct answer rate:

```
w1 100%
w2 100%
w3 60%
```
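Note that `np.count_nonzero(...) * 10` converts to a percentage only because there happen to be exactly 10 samples. A sample-count-independent version (a sketch reusing the `sigmoid`, `classification`, `w`, `X`, and `y` defined above):

```python
# Accuracy as the mean of correct predictions, independent of sample count
for index, w_i in enumerate(w):
    pred = np.vectorize(classification)(sigmoid(w_i, X.T))
    print(f"w{index+1} {np.mean(pred == y) * 100:.0f}%")
```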
Past articles in this series:

- University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (1)
- University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (2)
- University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (3)
- University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (4)
- University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (5)
- University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (6)
- University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (7) Make your own steepest descent method
- University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (8) Make your own stochastic steepest descent method
- University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (9)
- University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (10)
- University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (11)
- University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (12)

https://github.com/legacyworld/sklearn-basic
https://ocw.tsukuba.ac.jp/course/systeminformation/machine_learning/