○ The main points of this article
These are my notes from studying linear regression.
Linear regression:
・An algorithm for predicting regression problems
・Expressed as a straight line y = b + ax
・Finds the parameters that minimize the error (loss) between each data point and the predicted line (see the sketch below)
・Uses the mean squared error to measure that error (the mean of the squared distances between the line and each data point)
・Can model a relationship in which the objective variable grows (or shrinks) as the explanatory variable grows
・Supervised learning
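As a rough illustration of the mean squared error mentioned above, here is a minimal sketch in plain NumPy. The candidate parameters a and b and the toy data are made up for the example, not taken from the article.

import numpy as np

#Toy data (illustrative values only)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.6, 4.1, 4.4, 5.1])

#A candidate line y = b + a*x (parameters chosen arbitrarily for the example)
a, b = 0.5, 3.0
y_hat = b + a * x

#Mean squared error: the mean of the squared distances
#between the predicted line and each data point
mse = np.mean((y - y_hat) ** 2)
print(mse)

Linear regression searches for the a and b that make this value as small as possible.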
Linear regression model
#Linear regression
from sklearn.linear_model import LinearRegression
import matplotlib.pyplot as plt
%matplotlib inline

#Training data
X = [[10.0], [8.0], [13.0], [9.0], [11.0], [14.0], [6.0], [4.0], [12.0], [7.0], [5.0]]
y = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]

#Model creation, training, and prediction
model = LinearRegression()
model.fit(X, y)            #fit the regression line to the training data
print(model.coef_)         #slope of the regression line
print(model.intercept_)    #intercept of the regression line
print('y = 0.5x + 3')      #fitted line, with the parameters rounded
y_pred = model.predict(X)  #predictions for the training inputs

#Plot the data and the fitted line
fig, ax = plt.subplots()
ax.set_xlabel("X")
ax.set_ylabel("y")
ax.scatter(X, y, color='blue', marker='s', label='data')
ax.plot(X, y_pred, "r-", label='regression line')
ax.legend()
plt.show()
result
[0.50009091]
3.0000909090909094
y = 0.5x + 3
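To connect this result back to the loss described earlier, here is a small follow-up sketch. It assumes the model, X, y, and y_pred from the code above are still in scope; the new input value 15.0 is just an illustrative example.

from sklearn.metrics import mean_squared_error

#Mean squared error of the fitted line on the training data
print(mean_squared_error(y, y_pred))

#Predict the objective variable for a new explanatory value (illustrative input)
print(model.predict([[15.0]]))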
・Although the amount of data is small, I think the fitted line represents the data well.
・Other algorithms for predicting regression problems include support vector machines, regularized regression, and neural networks, but this one is the easiest to understand.
・I think it is easy to grasp because I learned about linear functions as a student. I would like to see these algorithms added to the standard curriculum.