[Mathematics] Let's visualize what are eigenvalues and eigenvectors

The concept of **eigenvalues** comes from linear algebra. It is hard to picture at first, but it is an important concept that appears often in statistics, so as part of this visualization series I would like to explain it with animated graphs.

By the end, you should be able to read a graph like this one:

_eigen_value-compressor-2.gif

1. What are eigenvalues and eigenvectors?

First of all, what are eigenvalues and eigenvectors? Expressed as a formula, they are defined as follows.


A{\bf x} = \lambda {\bf x}

A vector ${\bf x} \neq {\bf 0}$ that, when multiplied by the matrix $A$, is only scaled to $\lambda$ times its length (its direction does not change) is called an **eigenvector**, and the scaling factor $\lambda$ is called an **eigenvalue**. That definition alone probably doesn't tell you much, so let me explain it graphically right away.
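As a quick warm-up example before the pictures: for a diagonal matrix, the coordinate axes themselves are eigenvectors.

\left[
\begin{array}{cc}
2 & 0 \\
0 & 3 \\
\end{array}
\right]
\left[
\begin{array}{c}
1 \\
0 \\
\end{array}
\right]
=
\left[
\begin{array}{c}
2 \\
0 \\
\end{array}
\right]
= 2
\left[
\begin{array}{c}
1 \\
0 \\
\end{array}
\right]

so $(1, 0)$ is an eigenvector with eigenvalue $2$, and in the same way $(0, 1)$ is an eigenvector with eigenvalue $3$.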

2. Linear transformation by matrix A

Before explaining eigenvalues and eigenvectors, let's look at linear transformations by a matrix. For example, suppose the matrix $A$ has the following components:

A = 
\left[
\begin{array}{cc}
2 & 1 \\
-0.5 & -1.5 \\
\end{array}
\right]

Now, if we compute $A{\bf x}$, we get


A{\bf x} = \left[
\begin{array}{cc}
2 & 1 \\
-0.5 & -1.5 \\
\end{array}
\right]
\left[
\begin{array}{c}
x_1 \\
x_2 \\
\end{array}
\right]
=
\left[
\begin{array}{c}
2x_1 + x_2 \\
-0.5x_1 - 1.5x_2 \\
\end{array}
\right]

For example, if ${\bf x} = (1, 1)$, then

A{\bf x} = \left[
\begin{array}{cc}
2 & 1 \\
-0.5 & -1.5 \\
\end{array}
\right]
\left[
\begin{array}{c}
1 \\
1 \\
\end{array}
\right]
=
\left[
\begin{array}{c}
2 + 1 \\
-0.5 - 1.5 \\
\end{array}
\right]
=
\left[
\begin{array}{c}
3 \\
-2 \\
\end{array}
\right]

So multiplying ${\bf x} = (1, 1)$ by the matrix $A$ yields $(3, -2)$. Illustrated as a picture:

eigen_val02-compressor.png

As the picture shows, the blue vector $(1, 1)$ has been rotated and stretched. In other words, multiplying by this matrix can be thought of as an operation that **"rotates and stretches the vector"**.

Let's check with Python code that the calculation is correct. Looking at the resulting graph,

eigen_val07-compressor.png

the blue line extending from the origin is the vector $(1, 1)$, and you can see that the red line after the transformation extends to $(3, -2)$. The code is below.

%matplotlib inline
import matplotlib.pyplot as plt
import numpy as np
from matplotlib import animation as ani

plt.figure(figsize=(8,8))

A = [[ 2,    1],
     [-0.5, -1.5]]
x = [1, 1]

a = np.dot(A, x)   # computing Ax here

# vector before the transformation (blue) and after it (red)
plt.plot([0, x[0]], [0, x[1]], "b", zorder=100)
plt.plot([0, a[0]], [0, a[1]], "r", zorder=100)

# coordinate axes and plot range
plt.plot([-15, 50], [0, 0], "k", linewidth=1)
plt.plot([0, 0], [-40, 40], "k", linewidth=1)
plt.xlim(-1, 4)
plt.ylim(-3, 2)
plt.show()
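To put numbers on the "rotate and stretch" idea, here is a small extra check (not part of the original figures) that computes the stretch factor and the rotation angle between ${\bf x}$ and $A{\bf x}$ with numpy:

A = np.array([[ 2,    1],
              [-0.5, -1.5]])
x = np.array([1, 1])
a = A.dot(x)

# stretch factor: ratio of the lengths |Ax| / |x|
stretch = np.linalg.norm(a) / np.linalg.norm(x)

# rotation: difference between the angles of Ax and x, in degrees
angle_x  = np.degrees(np.arctan2(x[1], x[0]))
angle_Ax = np.degrees(np.arctan2(a[1], a[0]))

print("stretch factor :", stretch)             # roughly 2.55
print("rotation angle :", angle_Ax - angle_x)  # roughly -78.7 degrees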

Similarly, let's plot 100 points on a square grid and transform them with the matrix $A$; the square is mapped onto a parallelogram. Each point of the blue square moves to the corresponding point of the red parallelogram.

eigen_val04-compressor.png

I plotted the numbers so that the correspondence between the blue and red dots can be easily understood.

eigen_val05-compressor.png

Here is the code to draw the above graph.

plt.figure(figsize=(10,10))
n = 10
xmin, xmax = -5, 35
ymin, ymax = -20, 10

A = [[ 2,    1],
     [-0.5, -1.5]]

# plot each grid point (blue) and its image under A (red)
for i in range(n):
    for j in range(n):
        x = j
        y = i

        a = np.dot(A, [x, y])

        plt.scatter(x,    y,    facecolor="b", edgecolors='none', alpha=.7, s=20)
        plt.scatter(a[0], a[1], facecolor="r", edgecolors='none', alpha=.7)

# coordinate axes and plot range
plt.plot([xmin, xmax], [0, 0], "k", linewidth=1)
plt.plot([0, 0], [ymin, ymax], "k", linewidth=1)
plt.xlim(xmin, xmax)
plt.ylim(ymin, ymax)
plt.show()

plt.figure(figsize=(10,10))
n = 10
xmin, xmax = -5, 35
ymin, ymax = -20, 10

A = [[ 2,    1],
     [-0.5, -1.5]]

# same grid, but label each point and its image with its index number
for i in range(n):
    for j in range(n):
        x = j
        y = i

        a = np.dot(A, [x, y])

        loc_adjust = .2  # adjusting the display position of the labels
        plt.text(x - loc_adjust,    y - loc_adjust,    "%d" % (i*n + j), color="blue")
        plt.text(a[0] - loc_adjust, a[1] - loc_adjust, "%d" % (i*n + j), color="red")

# coordinate axes and plot range
plt.plot([xmin, xmax], [0, 0], "k", linewidth=1)
plt.plot([0, 0], [ymin, ymax], "k", linewidth=1)
plt.xlim(xmin, xmax)
plt.ylim(ymin, ymax)
plt.show()
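As a side note, the double loop above can be avoided by stacking all the grid points into one array and applying the matrix once; a minimal vectorized sketch (my own variant, reusing numpy and matplotlib from the earlier imports):

n = 10
xs, ys = np.meshgrid(np.arange(n), np.arange(n))
points = np.vstack([xs.ravel(), ys.ravel()])   # shape (2, 100), one column per point

A = np.array([[ 2,    1],
              [-0.5, -1.5]])

transformed = A.dot(points)                    # one matrix product transforms every point

plt.figure(figsize=(10, 10))
plt.scatter(points[0], points[1], facecolor="b", edgecolors="none", alpha=.7, s=20)
plt.scatter(transformed[0], transformed[1], facecolor="r", edgecolors="none", alpha=.7)
plt.xlim(-5, 35)
plt.ylim(-20, 10)
plt.show()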

3. Visualization of eigenvalues and eigenvectors

Building on the previous section, the eigenvalue/eigenvector equation

A{\bf x} = \lambda {\bf x}

describes the combinations of $A$ and ${\bf x}$ for which the transformation by the matrix $A$ does not rotate ${\bf x}$ at all, but only stretches it so that just its length changes, as in the graph below.

eigen01-compressor.png

Let's animate this. First, prepare a vector of length 1 (the blue line) as the vector before the transformation, and rotate it through 360 degrees. For each position, draw the vector obtained by the linear transformation with the matrix $A$ (the red line). The moments when these two lines lie along one straight line are exactly when the blue vector is an eigenvector: the blue line ${\bf x}$ and the red line $A{\bf x}$ point along the same line and differ only in length, in other words $A{\bf x} = \lambda {\bf x}$.

For clarity, the lines are drawn thicker whenever they line up along a straight line.

_eigen_value-compressor-2.gif
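The animation itself is just a loop over angles, but the same idea also works as a plain numerical search: sweep a unit vector through the half circle and look for the angles where ${\bf x}$ and $A{\bf x}$ become parallel (their 2D cross product changes sign). This is a rough sketch I added, not the actual code behind the gif:

A = np.array([[ 2,    1],
              [-0.5, -1.5]])

def cross2(theta):
    """2D cross product of x and Ax; it is zero exactly when they are parallel."""
    x = np.array([np.cos(theta), np.sin(theta)])
    ax = A.dot(x)
    return x[0] * ax[1] - x[1] * ax[0]

degs = np.arange(0, 180, 0.01)                  # directions repeat after 180 degrees
vals = np.array([cross2(np.radians(d)) for d in degs])

# a sign change between neighbouring angles brackets an eigen-direction
for i in np.where(np.sign(vals[:-1]) != np.sign(vals[1:]))[0]:
    theta = np.radians(degs[i])
    x = np.array([np.cos(theta), np.sin(theta)])
    lam = np.dot(A.dot(x), x)                   # length of Ax along x = eigenvalue
    print("direction (deg):", degs[i], " eigenvalue:", lam)

This should print directions near 106.6 and 171.5 degrees, with eigenvalues close to -1.35 and 1.85.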

You can calculate eigenvalues and eigenvectors with numpy, so let's try it.

la, v = np.linalg.eig(A)
print("la", la)
print("v", v)

The result is below.

output


la [ 1.85078106 -1.35078106]
v [[ 0.98904939 -0.28597431]
   [-0.1475849   0.95823729]]

Two eigenvalues $\lambda_1 = 1.85078106, \lambda_2 = -1.35078106$ and two eigenvectors $x_1 = (0.98904939, -0.1475849), x_2 = (-0.28597431, 0.95823729)$ were obtained. Let's take one frame out of the animation above and compare. The title part "original (blue): (-0.989, -0.149)" matches $x_1$ up to sign (an eigenvector multiplied by $-1$ is still an eigenvector), and the remaining small error comes from the coarse angular step of the animation; a finer step would match more precisely. Also, "[length] red: 1.849" matches $\lambda_1$ :smile:

eigen_val11-compressor.png

So, in two dimensions, if you try vectors in every direction through 360 degrees from the origin, an eigenvector is a vector in a direction that does not change under the transformation, only its length does; and the eigenvalue is, for the vector in that direction, the ratio of its length after the transformation to its length before :blush:
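As a quick sanity check (my own addition), we can confirm that each column of v returned by np.linalg.eig really satisfies the defining equation $A{\bf x} = \lambda {\bf x}$:

A = np.array([[ 2,    1],
              [-0.5, -1.5]])
la, v = np.linalg.eig(A)

for i in range(len(la)):
    lhs = A.dot(v[:, i])              # A x
    rhs = la[i] * v[:, i]             # lambda x
    print(i, np.allclose(lhs, rhs))   # both lines should print True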

4. Example of application in statistics

I would like to give an example of using eigenvalues and eigenvectors in statistics. First, take a look at this graph.

eigen_val10-compressor.png

The blue points in the upper panel are 1000 random numbers drawn from a two-dimensional normal distribution. Computing the variance-covariance matrix of this data gives a 2x2 matrix, and computing the eigenvectors of that matrix gives two two-dimensional eigenvectors; placing them side by side forms a matrix again. The red points in the lower panel are obtained by shifting the original data so that its center is at the origin and then multiplying it by this matrix of eigenvectors. The transformation rotates the cloud of points, without changing its shape, so that the long axis of the ellipse becomes horizontal. This operation is, in fact, the analysis method known as principal component analysis in statistics. (The article I wrote about principal component analysis is here.)

In this way, the idea of eigenvalues and eigenvectors is also used in the field of statistics.

The Python code for this operation is below.

np.random.seed(0)
xmin = -10
xmax =  10
ymin = -10
ymax =  10

#average
mu = [2,2]
#Covariance
cov = [[3,2.3],[1.8,3]]

#Random number generation of bivariate normal distribution
x, y = np.random.multivariate_normal(mu,cov,1000).T

av_x = np.average(x)
av_y = np.average(y)

#Calculate the variance-covariance matrix from the data
S = np.cov(x, y)
print "S", S
    
#Calculate eigenvalues and eigenvectors
la, v = np.linalg.eig(S)

print "la", la
print "v", v

#Shift the data so that its center is at the origin
x2 = x - av_x
y2 = y - av_y

#Multiply the centered data by the matrix formed by arranging the eigenvectors
a1 = np.array([np.dot(v, [x2[i],y2[i]]) for i in range(len(x))])

#Drawing a graph
plt.figure(figsize=(8, 13))

#Original data plot
plt.subplot(211)
plt.xlim(xmin, xmax)
plt.ylim(ymin, ymax)
plt.scatter(x, y, alpha=0.5, zorder=100)
plt.plot([0, 0], [ymin, ymax], "k")
plt.plot([xmin, xmax], [0, 0], "k")

#Plot of data multiplied by a matrix created by arranging eigenvectors
plt.subplot(212)
plt.xlim(xmin, xmax)
plt.ylim(ymin, ymax)
plt.scatter(a1[:,0], a1[:,1], c="r", alpha=0.5, zorder=100)
plt.plot([0, 0], [ymin, ymax], "k")
plt.plot([xmin, xmax], [0, 0], "k")
plt.show()

output


S [[ 2.6774093   1.93221432]
   [ 1.93221432  3.05844013]]
la [ 0.92634075  4.80950869]
v [[-0.74098708 -0.67151928]
   [ 0.67151928 -0.74098708]]

The determinant of v works out to 1 and, since S is symmetric, the columns of v are orthonormal, so this transformation is a pure rotation: lengths are not changed.
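To back this up with numbers, here is a short check I added (it assumes S, la and v from the code above are still in scope): v should be orthogonal, its determinant should be 1, and rotating S into the eigenvector basis should give a diagonal matrix whose entries are exactly the eigenvalues.

# the transpose of v is its inverse (orthogonal matrix)
print(np.allclose(np.dot(v.T, v), np.eye(2)))    # expected: True

# determinant +1 means a pure rotation: no reflection, no change of length
print(np.linalg.det(v))                          # expected: about 1.0

# v.T S v is diagonal, with the eigenvalues la on the diagonal
print(np.dot(v.T, np.dot(S, v)))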

All the Python code for this article can be found [here](https://gist.github.com/matsuken92/47e5bf7b49e01f8a4a9d).
