Linear Algebra for Programming Chapter 4 (Eigenvalues, Eigenvectors)

When is this useful?

I want to find out if the autoregressive model goes out of control!

What is an autoregressive model?

For example, suppose today's value $ξ(t)$ is determined from yesterday's $ξ(t-1)$, the day before yesterday's $ξ(t-2)$, three days ago's $ξ(t-3)$, and a noise term $u(t)$ as follows:

ξ(t)=-0.5ξ(t-1)+0.34ξ(t-2)+0.08ξ(t-3)+2u(t)

Ignoring the noise term and stacking the recent values into the vector $x(t) = (ξ(t), ξ(t-1), ξ(t-2))^T$, this can be expressed with a matrix:

x(t)=
\begin{pmatrix}
-0.5&0.34&0.08\\
1&0&0\\
0&1&0
\end{pmatrix} \times
x(t-1)

From here on, we work with the generalized form

x(t)=Ax(t-1)
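As a quick illustration, here is a minimal sketch that just iterates this recurrence with the companion matrix from above (the initial state and the number of steps are arbitrary choices for the demonstration, not values from the text):

import numpy as np

# Companion matrix of the AR(3) model above (noise term omitted)
A_ar = np.array([[-0.5, 0.34, 0.08],
                 [ 1.0, 0.0,  0.0 ],
                 [ 0.0, 1.0,  0.0 ]])

x = np.array([1.0, 0.0, 0.0])  # arbitrary initial state (ξ(0), ξ(-1), ξ(-2))
for t in range(50):
    x = A_ar @ x               # x(t) = A x(t-1)

print(x)  # the values shrink toward zero, so this particular model does not run away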

In the one-dimensional case

x(t)=7x(t-1)

the value is multiplied by 7 every time $t$ increases by one, so

x(t)=7^tx(0)

As $t → ∞$, we have $7^t → ∞$, so the value runs out of control. On the other hand, if the coefficient were $0.2$, there would be no runaway. Easy enough.
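A quick numerical check (the exponent $t = 20$ below is an arbitrary choice):

t = 20
print(7 ** t)    # about 8.0e16 -- blows up
print(0.2 ** t)  # about 1.0e-14 -- decays toward zero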

For a diagonal matrix

x(t)=
\begin{pmatrix}
5&0&0\\
0&-3&0\\
0&0&0.8
\end{pmatrix} \times
x(t-1)

the situation is just as easy: a power of a diagonal matrix is simply the componentwise power of its diagonal entries, so

x(t)=
\begin{pmatrix}
5&0&0\\
0&-3&0\\
0&0&0.8
\end{pmatrix}^t
x(0)

can be written directly. Whether each component runs away can then be read off at a glance: the $5$ and $-3$ components blow up, while the $0.8$ component decays.
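A minimal NumPy check of this fact (the exponent below is an arbitrary choice):

import numpy as np

D = np.diag([5.0, -3.0, 0.8])
t = 4  # arbitrary exponent

# The matrix power of a diagonal matrix equals the elementwise power of its diagonal
print(np.linalg.matrix_power(D, t))
print(np.diag(np.diag(D) ** t))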

Generalization

If the matrix is not diagonal, we force it into diagonal form. Take some invertible (regular) matrix $P$ and transform the original variable $x(t)$ into a new variable $y(t)$ via

x(t)=Py(t)

How does the recurrence $x(t)=Ax(t-1)$ transform under this change of variables?

Since $x(t)=Py(t)$, we have

y(t)=P^{-1}x(t)=P^{-1}Ax(t-1)\\
      =P^{-1}A(Py(t-1))=(P^{-1}AP)y(t-1)

Writing $Λ = P^{-1}AP$, this becomes

y(t)=Λy(t-1)

If this $Λ$ is a diagonal matrix, then

y(t)=Λ^ty(0)

gives $y(t)$ immediately, and then

x(t)=Py(t)=PΛ^ty(0)=PΛ^tP^{-1}x(0)

recovers $x(t)$ as well.

This procedure of "bringing a convenient $P$ so that $P^{-1}AP$ becomes a diagonal matrix" is called diagonalization.
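To make the procedure concrete, here is a minimal sketch with a hand-picked $2 \times 2$ example (the matrix, its eigenpairs, and the exponent are illustrative choices, not from the text): $B=\begin{pmatrix}4&1\\2&3\end{pmatrix}$ has eigenvalues $5$ and $2$ with eigenvectors $(1,1)^T$ and $(1,-2)^T$, so building $P$ from those eigenvectors should give $PΛ^tP^{-1}=B^t$.

import numpy as np

B = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0,  1.0],    # columns are the eigenvectors (1, 1) and (1, -2)
              [1.0, -2.0]])
L = np.diag([5.0, 2.0])       # corresponding eigenvalues on the diagonal

t = 5  # arbitrary exponent
lhs = P @ np.linalg.matrix_power(L, t) @ np.linalg.inv(P)
rhs = np.linalg.matrix_power(B, t)
print(np.allclose(lhs, rhs))  # True: P Λ^t P^{-1} equals B^t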

What we want to do

P^{-1}AP≡Λ=diag(λ_1,...,λ_n)

Our goal is to find such a $P$. In general, for a square matrix $A$,

Ap=λp\\
p≠o

the number $λ$ and the nonzero vector $p$ satisfying these conditions are called an eigenvalue and an eigenvector of $A$, respectively.

Calculate eigenvalues and eigenvectors in Python

NumPy makes this easy to compute. Let's try it with the following matrix.

A=
\begin{pmatrix}
5&3&-4\\
6&8&-8\\
6&9&-9
\end{pmatrix}

import numpy as np

def get_eigenpairs(arr):
    # np.linalg.eig returns the eigenvalues as its first return value
    # and the corresponding eigenvectors as the *columns* of the second
    w, v = np.linalg.eig(arr)
    eigenpairs = []

    # eig normalizes each eigenvector to unit length, so rescale each
    # column by its smallest nonzero absolute component to get nicer values
    for i, val in enumerate(w):
        vec = v[:, i] / np.min(np.abs(v[:, i][v[:, i] != 0]))
        eigenpairs.append((val, vec))
    eigenpairs.sort(key=lambda x: x[0])
    return eigenpairs

A = np.array([[5,3,-4],[6,8,-8],[6,9,-9]])

get_eigenpairs(A)
# [(-1.0, array([1., 2., 3.])),
#  ( 2.0, array([-1., -3., -3.])),
#  ( 3.0, array([1., 2., 2.]))]

We got three eigenvectors, corresponding to the eigenvalues $-1, 2, 3$ respectively. Placing the three eigenvectors side by side as columns gives

P=
\begin{pmatrix}
1&-1&1\\
2&-3&2\\
3&-3&2
\end{pmatrix}

Let's check if $ P ^ {-1} AP $ is a diagonal matrix.

P = np.empty((3, 3))

# Store the three eigenvectors as the columns of a 3x3 matrix
for i, val in enumerate(get_eigenpairs(A)):
    P[:, i] = val[1]

# P^{-1} A P, rounded to the nearest integer for readability
np.rint(np.dot(np.dot(np.linalg.inv(P), A), P))
#array([[-1.,  0.,  0.],
#       [ 0.,  2., -0.],
#       [ 0.,  0.,  3.]])

This confirms that $P^{-1}AP = diag(-1,2,3)$. Since $x(t)=PΛ^tP^{-1}x(0)$, the model runs away exactly when some eigenvalue has absolute value greater than $1$, so checking an autoregressive model for runaway is now straightforward, as the sketch below shows.
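Finally, here is a minimal sketch applying this criterion to the autoregressive model from the beginning (the eigenvalues in the comment are approximate):

import numpy as np

# Companion matrix of the AR(3) model from the beginning
A_ar = np.array([[-0.5, 0.34, 0.08],
                 [ 1.0, 0.0,  0.0 ],
                 [ 0.0, 1.0,  0.0 ]])

eigvals = np.linalg.eigvals(A_ar)
print(eigvals)                      # approximately 0.5, -0.8, -0.2 (in some order)
print(np.max(np.abs(eigvals)) < 1)  # True: every |λ| < 1, so no runaway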
