Given the matrix $\mathbf{A} = \begin{bmatrix}1 & 4 \\ 6 & -1 \end{bmatrix}$, the vector $\mathbf{x} = \begin{bmatrix} 1\\1 \end{bmatrix}$ and the scalar $\lambda=5$: are $\mathbf{x}$ and $\lambda$ an eigenvector and eigenvalue of $\mathbf{A}$? Explain.
Khan Academy video on finding eigenvectors and eigenvalues.
For the following matrix, find the eigenvalues and the eigenvectors:
$\begin{bmatrix} 2 & -7 \\ 3 & -8 \end{bmatrix}$
(From EMEA 13.10.1)
If you want an alternative explanation of eigenvalues, you can watch this video from Khan Academy.
We'll start with a demonstration of the simplest possible geometric interpretation of eigenvalues and eigenvectors, from PPNM. In the first video, we considered what an eigenvector means geometrically in three dimensions: it is the vector you rotate around, which itself does not change other than, potentially, stretching higher into the air.
In two dimensions we have a similar idea.
We start by creating our own 2-d vector plotting function:
import numpy as np
import matplotlib.pyplot as plt
plt.style.use('seaborn-poster') #using a certain plot style
%matplotlib inline
def plot_vect(x, b, xlim, ylim):
    '''
    Plot two vectors:
    x - the original vector
    b - the transformed vector
    xlim - the limits for the x-axis
    ylim - the limits for the y-axis
    '''
    plt.figure(figsize=(10, 6))
    plt.quiver(0, 0, x[0], x[1],
               color='k', angles='xy',
               scale_units='xy', scale=1,
               label='Original vector')
    plt.quiver(0, 0, b[0], b[1],
               color='g', angles='xy',
               scale_units='xy', scale=1,
               label='Transformed vector')
    plt.xlim(xlim)
    plt.ylim(ylim)
    plt.xlabel('X')
    plt.ylabel('Y')
    plt.legend()
    plt.show()
A few things to note here:
- The plt.quiver() function draws an arrow; here we use it to plot the two 2-d vectors, x and b.
- We use a named plot style. You can try playing with other styles.
Now we define a matrix A and a vector x, and define b as the matrix product Ax. Geometrically, we will see that this rotates and stretches the vector x:
A = np.array([[2, 0],[0, 1]])
x = np.array([[1],[1]])
b = np.dot(A, x)
plot_vect(x,b,(0,3),(0,2))
So, what can we say? Would it be possible to substitute a scalar multiplication for our matrix A here? No! Therefore, we can say that x is not an eigenvector.
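We can also make the "could we substitute a scalar?" question precise numerically. Below is a small helper of my own (not part of the original code) that checks whether b is a scalar multiple of x:

```python
import numpy as np

def is_scalar_multiple(x, b, tol=1e-10):
    '''Check whether b is a scalar multiple of x (2-d vectors).'''
    x = np.ravel(x)
    b = np.ravel(b)
    # in 2-d, x and b are parallel exactly when the cross product
    # x[0]*b[1] - x[1]*b[0] is (numerically) zero
    return abs(x[0] * b[1] - x[1] * b[0]) < tol

A = np.array([[2, 0], [0, 1]])
x = np.array([[1], [1]])
b = np.dot(A, x)
print(is_scalar_multiple(x, b))  # False: x is not an eigenvector
```

The same helper returns True for a genuine eigenvector such as [1, 0].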
Now let's try another vector:
x = np.array([[1], [0]])
b = np.dot(A, x)
plot_vect(x,b,(0,3),(-0.5,0.5))
Would it be possible to substitute a scalar for the matrix A here? Yes! That means the vector $\begin{bmatrix} 1\\ 0 \end{bmatrix}$ is an eigenvector of A.
Is it the only eigenvector of A? Try, for example, x = [2, 0]:
x = np.array([[2], [0]])
b = np.dot(A, x)
plot_vect(x,b,(0,6),(-0.5,0.5))
So here we see that a given matrix A can have many eigenvectors: any nonzero scalar multiple of an eigenvector is itself an eigenvector, with the same eigenvalue.
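A minimal sketch illustrating this point: scaling the eigenvector [1, 0] by any nonzero constant c still satisfies the eigenvalue equation Ax = 2x:

```python
import numpy as np

A = np.array([[2, 0], [0, 1]])
# scaling an eigenvector by any nonzero constant c leaves
# the eigenvalue equation A x = 2 x intact
for c in [1, 2, -3, 0.5]:
    x = c * np.array([1, 0])
    print(np.allclose(A @ x, 2 * x))  # True for every c
```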
We start by loading numpy along with a function implementing the QR decomposition:
import numpy as np
from numpy.linalg import qr
We will now manually work through the iterative process of finding eigenvalues using QR decomposition. We use the simplest possible matrix, a 2x2, but we could just as easily put in a larger matrix:
a = np.array([[0, 2],
[2, 3]])
q, r = qr(a)
print('Q:', q)
print('R:', r)
Q: [[ 0. -1.]
 [-1.  0.]]
R: [[-2. -3.]
 [ 0. -2.]]
a_new = np.dot(r, q)
print('QR:', a_new)
QR: [[3. 2.]
 [2. 0.]]
a = a_new
q, r = qr(a)
a_new = np.dot(r, q)
print('QR:', a_new)
QR: [[ 3.99999881e+00  2.44140567e-03]
 [ 2.44140567e-03 -9.99998808e-01]]
After only a few iterations we get something very close to:
$\begin{bmatrix} 4&0\\ 0&-1 \end{bmatrix}$
Looking at the diagonal, we can read off the eigenvalues: 4 and -1.
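The manual steps above can be wrapped in a loop. Here is a minimal sketch, restarting from the original matrix and using a fixed number of iterations (a tolerance test on the off-diagonal entries would also work):

```python
import numpy as np
from numpy.linalg import qr

# restart from the original 2x2 matrix and repeat the
# factor-then-recombine step a fixed number of times
a = np.array([[0., 2.],
              [2., 3.]])
for _ in range(20):
    q, r = qr(a)
    a = np.dot(r, q)

# the diagonal now approximates the eigenvalues 4 and -1
print(np.round(a, 6))
```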
Notice that we have used QR decomposition and an iterative process to find the eigenvalues, but not the eigenvectors. With the eigenvalues in place, however, we can set up a system of equations:
$\mathbf{A}\mathbf{x} = 4\mathbf{x}$
$\begin{bmatrix} 0&2\\ 2&3 \end{bmatrix} \begin{bmatrix} x_1\\ x_2 \end{bmatrix} = 4\begin{bmatrix} x_1\\ x_2 \end{bmatrix}$
So to find the eigenvectors, we just need to solve this equation.
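For instance, for $\lambda = 4$ one possible worked step is to subtract $4\mathbf{x}$ from both sides:

$$\begin{bmatrix} 0-4 & 2\\ 2 & 3-4 \end{bmatrix} \begin{bmatrix} x_1\\ x_2 \end{bmatrix} = \begin{bmatrix} -4 & 2\\ 2 & -1 \end{bmatrix} \begin{bmatrix} x_1\\ x_2 \end{bmatrix} = \begin{bmatrix} 0\\ 0 \end{bmatrix}$$

Both rows give $x_2 = 2x_1$, so any vector proportional to $\begin{bmatrix} 1\\ 2 \end{bmatrix}$ is an eigenvector for $\lambda = 4$.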
numpy includes algorithms for finding eigenvalues and eigenvectors:
from numpy.linalg import eig
We'll look at the same simple matrix as above, but this tool can efficiently find the eigenvalues and eigenvectors of much larger matrices:
a = np.array([[0, 2],
[2, 3]])
evalues, evectors = eig(a)
print('Eigenvalues:', evalues)
print('Eigenvectors:', evectors)
Eigenvalues: [-1.  4.]
Eigenvectors: [[-0.89442719 -0.4472136 ]
 [ 0.4472136  -0.89442719]]
Notice that the eig() function returns both the eigenvalues and the eigenvectors.
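The eigenvectors come back as the columns of the second array, so evectors[:, i] pairs with evalues[i]. As a quick sanity check (not in the original), we can verify that each pair really satisfies $A v = \lambda v$:

```python
import numpy as np
from numpy.linalg import eig

a = np.array([[0, 2],
              [2, 3]])
evalues, evectors = eig(a)
# each column of evectors pairs with the eigenvalue at the same index
for i in range(len(evalues)):
    v = evectors[:, i]
    print(np.allclose(np.dot(a, v), evalues[i] * v))  # True, True
```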
Suppose that the symmetric matrix $\mathbf{A} = \begin{bmatrix} a & b & c\\ b & d & e\\ c & e & f \end{bmatrix}$ is known to have the three eigenvalues $\lambda_1 = 3$, $\lambda_2 = 1$, and $\lambda_3 = 4$, with the corresponding eigenvectors:
$\mathbf{v_1} = \begin{bmatrix} 1\\0\\-1 \end{bmatrix}$, $\mathbf{v_2} = \begin{bmatrix} 1\\2\\1\end{bmatrix}$, and $\mathbf{v_3} = \begin{bmatrix} 1\\-1\\1\end{bmatrix}$.
Determine the numerical values of $a, b, c, d, e, f$.
(EMEA Exercise 13.10.2)
Using only the qr function and matrix multiplication, use the QR algorithm to find the eigenvalues of the following matrix:
$$\begin{bmatrix} 1&3&4\\3&1&0\\4&0&1 \end{bmatrix}$$
Use a while loop to automatically loop through the iterations. What would be an appropriate stopping/exit condition for the loop?
We consider an economy where, in any given year, a certain percentage $\alpha$ of employed people lose their jobs and a certain percentage $\theta$ of unemployed people find a job.
We define a transition matrix
$$A = \begin{bmatrix} 1-\alpha & \alpha \\ \theta & 1-\theta \end{bmatrix}$$
Let's say that initially (period 0) there are 1,000,000 employed people and 150,000 unemployed people. We can put this in a vector:
$$x_0 = \begin{bmatrix} 1{,}000{,}000 & 150{,}000 \end{bmatrix}$$
a.) Assuming that $\alpha=0.4$ and $\theta=0.6$, use matrix multiplication to find how many employed and unemployed people there will be in period 1, $x_1$.
b.) What about in periods $x_2$, $x_3$, and so on? Do the values converge to some stable numbers?
c.) What are the eigenvectors of $A$? How does this relate to part b.)?