Easier Eigenvectors For Hermitian Matrices

Dante


If you're wondering what an eigenvalue or eigenvector is, thank whatever god you pray to and move on. However, if you're morbidly curious, eigenvalues and their respective eigenvectors are the nontrivial solutions to this equation.

$$A\vec{v}=\lambda\vec{v}$$

And by nontrivial, I mean that you can't just set \(\vec{v}\) and \(\lambda\) to zero and be done with it. You'd like to do that, but that would be too easy. Instead, you will be instructed to solve the characteristic equation in order to find the appropriate eigenvalues, like so,

$$\left|A-\lambda\text{I}\right|=0$$

These, in turn, you will plug back into the equation above, allowing you to solve for all the appropriate eigenvectors by Gaussian elimination. So, for a 3x3 matrix, the characteristic equation can stiff you with a 3rd-order polynomial (for three total possible eigenvalues), and each of these eigenvalues must be plugged back into the original eigenvector equation to give you three eigenvectors by solving three sets of three linear equations -- after which you should check that these vectors are linearly independent of one another, and toss out those that are not. If that description gives you goosebumps, you're not alone. It is, therefore, small wonder that we might want a faster, or at least more streamlined, method for solving these equations, and that's where this blog comes in.
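For the record, that entire pipeline is exactly what a library solver automates for you. A minimal NumPy sketch (the matrix values here are just an illustration):

```python
import numpy as np

# An arbitrary real symmetric (hence Hermitian) example matrix.
A = np.array([[4.0, 6.0],
              [6.0, -1.0]])

# numpy solves the characteristic polynomial and the linear
# systems for us, returning eigenvalues and unit eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column of `eigenvectors` satisfies A v = lambda v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```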

Unfortunately, I'm not here to help you with the characteristic equation. You're on your own with that one. What I can help you out with is determining the eigenvectors once you have a set of eigenvalues, particularly once we get to this equation.

$$(A-\lambda\text{I})\vec{v}=0$$

In particular, what I found was that given a square Hermitian matrix, A, and its eigenvalues, \(\lambda_i\), the eigenvectors of the matrix, \(\vec{v_i}\), are given by taking the matrix \(G_i\), defined by,

$$G_i=A-\lambda_i\text{I}$$

Substituting each column, in turn, with a constant (I suggest 1) and then taking the determinant. In case you're wondering, the number of the replaced column will then correspond to the eigenvector element. (The worked example below replaces columns; for a symmetric matrix, replacing rows gives the same answers, since the determinant is unchanged by transposing.) So, if you're like me and find determinants far more appealing than Gaussian elimination, this might just be the method for you. Of course, it's probably leaving your head spinning, too. So let's do an example!
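In code, the recipe reads as follows. This is my own sketch (the function name is made up), assuming the column-replacement reading of the description above:

```python
import numpy as np

def eigvec_by_determinant(A, lam):
    """Sketch of the recipe: element i of the eigenvector is the
    determinant of G = A - lam*I with column i replaced by ones."""
    n = A.shape[0]
    G = A - lam * np.eye(n)
    v = np.empty(n)
    for i in range(n):
        Gi = G.copy()
        Gi[:, i] = 1.0  # substitute the constant column
        v[i] = np.linalg.det(Gi)
    return v
```

For the 2x2 example worked below, `eigvec_by_determinant(A, -5.0)` comes out proportional to the eigenvector found by hand.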

Let's start with a matrix satisfying the above stipulations; notice the symmetry about the diagonal -- for a real matrix, this tells us that it is Hermitian.

$$A=\begin{bmatrix}4&6\\6&{-1}\end{bmatrix}$$

Placing this inside of the characteristic equation yields

$$\left|A - \lambda\text{I}\right|=\left|\begin{matrix}{4-\lambda}&6\\6&{-1-\lambda}\end{matrix}\right|=0$$

Taking the determinant, we are left with the quadratic equation

$$\begin{array}{rcl}(4-\lambda)\cdot{(-1-\lambda)}-(6^2) &=& -4-4\cdot{\lambda}+\lambda+\lambda^2-36\\
{}&=&\lambda^2-3\cdot\lambda-40\\
{}&=&(\lambda+5)\cdot(\lambda-8) = 0\end{array}$$

So, \(\lambda=-5\) or \(\lambda=8\).
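If you want to double-check the factoring, the roots of the quadratic can be confirmed numerically (a quick sanity check, not part of the method itself):

```python
import numpy as np

# Coefficients of lambda^2 - 3*lambda - 40, read off from above.
roots = np.roots([1.0, -3.0, -40.0])

# The roots come out as 8 and -5, matching the hand factoring.
assert np.allclose(sorted(roots), [-5.0, 8.0])
```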

Now that we have our eigenvalues, we just need to plug them back into the definition above to get each \(G_i\).

$$G_1=\begin{bmatrix}{4+5}&6\\6&{-1+5}\end{bmatrix}=\begin{bmatrix}9&6\\6&4\end{bmatrix}$$
$$G_2=\begin{bmatrix}{4-8}&6\\6&{-1-8}\end{bmatrix}=\begin{bmatrix}{-4}&6\\6&{-9}\end{bmatrix}$$

Let's start with \(G_1\) and find the first eigenvector.

$$\left|{G_{1,1}}\right|=\left|\begin{matrix}1&6\\1&4\end{matrix}\right|=(1\cdot4)-(1\cdot6)=4-6=-2$$
$$\left|{G_{1,2}}\right|=\left|\begin{matrix}9&1\\6&1\end{matrix}\right|=(9\cdot1)-(1\cdot6)=9-6=3$$

Which makes,

$$\vec{v_1}=\begin{bmatrix}{-2}\\{3}\end{bmatrix}$$
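We can double-check this against the original eigenvector equation with \(\lambda=-5\):

$$A\vec{v_1}=\begin{bmatrix}4&6\\6&{-1}\end{bmatrix}\begin{bmatrix}{-2}\\3\end{bmatrix}=\begin{bmatrix}{(4)(-2)+(6)(3)}\\{(6)(-2)+(-1)(3)}\end{bmatrix}=\begin{bmatrix}10\\{-15}\end{bmatrix}=-5\begin{bmatrix}{-2}\\3\end{bmatrix}$$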

Now, let's move on to \(G_2\),

$$\left|{G_{2,1}}\right|=\left|\begin{matrix}1&6\\1&-9\end{matrix}\right|=(1\cdot{-9})-(6\cdot1)=-9-6=-15$$
$$\left|{G_{2,2}}\right|=\left|\begin{matrix}{-4}&1\\6&1\end{matrix}\right|=(-4\cdot1)-(1\cdot6)=-4-6=-10$$

Which makes,

$$\vec{v_2}=\begin{bmatrix}{-15}\\{-10}\end{bmatrix}$$

Or, factoring out a -5 and throwing it away (scaling divides both sides of the eigenvector equation we started with at the top of the document, so a nonzero multiple of an eigenvector is still an eigenvector),

$$\vec{v_2}=\begin{bmatrix}{3}\\{2}\end{bmatrix}$$
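Again, a quick check against the original equation, this time with \(\lambda=8\):

$$A\vec{v_2}=\begin{bmatrix}4&6\\6&{-1}\end{bmatrix}\begin{bmatrix}3\\2\end{bmatrix}=\begin{bmatrix}{12+12}\\{18-2}\end{bmatrix}=\begin{bmatrix}24\\16\end{bmatrix}=8\begin{bmatrix}3\\2\end{bmatrix}$$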

Now we have \(\vec{v_1}\) and \(\vec{v_2}\), the eigenvectors of our matrix, without having to solve any sets of linear equations. Of course, you might prefer to use linear equations, but as determinants are pretty much a cut-and-dried algorithm, there's a chance that you, like me, prefer to do these via determinants instead.


Also, be warned that I've never actually "proved this", and it's been years since I've played with this stuff in my diff-eq course. So make sure to try it out and "test it" to see if I was completely off my rocker or not. If you're up for trying to prove it yourself, my hunch is that this is basically an extension of Cramer's rule. Hope this helps :).
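In that spirit of "test it", here's one way to bang on the method numerically: apply the column-replacement recipe (my reading of it; the function name is made up) to a random symmetric matrix and check each result against the eigenvector equation. This is a sketch, not a proof:

```python
import numpy as np

def trick_eigvec(A, lam):
    # Element i = det of (A - lam*I) with column i replaced by ones.
    n = A.shape[0]
    G = A - lam * np.eye(n)
    v = np.empty(n)
    for i in range(n):
        Gi = G.copy()
        Gi[:, i] = 1.0
        v[i] = np.linalg.det(Gi)
    return v

# Random 3x3 real symmetric (hence Hermitian) test matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
A = (M + M.T) / 2

# For each eigenvalue, the trick's output should satisfy A v = lambda v.
for lam in np.linalg.eigvalsh(A):
    v = trick_eigvec(A, lam)
    assert np.allclose(A @ v, lam * v, atol=1e-8)
```

Swapping in bigger random symmetric matrices is an easy way to stress it further; the library eigenvalues from `eigvalsh` stand in for the hand-solved characteristic equation.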


