Diagonalization

Suppose that $$A \in \mathbb{R}^{n\times n}$$ has a linearly independent set of eigenvectors $$v_1, \dots, v_n$$:

$$Av_i = \lambda_i v_i$$ for $$i=1,\dots,n$$.

We can define $$T:=[v_1 \dots v_n]$$ and $$\Lambda := \text{diag}(\lambda_1,\dots,\lambda_n)$$. Stacking the eigenvalue identities above column by column, we can write:

$$AT = T\Lambda$$

and finally (since the columns of $$T$$ are linearly independent, $$T$$ is invertible):

$$T^{-1}A T = \Lambda$$.

In other words, similarity transformation by $$T$$ diagonalizes $$A$$. The converse holds too: if there is $$T=[v_1 \dots v_n]$$ such that $$T^{-1}A T = \Lambda = \text{diag}(\lambda_1, \dots, \lambda_n)$$, then $$AT = T\Lambda$$, which implies that the columns $$v_1,\dots, v_n$$ of $$T$$ form a set of linearly independent eigenvectors of $$A$$.
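As a quick numerical check of the derivation above (using NumPy, with a small example matrix of our own choosing that has distinct eigenvalues):

```python
import numpy as np

# Hypothetical 2x2 example with distinct eigenvalues (5 and 2),
# so it has linearly independent eigenvectors.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors as columns of T.
lam, T = np.linalg.eig(A)
Lam = np.diag(lam)

# Verify A T = T Lam and the similarity transformation T^{-1} A T = Lam.
assert np.allclose(A @ T, T @ Lam)
assert np.allclose(np.linalg.inv(T) @ A @ T, Lam)
```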

Eigendecomposition
Clearly, one can decompose $$A$$ as $$A = T \Lambda T^{-1} $$. This is called eigendecomposition as it is a decomposition that involves eigenvectors and eigenvalues.
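A minimal sketch of the eigendecomposition in NumPy, again with an example matrix chosen for illustration:

```python
import numpy as np

# Hypothetical example matrix with distinct eigenvalues.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, T = np.linalg.eig(A)

# Eigendecomposition: A = T Lam T^{-1}.
A_rebuilt = T @ np.diag(lam) @ np.linalg.inv(T)
assert np.allclose(A_rebuilt, A)
```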

When is $$A$$ diagonalizable?
$$A$$ is diagonalizable if and only if
 * there exists an invertible $$T$$ s.t. $$T^{-1}A T = \Lambda$$ is diagonal, or equivalently,
 * $$A$$ has a set of $$n$$ linearly independent eigenvectors

A sufficient condition: $$A$$ is diagonalizable if it has $$n$$ distinct eigenvalues. (The converse does not necessarily hold; e.g., the identity matrix has a repeated eigenvalue but is diagonal already.)
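A standard non-diagonalizable example is a Jordan block, which has a repeated eigenvalue and fewer than $$n$$ independent eigenvectors. A sketch of the numerical symptom (the matrix of eigenvectors returned by NumPy is singular, so it cannot serve as $$T$$):

```python
import numpy as np

# A 2x2 Jordan block: eigenvalue 1 repeated, but only one
# independent eigenvector, so it is NOT diagonalizable.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam, T = np.linalg.eig(J)

# The columns of T are (numerically) linearly dependent:
# T is singular, so T^{-1} J T does not exist.
assert abs(np.linalg.det(T)) < 1e-8
```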

Left Eigenvectors
If $$A$$ is diagonalizable, then we can also write $$T^{-1}A = \Lambda T^{-1}$$. Let $$w_i^T$$ be the rows of $$T^{-1}$$. Then, we can write:

$$w_i^T A = \lambda_i w_i^T$$.

In other words, the $$w_i$$ are (linearly independent) left eigenvectors of $$A$$. Moreover, since $$T^{-1}T = I$$, the left and right eigenvectors satisfy $$w_i^T v_j = 1$$ if $$i = j$$ and $$0$$ otherwise.
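Checking this numerically (same hypothetical example matrix as before): each row of $$T^{-1}$$ is a left eigenvector.

```python
import numpy as np

# Hypothetical example matrix with distinct eigenvalues.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, T = np.linalg.eig(A)
T_inv = np.linalg.inv(T)

# Each row w_i^T of T^{-1} satisfies w_i^T A = lam_i w_i^T.
for i in range(2):
    w = T_inv[i]  # i-th row of T^{-1}
    assert np.allclose(w @ A, lam[i] * w)
```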

What's the point of Diagonalization?
Diagonalization leads to modal form, which is very useful for analyzing LDSs; in particular, modal form gives conditions for system stability. Moreover, modal form (or, equivalently, diagonalization itself) yields simple expressions for the resolvent $$(sI-A)^{-1}$$, the matrix power $$A^k$$, and the matrix exponential $$e^A$$; the latter two give the solutions to discrete- and continuous-time systems, respectively (see Modal Form).
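As a sketch of why this is useful (hypothetical example matrix): once $$A = T\Lambda T^{-1}$$, powers and exponentials only need to be applied to the scalar eigenvalues, since $$A^k = T\Lambda^k T^{-1}$$ and $$e^A = T\,\text{diag}(e^{\lambda_i})\,T^{-1}$$.

```python
import numpy as np

# Hypothetical example matrix with distinct eigenvalues.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, T = np.linalg.eig(A)
T_inv = np.linalg.inv(T)

# Matrix power via diagonalization: A^k = T Lam^k T^{-1},
# i.e., only the scalar eigenvalues are raised to the k-th power.
k = 10
A_k = T @ np.diag(lam**k) @ T_inv
assert np.allclose(A_k, np.linalg.matrix_power(A, k))

# The matrix exponential works the same way: e^A = T diag(e^{lam_i}) T^{-1}.
e_A = T @ np.diag(np.exp(lam)) @ T_inv
```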