# Diagonalization

Suppose that $A \in \mathbb{R}^{n \times n}$ has a linearly independent set of eigenvectors $\{v_1, \ldots, v_n\}$:

$$A v_i = \lambda_i v_i$$

for $i = 1, \ldots, n$.

We can define $T = \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix}$ and $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$. Using the eigenvalue identities listed above, we can write^{[1]}:

$$AT = T\Lambda$$

and finally (since $A$ has a set of $n$ linearly independent eigenvectors, $T$ is invertible):

$$T^{-1} A T = \Lambda.$$

In other words, similarity transformation by $T$ *diagonalizes* $A$. The converse holds too: if there is $T$ such that $T^{-1} A T = \Lambda$, then $AT = T\Lambda$, which implies that the columns of $T$ form a set of $n$ linearly independent eigenvectors of $A$.
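As a quick numerical check (a sketch, not from the source), the identity $T^{-1} A T = \Lambda$, where $T$ collects the eigenvectors as columns and $\Lambda$ is the diagonal matrix of eigenvalues, can be verified with NumPy; the example matrix below is arbitrary:

```python
import numpy as np

# Example matrix with distinct eigenvalues (2 and 3), hence diagonalizable.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are
# the (right) eigenvectors -- exactly the T defined in the text.
eigvals, T = np.linalg.eig(A)
Lam = np.diag(eigvals)

# Similarity transformation by T diagonalizes A.
D = np.linalg.inv(T) @ A @ T
print(np.allclose(D, Lam))  # True
```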

### Eigendecomposition

Clearly, one can decompose $A$ as $A = T \Lambda T^{-1}$. This is called the *eigendecomposition* of $A$, as it is a decomposition that involves eigenvectors and eigenvalues.
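The factorization $A = T \Lambda T^{-1}$ can be reconstructed numerically (a sketch with an arbitrary example matrix, not from the source):

```python
import numpy as np

# Rebuild A from its eigendecomposition A = T @ Lambda @ T^{-1}.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, T = np.linalg.eig(A)
A_rebuilt = T @ np.diag(eigvals) @ np.linalg.inv(T)
print(np.allclose(A, A_rebuilt))  # True
```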

### When is $A$ diagonalizable?

$A$ is diagonalizable if

- there exists $T$ s.t. $T^{-1} A T$ is diagonal, or equivalently,
- $A$ has a set of $n$ linearly independent eigenvectors.

Alternatively, $A$ is also diagonalizable if it has $n$ distinct eigenvalues. (The converse does not necessarily hold: a diagonalizable matrix may have repeated eigenvalues, e.g. the identity matrix.)
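To see that repeated eigenvalues can break diagonalizability, consider a Jordan block (a sketch, not from the source): it has eigenvalue $1$ with multiplicity two but only one independent eigenvector, so no invertible $T$ of eigenvectors exists.

```python
import numpy as np

# A 2x2 Jordan block: eigenvalue 1 repeated, but defective
# (only one linearly independent eigenvector).
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
eigvals, T = np.linalg.eig(J)

# The rank of the returned eigenvector matrix shows how many
# independent eigenvectors exist: 1 rather than 2, so T is
# singular and J cannot be diagonalized.
print(np.linalg.matrix_rank(T))
```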

### Left Eigenvectors

If $A$ is diagonalizable, then we can also write $T^{-1} A = \Lambda T^{-1}$. Let $w_1^T, \ldots, w_n^T$ be the rows of $T^{-1}$. Then, we can write:

$$w_i^T A = \lambda_i w_i^T.$$

In other words, the $w_i$s are (linearly independent) left eigenvectors of $A$.
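This can also be checked numerically (a sketch with an arbitrary example matrix, not from the source): each row $w_i^T$ of $T^{-1}$ should satisfy $w_i^T A = \lambda_i w_i^T$.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, T = np.linalg.eig(A)
W = np.linalg.inv(T)  # row i of W is the left eigenvector w_i^T

for i in range(len(eigvals)):
    w = W[i]
    # Left eigenvector identity: w^T A = lambda * w^T.
    print(np.allclose(w @ A, eigvals[i] * w))  # True for each i
```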

## What's the point of Diagonalization?

Diagonalization leads to modal form, which is very useful for analyzing LDSs. Specifically, modal form allows us to find the conditions for system stability. Also, the modal form (or one can attribute it to diagonalization itself) allows us to obtain simple expressions for the resolvent of a system, $(sI - A)^{-1}$, the matrix power $A^k$, and the matrix exponential $e^{tA}$; the latter two give the solutions to discrete- and continuous-time systems (see Modal Form).
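For instance, diagonalization reduces the matrix power to scalar powers of the eigenvalues, $A^k = T \Lambda^k T^{-1}$, which is the solution of the discrete-time system $x(k) = A^k x(0)$. A sketch (example matrix is arbitrary, not from the source):

```python
import numpy as np

A = np.array([[0.5, 0.2],
              [0.1, 0.4]])
eigvals, T = np.linalg.eig(A)

# Closed form for the matrix power via diagonalization:
# A^k = T @ diag(lambda_i^k) @ T^{-1}.
k = 10
A_k = T @ np.diag(eigvals**k) @ np.linalg.inv(T)

# Compare against repeated matrix multiplication.
print(np.allclose(A_k, np.linalg.matrix_power(A, k)))  # True
```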

## References

- ↑ Stephen Boyd, EE263 Lecture Notes 11