Orthogonal matrix

Let $$u_1, \dots, u_n \in \mathbb{R}^n$$ be an orthonormal set of vectors. Then it forms a basis for $$\mathbb{R}^n$$, and the matrix $$U = [u_1 \dots u_n]$$ is called orthogonal; it satisfies $$U^TU = I$$. It follows that $$U^{-1} = U^T$$, and hence also $$UU^T = I$$. This identity can also be written as a sum of rank-one matrices:

$$\sum_{i=1}^n u_i u_i^T = I$$.
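These identities can be checked numerically. A minimal sketch, using a random orthogonal matrix obtained from a QR factorization (an illustrative choice; any orthogonal $$U$$ works):

```python
import numpy as np

# Build an orthonormal basis of R^3 by QR-factoring a random matrix.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# U^T U = I and U U^T = I
print(np.allclose(U.T @ U, np.eye(3)))  # True
print(np.allclose(U @ U.T, np.eye(3)))  # True

# The sum of rank-one matrices u_i u_i^T equals the identity.
rank_one_sum = sum(np.outer(U[:, i], U[:, i]) for i in range(3))
print(np.allclose(rank_one_sum, np.eye(3)))  # True
```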

Expansion with orthogonal matrices is very simple. Since $$x = UU^Tx$$, we have that

$$x = \sum_{i=1}^n(u_i^T x) u_i$$.

In other words, to represent $$x$$ as a linear combination of the $$u_i$$'s, all we need to do is compute the $$i$$th coefficient as $$u_i^T x$$.
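The expansion above can be sketched numerically: compute each coefficient $$u_i^T x$$ and sum the scaled basis vectors back up.

```python
import numpy as np

# Orthonormal columns u_1, ..., u_4 from a QR factorization (illustrative choice).
rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))
x = rng.standard_normal(4)

# Coefficient of x in the direction of u_i is u_i^T x.
coeffs = [U[:, i] @ x for i in range(4)]

# Expansion: x = sum_i (u_i^T x) u_i recovers x exactly.
x_expanded = sum(c * U[:, i] for c, i in zip(coeffs, range(4)))
print(np.allclose(x_expanded, x))  # True
```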

Some nomenclature:
 * $$u_i^T x$$ is called the component or the coefficient of $$x$$ in the direction of $$u_i$$.
 * $$a = U^T x$$ resolves $$x$$ into the vector of its $$u_i$$ components.
 * $$x = Ua$$ reconstitutes $$x$$ from its $$u_i$$ components.
 * $$x = Ua = \sum_{i=1}^n a_i u_i$$ is called the expansion of $$x$$.
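The resolve/reconstitute pair above amounts to two matrix-vector products. A minimal sketch (the specific $$U$$ and $$x$$ are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(2)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))
x = np.array([1.0, -2.0, 0.5])

a = U.T @ x    # resolve: components of x along each u_i
x_rec = U @ a  # reconstitute: x = Ua = sum_i a_i u_i
print(np.allclose(x_rec, x))  # True
```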

Geometric interpretation
Recall that left-multiplying $$x$$ by a matrix with orthonormal columns preserves norms and angles, i.e. $$||Uz|| = ||z||$$ and $$\angle (Uz,U\tilde{z}) = \angle (z,\tilde{z})$$. Therefore, any orthogonal matrix performs either
 * A rotation about an axis
 * A reflection about an axis
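The two cases can be illustrated in $$\mathbb{R}^2$$ with a rotation matrix (determinant $$+1$$) and a reflection about the $$x$$-axis (determinant $$-1$$); both preserve the norm of any vector. The angle $$\theta$$ and test vector below are arbitrary examples:

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation, det = +1
F = np.array([[1.0,  0.0],
              [0.0, -1.0]])                      # reflection about x-axis, det = -1

z = np.array([3.0, 4.0])
print(np.isclose(np.linalg.norm(R @ z), np.linalg.norm(z)))  # True
print(np.isclose(np.linalg.norm(F @ z), np.linalg.norm(z)))  # True
```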