Least-norm solution

When a linear system $$y=Ax$$ has more than one solution, we may be interested in getting the solution that has the least L2 norm, $$||\cdot||$$. For $$A\in \mathbb{R}^{m\times n}$$ of full rank, this can happen only if $$A$$ is strictly fat, i.e. $$m < n$$. In other words, $$x$$ is underspecified -- many choices of $$x$$ lead to the same $$y$$. If we assume that $$A$$ has full rank, then by the fundamental theorem of linear algebra, $$\text{dim}\, \mathcal N (A) = n-m$$, so the solution set is an $$(n-m)$$-dimensional affine set.
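The dimension count can be checked numerically; a minimal sketch with NumPy, where the sizes $$m=3$$, $$n=5$$ and the random data are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5                        # strictly fat: more unknowns than equations
A = rng.standard_normal((m, n))    # a random Gaussian A has full rank with probability 1

# rank(A) = m, so by rank-nullity dim N(A) = n - rank(A) = n - m
assert np.linalg.matrix_rank(A) == m
print(n - np.linalg.matrix_rank(A))   # -> 2
```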

The least-norm solution $$x_{ln}$$ is then:

$$x_{ln} = A^T (AA^T)^{-1}y$$

The matrix $$A^T (AA^T)^{-1}$$ is the pseudo-inverse of $$A$$ -- it is a right inverse, since $$A \cdot A^T (AA^T)^{-1} = I$$. The matrix $$I-A^T (AA^T)^{-1}A$$ gives the projection onto $$\mathcal N(A)$$.

This can be shown by first noting that $$(x-x_{ln}) \perp x_{ln}$$ for any solution $$x$$ (since $$A(x-x_{ln}) = 0$$ while $$x_{ln} \in \mathcal R(A^T)$$), and then using this to expand $$||x||^2$$ as $$||x||^2 = ||x_{ln} + x - x_{ln}||^2 = ||x_{ln}||^2 + ||x-x_{ln}||^2 \ge ||x_{ln}||^2$$.
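These identities are easy to verify numerically. A sketch with NumPy, where the matrix sizes and random data are made-up for illustration (note `solve` is preferred over forming the inverse explicitly):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5                              # strictly fat: m < n
A = rng.standard_normal((m, n))          # full rank with probability 1
y = rng.standard_normal(m)

# Least-norm solution x_ln = A^T (A A^T)^{-1} y
x_ln = A.T @ np.linalg.solve(A @ A.T, y)
assert np.allclose(A @ x_ln, y)          # x_ln does solve Ax = y

# A^T (A A^T)^{-1} is a right inverse of A
Adag = A.T @ np.linalg.inv(A @ A.T)
assert np.allclose(A @ Adag, np.eye(m))

# I - Adag @ A projects onto N(A): A (P z) = 0 for any z
P = np.eye(n) - Adag @ A
z = rng.standard_normal(n)
assert np.allclose(A @ (P @ z), 0)

# Pythagorean decomposition: stepping in N(A) gives another
# solution with a strictly larger norm
x = x_ln + P @ z
assert np.allclose(A @ x, y)
assert np.isclose(np.linalg.norm(x)**2,
                  np.linalg.norm(x_ln)**2 + np.linalg.norm(x - x_ln)**2)
```

For a full-row-rank $$A$$, `np.linalg.pinv(A)` computes the same right inverse.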

Geometric Interpretation
The orthogonality condition leads to the fact that $$x_{ln} \perp \mathcal N(A)$$. The projection interpretation of the least-norm solution is that $$x_{ln}$$ is the projection of $$0$$ onto the solution set $$\{x : Ax= y\}$$, i.e. the point of the solution set closest to the origin.
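Both facts can be checked numerically. A sketch (the sizes and random data are illustrative assumptions), using the SVD to get an orthonormal basis of $$\mathcal N(A)$$:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 5
A = rng.standard_normal((m, n))
y = rng.standard_normal(m)

x_ln = A.T @ np.linalg.solve(A @ A.T, y)

# Orthonormal basis of N(A): the last n - m right singular vectors of A
_, _, Vt = np.linalg.svd(A)
N = Vt[m:].T                              # n x (n - m); columns span N(A)
assert np.allclose(A @ N, 0)

# x_ln is orthogonal to the nullspace
assert np.allclose(N.T @ x_ln, 0)

# x_ln is the point of {x : Ax = y} closest to 0:
# every other solution x_ln + N c lies farther from the origin
for _ in range(5):
    c = rng.standard_normal(n - m)
    x = x_ln + N @ c
    assert np.allclose(A @ x, y)
    assert np.linalg.norm(x) >= np.linalg.norm(x_ln)
```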

Relation to regularized least-squares
There is a relationship between multi-objective least-squares and the least-norm solution. Define the first objective as $$J_1 = ||Ax-y||^2$$ and the second as $$J_2 = ||x||^2$$, and minimize the weighted sum $$J_1 + \mu J_2$$, whose minimizer is $$x_\mu = (A^TA + \mu I)^{-1}A^Ty$$. As $$\mu\to 0$$, $$x_\mu\to x_{ln}$$, because

$$(A^TA + \mu I)^{-1} A^T \to A^T (A A^T)^{-1}$$.

Note that $$\mu$$ cannot simply be set to zero in this approach: since $$A$$ is assumed to be strictly fat, $$A^TA$$ is singular when $$\mu = 0$$, so $$(A^TA + \mu I)^{-1}$$ does not exist.
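The limit can be observed numerically; a sketch where the sizes and the schedule of $$\mu$$ values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 5
A = rng.standard_normal((m, n))
y = rng.standard_normal(m)

# Least-norm solution for reference
x_ln = A.T @ np.linalg.solve(A @ A.T, y)

# Regularized least-squares solution for decreasing mu
for mu in [1.0, 1e-2, 1e-4, 1e-8]:
    x_mu = np.linalg.solve(A.T @ A + mu * np.eye(n), A.T @ y)
    print(f"mu={mu:.0e}  ||x_mu - x_ln|| = {np.linalg.norm(x_mu - x_ln):.2e}")
```

The printed gap shrinks toward zero as $$\mu$$ decreases.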

Both the least-norm and least-squares problems are special cases of general norm minimization with equality constraints.