Ergodicity

In short, ergodicity means that averaging over time and averaging over the ensemble give the same result (page 562). That is, consider (i) a single realization of a random process, $$x_i[n]$$, and (ii) the random process $$x_1[n_0], x_2[n_0], \dots$$, an infinite collection of random variables obtained by fixing the time point $$n_0$$ across all the realizations $$x_1, x_2, \dots$$. Ergodicity says that the two corresponding means coincide, i.e.,

$$\lim_{M\to \infty}\frac{1}{M}\sum_{m=1}^M x_m[n_0] = \lim_{N\to \infty} \frac{1}{N}\sum_{n=1}^N x_i[n]$$

for any realization $$x_i$$. In short, for an ergodic random process a single realization is representative of the whole ensemble of realizations.
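A quick numerical sketch of this equality, using white Gaussian noise with a nonzero mean as the (assumed) ergodic process; the sizes and the mean value `mu` are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

M, N = 2_000, 2_000   # number of realizations, samples per realization
mu = 2.0              # true mean of the process (arbitrary choice)

# Ergodic example: i.i.d. Gaussian noise with mean mu.
# Row m is one realization x_m[n], n = 0, ..., N-1.
x = rng.normal(loc=mu, scale=1.0, size=(M, N))

# Ensemble average: fix the time point n0 = 0, average over realizations.
ensemble_mean = x[:, 0].mean()

# Time average: fix one realization (i = 0), average over time.
time_mean = x[0, :].mean()

print(ensemble_mean, time_mean)  # both close to mu = 2.0
```

Both averages converge to the true mean $$\mu = 2$$ as $$M, N \to \infty$$, which is exactly the statement above.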

Ergodicity does not apply only to the mean, but also to other moments; e.g., a process can be ergodic in the covariance, etc.


 * Fig. 17.8 gives a nice example of a non-ergodic random process

What's the point of ergodicity?
Ergodicity is a critical property because it allows us to infer properties of a random process from a single realization. In practice we never have infinite observations, but given a long sequence of observations, ergodicity tells us that we can divide the sequence into chunks and treat each chunk as a separate realization. This allows us, for example, to obtain better estimates of the PSD (see page 579).