Random variable

A random variable is simply a mapping between all possible experimental outcomes and numerical values. For example, in coin tossing the experimental outcomes are $$H, T$$ and the numerical values are typically $$0, 1$$; the random variable is nothing but the function that does this mapping. The name "random variable" is somewhat misleading, as there is nothing random about this function: it maps $$H$$ to $$0$$ and $$T$$ to $$1$$, quite deterministically. What is random is the outcome of the experiment; we don't know in advance whether we'll get $$H$$ or $$T$$.
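A minimal sketch of this idea in Python (the `H`/`T` labels and the 0/1 values follow the coin example above):

```python
import random

# The "random variable" itself is a plain deterministic function:
# it maps each outcome to a number.
def x(outcome):
    return {"H": 0, "T": 1}[outcome]

# The randomness lives in the experiment, not in the mapping.
outcome = random.choice(["H", "T"])  # random experimental outcome
value = x(outcome)                   # deterministic assignment

assert x("H") == 0 and x("T") == 1   # the mapping never changes
```

Calling `x` twice on the same outcome always gives the same number; only `random.choice` introduces uncertainty.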

My notes from Papoulis:

Definition of Random Variable. An RV $$\mathbf x$$ is a process of assigning a number $$\mathbf x(\zeta)$$ to every outcome $$\zeta$$.

Example 1. In the coin tossing experiment, if $$\zeta_1 = \text{heads}$$ and $$\zeta_2 = \text{tails}$$, one may construct the RV $$\mathbf x$$ such that $$\mathbf x(\zeta_1) = 0$$ and $$\mathbf x(\zeta_2) = 1$$.

Example 2. Suppose that our RV $$\mathbf x$$ will measure temperature. Then, one can construct the RV $$\mathbf x$$ simply as $$\mathbf x(\zeta_i) = \zeta_i$$ (e.g., when $$\zeta$$ is a continuous quantity such as temperature). In this case, the variable $$\zeta$$ has a double meaning (see p77): it is both the outcome of the experiment and the corresponding value $$\mathbf x(\zeta)$$ of the RV $$\mathbf x$$.
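The identity construction can be sketched as follows; the uniform range stands in for a temperature reading and is my assumption, not from the source:

```python
import random

# Identity RV for a continuous outcome: x(zeta) = zeta.
def x(zeta):
    return zeta

# Simulated temperature reading in Celsius (assumed range).
zeta = random.uniform(-10.0, 40.0)

# The double meaning: the outcome and the RV's value coincide.
assert x(zeta) == zeta
```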

The construction of the RV is important for accurately producing the intended probability distribution. For example, in the experiment of tossing a coin twice, if we don't care about the order of the outcomes, we can construct the RV $$\mathbf x$$ as: $$\mathbf x(HH) = 0, \,\,\,\mathbf x(HT)=\mathbf x(TH)=1, \,\,\,\mathbf x(TT)=2$$ (here $$\mathbf x$$ simply counts the number of tails). But if we do care about the order, then we can construct $$\mathbf x$$ as: $$\mathbf x(HH) = 0, \,\,\,\mathbf x(HT)=1, \,\,\,\mathbf x(TH)=2, \,\,\,\mathbf x(TT)=3.$$
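The two constructions above induce different distributions. A short sketch, assuming a fair coin so each two-toss sequence has probability 1/4:

```python
from itertools import product
from collections import Counter

# All two-toss sequences: HH, HT, TH, TT (equally likely for a fair coin).
outcomes = list(product("HT", repeat=2))

# Order ignored: HT and TH map to the same value (count of tails).
x_unordered = {("H","H"): 0, ("H","T"): 1, ("T","H"): 1, ("T","T"): 2}
# Order kept: every sequence gets its own value.
x_ordered = {("H","H"): 0, ("H","T"): 1, ("T","H"): 2, ("T","T"): 3}

# Induced distributions: each outcome contributes probability 1/4.
p_unordered = {v: c / 4 for v, c in Counter(x_unordered[o] for o in outcomes).items()}
p_ordered = {v: c / 4 for v, c in Counter(x_ordered[o] for o in outcomes).items()}

print(p_unordered)  # {0: 0.25, 1: 0.5, 2: 0.25}
print(p_ordered)    # {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}
```

Note how collapsing $$HT$$ and $$TH$$ into one value concentrates probability mass on that value, which is exactly why the choice of mapping matters.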