Expectation and Variance
========================

The **expectation** $\E(X)$ of a random variable is the value that is to be expected 'on average': if we could draw infinitely many values of this random variable, their average would be the expectation. For a discrete RV we have:

.. math:: \E(X) = \sum_{x=-\infty}^{\infty} x\,p_X(x)

For a continuous RV the summation becomes an integral:

.. math:: \E(X) = \int_{-\infty}^{\infty} x\,f_X(x)\,dx

Let $g$ be some function $g: \setR \to \setR$; then $Y=g(X)$ is of course a random variable too. For a discrete RV we have:

.. math:: \E(g(X)) = \sum_{x=-\infty}^{\infty} g(x)\,p_X(x)

and for a continuous RV:

.. math:: \E(g(X)) = \int_{-\infty}^{\infty} g(x)\,f_X(x)\,dx

Using this rule we can prove an important property of the expectation: the **scaling property**. In case we make a new RV by multiplying the RV $X$ with a constant $a$ and adding a constant $b$, we have:

.. math:: \E(a X + b) = a\E(X) + b

We give the proof for a continuous RV:

.. math::
    \E(aX+b) &= \int_{-\infty}^{\infty} (ax+b)\,f_X(x)\,dx\\
    &= \int_{-\infty}^{\infty} a x\,f_X(x)\,dx + \int_{-\infty}^{\infty} b\,f_X(x)\,dx\\
    &= a\,\int_{-\infty}^{\infty} x\,f_X(x)\,dx + b\,\int_{-\infty}^{\infty} f_X(x)\,dx\\
    &= a \E(X) + b

where in the last step we used that a PDF integrates to one: $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$.

The **variance** $\Var(X)$ is defined as:

.. math:: \Var(X) = \E\left( (X-\E(X))^2 \right)

i.e. it equals the expected squared deviation from the mean of $X$. For the variance we have the following scaling property:

.. math:: \Var(aX+b) = a^2 \Var(X)

Note that adding a constant does not change the variance and that the scaling factor $a$ becomes a quadratic factor for the variance. We leave the proof as an exercise.
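
To make the discrete formulas concrete, here is a minimal Python sketch. It assumes a fair six-sided die for $X$ (the die and all variable names are illustrative only, not part of the text above) and computes $\E(X)$ from the pmf, as well as $\E(g(X))$ for $g(x)=x^2$ directly, without first deriving the distribution of $g(X)$:

.. code-block:: python

    # Illustrative example: X is a fair six-sided die (an assumption),
    # so p_X(x) = 1/6 for x in 1..6 and zero elsewhere.
    values = [1, 2, 3, 4, 5, 6]
    pmf = [1 / 6] * 6

    # E(X) = sum over x of x * p_X(x)
    EX = sum(x * p for x, p in zip(values, pmf))
    print(EX)  # 3.5

    # E(g(X)) with g(x) = x**2, summed directly against the pmf of X
    EX2 = sum(x**2 * p for x, p in zip(values, pmf))
    print(EX2)  # 15.1666... (= 91/6)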
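
The variance can be evaluated the same way, straight from its definition. This sketch reuses the hypothetical die pmf from above and computes $\E\left((X-\E(X))^2\right)$:

.. code-block:: python

    # Variance from the definition Var(X) = E((X - E(X))**2),
    # again assuming the fair-die pmf from the previous sketch.
    values = [1, 2, 3, 4, 5, 6]
    pmf = [1 / 6] * 6

    EX = sum(x * p for x, p in zip(values, pmf))                 # 3.5
    VarX = sum((x - EX) ** 2 * p for x, p in zip(values, pmf))
    print(VarX)  # 2.9166... (= 35/12)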
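
Finally, both scaling properties lend themselves to a quick Monte Carlo sanity check. The sketch below picks an arbitrary distribution (exponential with mean 2) and arbitrary constants $a=3$, $b=-1$; these choices are assumptions made for illustration. With $10^6$ samples, the empirical mean and variance of $aX+b$ should land close to $a\E(X)+b=5$ and $a^2\Var(X)=36$:

.. code-block:: python

    # Monte Carlo check of E(aX+b) = a E(X) + b and Var(aX+b) = a**2 Var(X).
    # Distribution and constants are arbitrary assumptions for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.exponential(scale=2.0, size=1_000_000)  # E(X) = 2, Var(X) = 4
    a, b = 3.0, -1.0

    Y = a * X + b
    print(Y.mean(), a * X.mean() + b)   # both close to a*E(X) + b = 5
    print(Y.var(), a**2 * X.var())      # both close to a**2 * Var(X) = 36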