Probability Distributions
=========================

Discrete Distributions
----------------------

Bernoulli Distribution
......................

Perhaps the simplest of all probabilistic experiments is tossing a coin. The outcome is either heads or tails, or equivalently 0 or 1. Such an experiment is called a Bernoulli experiment and the corresponding distribution is called the **Bernoulli distribution**. The Bernoulli distribution has a single parameter $p$, the probability of success. In case the random variable $X$ is Bernoulli distributed we write $X\sim\Bernoulli(p)$. The probability mass function is:

.. math::
   p_X(x) = \begin{cases}
   1-p &: x=0\\
   p &: x=1\\
   0 &: \text{elsewhere}
   \end{cases}

The expectation equals:

.. math::
   \E(X) &= \sum_{x=-\infty}^{\infty} x\,p_X(x)\\
   &= p

and its variance:

.. math::
   \Var(X) = p(1-p)

Binomial Distribution
.....................

Consider a Bernoulli distributed random variable $Y\sim\Bernoulli(p)$, and let us repeat the experiment $n$ times. What then is the probability of $k$ successes? A success is defined as the outcome $Y=1$ of the Bernoulli experiment. So we define a new random variable $X$ that is the sum of $n$ outcomes of repeated independent and identically distributed (iid) Bernoulli experiments. The outcomes of $X$ run from $0$ to $n$, and the probability of finding $k$ successes is given by:

.. math::
   p_X(k) = P(X=k) = {n \choose k}\,p^k\,(1-p)^{n-k}

This is called the **binomial distribution**. For a random variable $X$ that has a binomial distribution we write $X\sim\Bin(n,p)$. The expectation is:

.. math::
   \E(X) = n\,p

and the variance:

.. math::
   \Var(X) = n\,p\,(1-p)

Uniform Distribution
....................

The **discrete uniform distribution** is used in case all possible outcomes of a random experiment are equally probable. Let $X\sim\Uniform(a,b)$, with $a
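As a quick sanity check, the Bernoulli and binomial formulas above can be evaluated numerically. The sketch below uses only the Python standard library; the helper names (``bernoulli_pmf``, ``binomial_pmf``, ``binomial_moments``) are illustrative and not part of the text.

```python
from math import comb

def bernoulli_pmf(x, p):
    """Probability mass function of X ~ Bernoulli(p)."""
    if x == 0:
        return 1 - p
    if x == 1:
        return p
    return 0.0  # zero elsewhere, as in the piecewise definition

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Bin(n, p): k successes in n iid Bernoulli trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def binomial_moments(n, p):
    """Compute E(X) and Var(X) directly from the pmf, by summation."""
    mean = sum(k * binomial_pmf(k, n, p) for k in range(n + 1))
    var = sum((k - mean) ** 2 * binomial_pmf(k, n, p) for k in range(n + 1))
    return mean, var
```

For example, ``binomial_moments(10, 0.3)`` returns values matching the closed-form results $\E(X)=np=3$ and $\Var(X)=np(1-p)=2.1$ up to floating-point error.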