===========================
Probability Distributions
===========================

There are many probability mass functions and probability density functions for which an analytical expression is known. These functions are parameterized in the sense that only a few parameters define the value for every value $x$. In this section we discuss some of the most often used functions. In a later section we describe a method to estimate the parameters given a sample from a distribution.

Discrete Distributions
----------------------

A discrete distribution is completely characterized by its probability mass function $p_X$. Remember that for a probability mass function the sum over all possible outcomes $x$ must equal 1 and all probabilities must be greater than or equal to zero. Given that function we will calculate (or only state the result of) both the expectation and the variance.

Bernoulli Distribution
......................

.. figure:: /figures/headstails.jpg
   :figwidth: 40%
   :align: right

   **Heads or Tails.** Grammarist explains: "Heads refers to the side of the coin with a person's head on it. Tails refers to the opposite side, not because there is a tail on it, but because it is the opposite of heads."

Perhaps the simplest of all probabilistic experiments is tossing a coin. The outcome is either heads or tails, i.e. 0 or 1. Such an experiment is called a Bernoulli experiment and the corresponding distribution is called the **Bernoulli distribution**. The Bernoulli distribution has one parameter $p$. In case the random variable $X$ is Bernoulli distributed we write $X\sim\Bernoulli(p)$. The probability mass function is:

.. math::

   p_X(x) = \begin{cases}
   1-p &: x=0\\
   p &: x=1\\
   0 &: \text{elsewhere}
   \end{cases}

The expectation equals:

.. math::

   \E(X) &= \sum_{x=-\infty}^{\infty} x\,p_X(x)\\
   &= 0\times (1-p) + 1\times p\\
   &= p

and its variance:

.. math::

   \Var(X) = p(1-p)
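The expectation and variance derived above are easy to check numerically. A minimal sketch, assuming ``numpy`` is available; the parameter value $p=0.3$ and the seed are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(seed=42)
p = 0.3  # illustrative choice for the Bernoulli parameter

# Draw many Bernoulli(p) samples: 1 with probability p, 0 otherwise.
# A Bernoulli experiment is a binomial experiment with n = 1.
samples = rng.binomial(n=1, p=p, size=100_000)

# The sample mean should be close to E(X) = p = 0.3,
# and the sample variance close to Var(X) = p (1 - p) = 0.21.
print(samples.mean())
print(samples.var())
```

With $10^5$ samples the sample mean and variance agree with $p$ and $p(1-p)$ to about two decimal places.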
Binomial Distribution
.....................

.. figure:: /figures/galton.gif
   :figwidth: 50%
   :align: right

   **Galton Board.** A simulation of a Galton board. The distribution of the balls over the bins follows a binomial distribution. Open the animated gif image in a separate window to clearly see the pins that force a falling ball to go either left or right (the Bernoulli experiment). Google for a Galton board video and you can see a physical Galton board in action.

Consider a Bernoulli distributed random variable $Y\sim\Bernoulli(p)$, and let us repeat the experiment $n$ times. What then is the probability of $k$ successes? A success is defined as the outcome $Y=1$ of the Bernoulli experiment. So we define a new random variable $X$ that is the sum of $n$ outcomes of repeated independent and identically distributed (iid) Bernoulli experiments. The outcomes of $X$ run from $0$ to $n$ and the probability of finding $k$ successes is given as:

.. math::

   p_X(k) = P(X=k) = {n \choose k}\,p^k\,(1-p)^{n-k}

This is called the **binomial distribution**. For a random variable $X$ that has a binomial distribution we write $X\sim\Bin(n,p)$. The expectation is:

.. math::

   \E(X) = n\,p

and the variance:

.. math::

   \Var(X) = n\,p\,(1-p)

.. exec_python:: rvPMF rvDiscrete
   :linenumbers:
   :code: shutter
   :Code_label: Show code for figure
   :results: hide

   import numpy as np
   import matplotlib.pyplot as plt
   from scipy.stats import binom

   n = 20
   for p, c in zip([0.05, 0.4, 0.8], ['r', 'g', 'b']):
       plt.stem(np.arange(0, n+1), binom.pmf(np.arange(0, n+1), n, p),
                linefmt=c, markerfmt=c+'o', label=f'p={p}')
   plt.legend()
   plt.xticks(np.arange(0, n+1))
   plt.savefig('source/figures/binomialdistribution.png')

.. figure:: /figures/binomialdistribution.png
   :align: center
   :width: 60%

   **Binomial Distribution.** Shown are the binomial probability mass functions for $n=20$ and $p\in\{0.05, 0.4, 0.8\}$.
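The binomial probability mass function can also be evaluated directly from the formula, which makes its two defining properties easy to verify: it sums to one over $k=0,\ldots,n$, and its expectation equals $n\,p$. A small sketch; the values $n=20$ and $p=0.4$ are the illustrative choices used in the figure above:

```python
from math import comb

# P(X = k) for X ~ Bin(n, p), computed from the formula
# p_X(k) = C(n, k) * p^k * (1 - p)^(n - k).
def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 20, 0.4  # illustrative parameter choices

# The PMF sums to one over k = 0, ..., n.
total = sum(binom_pmf(k, n, p) for k in range(n + 1))

# The expectation sum_k k * p_X(k) equals n * p = 8.
mean = sum(k * binom_pmf(k, n, p) for k in range(n + 1))

print(total)
print(mean)
```

The same values can of course be obtained from ``scipy.stats.binom.pmf``, as done in the plotting code above.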
Uniform Distribution
....................

.. figure:: /figures/dice12.jpg
   :figwidth: 50%
   :align: right

   **Twelve sided die.** Throwing this die results in a number in the range from 1 to 12 showing on top, each with equal probability (in case of a fair die).

The **discrete uniform distribution** is used in case all possible outcomes of a random experiment are equally probable. Let $X\sim\Uniform(a,b)$, with integers $a\leq b$; every outcome $x\in\{a, a+1, \ldots, b\}$ is then equally probable and the probability mass function is:

.. math::

   p_X(x) = \begin{cases}
   \dfrac{1}{b-a+1} &: a \leq x \leq b\\
   0 &: \text{elsewhere}
   \end{cases}
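The defining property is that each of the $b-a+1$ outcomes carries the same probability mass. A short sketch using exact fractions; the range from 1 to 12 mirrors the twelve sided die above:

```python
from fractions import Fraction

# Discrete uniform on {a, ..., b}: each outcome has probability 1/(b - a + 1).
a, b = 1, 12  # a fair twelve sided die
n_outcomes = b - a + 1
pmf = {x: Fraction(1, n_outcomes) for x in range(a, b + 1)}

# The probabilities sum to one, and the expectation is (a + b) / 2 = 13/2.
total = sum(pmf.values())
mean = sum(x * q for x, q in pmf.items())

print(total, mean)
```

Using ``Fraction`` keeps the arithmetic exact, so the sum is exactly 1 rather than approximately 1 in floating point.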