# Joint Distributions
## Two Discrete Distributions
Consider a random experiment where we observe two random variables \(X\) and \(Y\). Assume both RVs are discrete. The probability for the outcome \(X=x\) and \(Y=y\) is given by the **joint probability mass function** \(p_{XY}\):

\[
p_{XY}(x,y) = P(X=x, Y=y)
\]
Again, the probabilities of all possible outcomes of the experiment should sum to 1:

\[
\sum_{x\in\setZ} \sum_{y\in\setZ} p_{XY}(x,y) = 1
\]
Note that we don’t have to run the summation over all of \(\setZ\) in case we know that \(p_{XY}(x,y)=0\) outside a given interval for \(x\) and \(y\).
Let’s consider an example where \(X\) can take the values 1, 2, and 3, and \(Y\) can take the values 1 and 2. Assuming we know all probabilities, we can set up the joint distribution table:
| \(x\) | \(y\) | \(p_{XY}(x,y)\) |
|---|---|---|
| 1 | 1 | 0.10 |
| 1 | 2 | 0.20 |
| 2 | 1 | 0.20 |
| 2 | 2 | 0.25 |
| 3 | 1 | 0.15 |
| 3 | 2 | 0.10 |
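As a quick check, we can encode this table in a small array and verify that the probabilities sum to 1. This is a minimal sketch using NumPy; the variable names are our own choice, not part of the notes.

```python
import numpy as np

# Joint pmf from the table: rows index x = 1, 2, 3; columns index y = 1, 2.
p_xy = np.array([
    [0.10, 0.20],
    [0.20, 0.25],
    [0.15, 0.10],
])

# All probabilities of the experiment should sum to 1.
assert np.isclose(p_xy.sum(), 1.0)
```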
The joint probability mass function (and hence the joint distribution table) is all there is to know about the random experiment. So we may also calculate the marginal probability \(P(X=x)\) from it:

\[
p_X(x) = P(X=x) = \sum_{y\in\setZ} p_{XY}(x,y)
\]
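For instance, from the table above: \(p_X(1) = p_{XY}(1,1) + p_{XY}(1,2) = 0.10 + 0.20 = 0.30\).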
We can also calculate the marginal probability mass function of \(Y\) in the same way:

\[
p_Y(y) = P(Y=y) = \sum_{x\in\setZ} p_{XY}(x,y)
\]
In case \(X\) and \(Y\) are independent we have:

\[
p_{XY}(x,y) = p_X(x)\, p_Y(y) \quad \text{for all } x, y
\]
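Continuing the sketch above, marginalizing amounts to summing the array over one of its axes, and independence can be tested by comparing the joint pmf with the outer product of the marginals. Again this is just an illustration, not part of the original notes.

```python
import numpy as np

# Joint pmf from the table (repeated here so the snippet is self-contained).
p_xy = np.array([
    [0.10, 0.20],
    [0.20, 0.25],
    [0.15, 0.10],
])

p_x = p_xy.sum(axis=1)  # marginal of X: [0.30, 0.45, 0.25]
p_y = p_xy.sum(axis=0)  # marginal of Y: [0.45, 0.55]

# X and Y are independent iff p_xy equals the outer product of the marginals.
independent = np.allclose(p_xy, np.outer(p_x, p_y))
print(independent)  # False: e.g. p_X(1) p_Y(1) = 0.135, but p_XY(1,1) = 0.10
```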
## Two Continuous Distributions
In case \(X\) and \(Y\) are continuous random variables we have a **joint probability density function** \(f_{XY}(x,y)\). Let \(A\subset\setR^2\), then

\[
P\bigl((X,Y)\in A\bigr) = \iint_A f_{XY}(x,y)\,dx\,dy
\]
Note that \(f_{XY}\) is a density function in two variables. Therefore \(f_{XY}(x,y)\,dx\,dy\) is a probability.
The integral of the pdf over \(\setR^2\) should equal 1 (because the universe in this case is \(\setR^2\)):

\[
\iint_{\setR^2} f_{XY}(x,y)\,dx\,dy = 1
\]
Like for the discrete counterpart, we can calculate the marginal density \(f_X\) from \(f_{XY}\):

\[
f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x,y)\,dy
\]
In case \(X\) and \(Y\) are independent we have:

\[
f_{XY}(x,y) = f_X(x)\, f_Y(y) \quad \text{for all } x, y
\]
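To make the continuous case concrete, here is a small numerical sketch. The joint density \(f_{XY}(x,y)=e^{-x-y}\) for \(x,y\ge 0\) (two independent exponential RVs) is our own example, chosen because its normalization, marginals, and factorization are easy to verify with SciPy's quadrature routines.

```python
import numpy as np
from scipy.integrate import dblquad, quad

# Hypothetical joint pdf: f_XY(x, y) = exp(-x - y) for x, y >= 0.
def f_xy(x, y):
    return np.exp(-x - y)

# Normalization: the integral over the whole plane (here the first
# quadrant, since the pdf is zero elsewhere) should be 1.
total, _ = dblquad(lambda y, x: f_xy(x, y), 0, np.inf, 0, np.inf)
print(total)  # ~1.0

# Marginal density of X at x = 0.5: integrate the joint pdf over y.
x0 = 0.5
fx, _ = quad(lambda y: f_xy(x0, y), 0, np.inf)
print(fx, np.exp(-x0))  # both ~0.6065, i.e. f_X(x) = exp(-x)

# Independence: the joint pdf factors as f_X(x) * f_Y(y).
print(np.isclose(f_xy(0.5, 1.2), np.exp(-0.5) * np.exp(-1.2)))  # True
```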
## Multiple Random Variables
Now we consider the case where we observe \(n\) random variables from one random experiment. In case \(X_1,X_2,\ldots,X_n\) are all discrete RVs we have:

\[
p_{X_1 X_2 \cdots X_n}(x_1, x_2, \ldots, x_n) = P(X_1=x_1, X_2=x_2, \ldots, X_n=x_n)
\]
For continuous RVs we have a joint probability density function \(f_{X_1 X_2 \cdots X_n}\), and for \(A\subset\setR^n\):

\[
P\bigl((X_1, X_2, \ldots, X_n)\in A\bigr) = \int \cdots \int_A f_{X_1 X_2 \cdots X_n}(x_1, \ldots, x_n)\,dx_1 \cdots dx_n
\]
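As a final illustration (our own, not from the notes), a joint pmf of \(n\) discrete RVs can be stored as an \(n\)-dimensional array, and marginalizing out a variable is again a sum over the corresponding axis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical joint pmf of n = 3 discrete RVs taking 2, 3 and 4 values:
# a 3-D array of non-negative numbers that sum to 1.
p = rng.random((2, 3, 4))
p /= p.sum()

# Marginal pmf of (X1, X2): sum out X3 (the last axis).
p_12 = p.sum(axis=2)

# Marginal pmf of X1 alone: sum out both X2 and X3.
p_1 = p.sum(axis=(1, 2))

assert np.isclose(p_1.sum(), 1.0)
```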