
Joint Probability Distribution

If X1, X2, … , Xk are random variables defined for the same experiment, their joint probability distribution gives the probability of events determined by the collection of random variables: for any collection of sets of numbers {A1, … , Ak}, the joint probability distribution determines

P( (X1 is in A1) and (X2 is in A2) and … and (Xk is in Ak) ).
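For discrete random variables, this probability is obtained by summing the joint probability mass function over all value combinations that satisfy every condition. A minimal Python sketch of that idea, assuming the joint pmf is stored as a dictionary mapping value tuples to probabilities (the names event_prob and joint_pmf are illustrative, and the example anticipates the two-dice setup described next):

    from fractions import Fraction

    def event_prob(joint_pmf, sets):
        """P(X1 in A1 and ... and Xk in Ak) for a discrete joint pmf.

        joint_pmf maps k-tuples of values to their probabilities;
        sets is the collection (A1, ..., Ak) of sets of numbers.
        """
        return sum(p for values, p in joint_pmf.items()
                   if all(v in a for v, a in zip(values, sets)))

    # Two fair dice rolled independently: X1 = first die, X2 = total of both.
    joint_pmf = {(d1, d1 + d2): Fraction(1, 36)
                 for d1 in range(1, 7) for d2 in range(1, 7)}
    print(event_prob(joint_pmf, [{1, 2}, {4, 5}]))   # P(X1 in {1,2}, X2 in {4,5}) = 4/36 = 1/9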

For example, suppose we roll two fair dice independently. Let X1 be the number of spots that show on the first die, and let X2 be the total number of spots that show on both dice. Then the joint distribution of X1 and X2 is as follows:

P(X1 = 1, X2 = 2) = P(X1 = 1, X2 = 3) = P(X1 = 1, X2 = 4) = P(X1 = 1, X2 = 5) = P(X1 = 1, X2 = 6) = P(X1 = 1, X2 = 7) = 1/36,

P(X1 = 2, X2 = 3) = P(X1 = 2, X2 = 4) = P(X1 = 2, X2 = 5) = P(X1 = 2, X2 = 6) = P(X1 = 2, X2 = 7) = P(X1 = 2, X2 = 8) = 1/36,

and so on, up to

P(X1 = 6, X2 = 7) = P(X1 = 6, X2 = 8) = P(X1 = 6, X2 = 9) = P(X1 = 6, X2 = 10) = P(X1 = 6, X2 = 11) = P(X1 = 6, X2 = 12) = 1/36.

In general, P(X1 = i, X2 = i + j) = 1/36 for i = 1, …, 6 and j = 1, …, 6, and every other pair of values has joint probability zero.
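This table can be verified by enumerating the 36 equally likely outcomes of the two dice. Below is a minimal Python sketch of that check; the dictionary joint and the use of fractions.Fraction for exact arithmetic are illustrative choices, not part of the glossary definition.

    from fractions import Fraction
    from collections import defaultdict

    # Joint pmf of X1 (first die) and X2 (total of both dice), built by
    # enumerating the 36 equally likely outcomes of two fair dice.
    joint = defaultdict(Fraction)
    for d1 in range(1, 7):
        for d2 in range(1, 7):
            joint[(d1, d1 + d2)] += Fraction(1, 36)

    # Each of the 36 attainable pairs (X1 = i, X2 = i + j) has probability 1/36.
    assert len(joint) == 36
    assert all(p == Fraction(1, 36) for p in joint.values())
    print(joint[(1, 2)], joint[(6, 12)])   # 1/36 1/36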

If a collection of random variables is independent, their joint probability distribution is the product of their marginal probability distributions, that is, the individual probability distribution of each variable without regard to the values of the others. In this example, the marginal probability distribution of X1 is

P(X1 = 1) = P(X1 = 2) = P(X1 = 3) = P(X1 = 4) = P(X1 = 5) = P(X1 = 6) = 1/6,

and the marginal probability distribution of X2 is

P(X2 = 2) = P(X2 = 12) = 1/36,

P(X2 = 3) = P(X2 = 11) = 2/36,

P(X2 = 4) = P(X2 = 10) = 3/36,

P(X2 = 5) = P(X2 = 9) = 4/36,

P(X2 = 6) = P(X2 = 8) = 5/36,

P(X2 = 7) = 6/36 = 1/6.
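Each marginal can be recovered from the joint distribution by summing over the values of the other variable. A minimal Python sketch, again assuming the joint pmf is stored as a dictionary of exact probabilities (the names joint, p_x1, and p_x2 are illustrative):

    from fractions import Fraction

    # Joint pmf of X1 (first die) and X2 (total) for two fair dice.
    joint = {(d1, d1 + d2): Fraction(1, 36)
             for d1 in range(1, 7) for d2 in range(1, 7)}

    # Marginal pmfs: sum the joint pmf over the values of the other variable.
    p_x1 = {i: sum(p for (x1, _), p in joint.items() if x1 == i)
            for i in range(1, 7)}
    p_x2 = {k: sum(p for (_, x2), p in joint.items() if x2 == k)
            for k in range(2, 13)}

    print(p_x1[3])            # 1/6
    print(p_x2[2], p_x2[7])   # 1/36 1/6
    print(p_x2[10])           # 1/12, i.e. 3/36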

Note that P(X1 = 1, X2 = 10) = 0, while P(X1 = 1)×P(X2 = 10) = (1/6)(3/36) = 1/72. The joint probability is not equal to the product of the marginal probabilities: X1 and X2 are dependent random variables.
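The same representation makes this dependence check mechanical: a short sketch comparing P(X1 = 1, X2 = 10) with the product of the marginals (the helper names marginal_x1 and marginal_x2 are illustrative).

    from fractions import Fraction

    # Joint pmf of X1 (first die) and X2 (total) for two fair dice.
    joint = {(d1, d1 + d2): Fraction(1, 36)
             for d1 in range(1, 7) for d2 in range(1, 7)}

    def marginal_x1(i):
        return sum(p for (x1, _), p in joint.items() if x1 == i)

    def marginal_x2(k):
        return sum(p for (_, x2), p in joint.items() if x2 == k)

    p_joint = joint.get((1, 10), Fraction(0))         # P(X1 = 1, X2 = 10) = 0
    p_product = marginal_x1(1) * marginal_x2(10)      # (1/6)(3/36) = 1/72
    print(p_joint, p_product, p_joint == p_product)   # 0 1/72 False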


