
Law of Averages

The Law of Averages says that the average of independent observations of random variables that have the same probability distribution is increasingly likely to be close to the expected value of the random variables as the number of observations grows. More precisely, if X1, X2, X3, … are independent random variables with the same probability distribution, and E(X) is their common expected value, then for every number ε > 0,

P{ |(X1 + X2 + … + Xn)/n − E(X)| < ε }

converges to 100% as n grows. This is equivalent to saying that the sequence of sample means

X1, (X1+X2)/2, (X1+X2+X3)/3, …

converges in probability to E(X).
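
To make the definition concrete, here is a minimal Python sketch (an illustration, not part of the entry itself) that estimates the probability above by simulation for a fair six-sided die, where E(X) = 3.5; the particular choices of ε, the number of trials, and the values of n are assumptions made only for the example. The estimated probability climbs toward 100% as n grows.

import numpy as np

# Monte Carlo sketch: estimate P{ |(X1 + ... + Xn)/n - E(X)| < eps }
# for a fair six-sided die, where E(X) = 3.5.
rng = np.random.default_rng(0)
expected_value = 3.5    # E(X) for a fair die
eps = 0.1               # the tolerance epsilon in the definition (assumed value)
trials = 10_000         # independent repetitions used to estimate the probability

for n in [10, 100, 1_000, 10_000]:
    # Each row is one sequence X1, ..., Xn of independent rolls.
    rolls = rng.integers(1, 7, size=(trials, n))
    sample_means = rolls.mean(axis=1)
    prob = np.mean(np.abs(sample_means - expected_value) < eps)
    print(f"n = {n:>6}:  estimated P = {prob:.3f}")

As n increases from 10 to 10,000 the printed estimate rises toward 1, illustrating the convergence in probability of the sample means to E(X).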
