Glossaria.net


Maximum Likelihood Estimate (MLE)

The maximum likelihood estimate of a parameter from data is the possible value of the parameter for which the chance of observing the data is largest. That is, suppose that the parameter is p, and that we observe data x. Then the maximum likelihood estimate of p is

the value q that makes P(observing x when the value of p is q) as large as possible.
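The definition above can be sketched as a search over candidate parameter values: pick the q with the largest likelihood. This is a minimal illustration, not part of the glossary entry; the Poisson likelihood and the observed count x = 4 are hypothetical choices made only to have something concrete to maximize.

```python
from math import exp, factorial

def mle(likelihood, candidates):
    # The MLE is the candidate value q for which
    # P(observing the data when the parameter is q) is largest.
    return max(candidates, key=likelihood)

# Hypothetical example: a single observed count x = 4 from a Poisson
# distribution with unknown mean lam; the likelihood of x is
# lam**x * exp(-lam) / x!.
x = 4
poisson_lik = lambda lam: lam**x * exp(-lam) / factorial(x)

grid = [i / 100 for i in range(1, 1001)]  # candidate values 0.01 .. 10.00
print(mle(poisson_lik, grid))  # 4.0 — the MLE equals the observed count
```

Here the maximizing value coincides with the observed count itself, which is what calculus gives for a single Poisson observation.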

For example, suppose we are trying to estimate the chance that a (possibly biased) coin lands heads when it is tossed. Our data will be the number of times x the coin lands heads in n independent tosses of the coin. The distribution of the number of times the coin lands heads is binomial with parameters n (known) and p (unknown). The chance of observing x heads in n trials if the chance of heads in a given trial is q is nCx q^x (1−q)^(n−x), where nCx is the number of ways of choosing which x of the n tosses land heads. The maximum likelihood estimate of p would be the value of q that makes that chance largest. We can find that value of q explicitly using calculus; it turns out to be q = x/n, the fraction of times the coin is observed to land heads in the n tosses. Thus the maximum likelihood estimate of the chance of heads from the number of heads in n independent tosses of the coin is the observed fraction of tosses in which the coin lands heads.
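The calculus result q = x/n can be checked numerically by evaluating the binomial likelihood over a fine grid of candidate values of q and picking the largest. This is a sketch for verification only; the particular (n, x) pairs are illustrative assumptions, not values from the entry.

```python
from math import comb

def binomial_likelihood(q, n, x):
    # Chance of x heads in n independent tosses when the chance
    # of heads on each toss is q: nCx * q**x * (1-q)**(n-x).
    return comb(n, x) * q**x * (1 - q)**(n - x)

def mle_grid(n, x, steps=10000):
    # Search a grid of candidate values of q in [0, 1] and return
    # the one that makes the likelihood of the data largest.
    grid = [i / steps for i in range(steps + 1)]
    return max(grid, key=lambda q: binomial_likelihood(q, n, x))

# Hypothetical data: 3 heads in 10 tosses.
print(mle_grid(10, 3))  # 0.3, matching x/n
```

For each choice of n and x, the grid maximizer agrees with the observed fraction of heads x/n, as the calculus argument predicts.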

Permanent link Maximum Likelihood Estimate (MLE) - Creation date 2021-08-07
