Glossaria.net


Mean Percentage Error

(MPE) A measure of forecast accuracy calculated by taking the individual forecast errors of the time-series forecasts, dividing each error by the corresponding actual value of the time series, multiplying by 100, and averaging the resulting percentages. The MPE is typically small because positive and negative errors tend to offset each other, so it indicates the direction of forecast bias rather than the overall size of the errors.
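The definition above can be sketched in a few lines of Python. This is an illustrative example with made-up data, using the common convention error = actual − forecast; the function name and values are not part of the glossary entry.

```python
def mean_percentage_error(actual, forecast):
    """MPE: average of 100 * (actual - forecast) / actual over all periods."""
    errors = [100 * (a - f) / a for a, f in zip(actual, forecast)]
    return sum(errors) / len(errors)

# Hypothetical series in which over- and under-forecasts roughly cancel,
# so the MPE comes out close to zero even though individual errors do not.
actual = [100.0, 110.0, 120.0, 130.0]
forecast = [102.0, 108.0, 123.0, 128.0]

print(round(mean_percentage_error(actual, forecast), 3))
```

Note that a near-zero MPE here does not mean the forecasts were exact, only that their percentage errors offset on average.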

Permanent link Mean Percentage Error - Modification date 2019-12-22 - Creation date 2019-12-22
