Mean Squared Error

(MSE) A measure of forecast accuracy calculated by squaring the forecast error for each time-series observation and then averaging those squared errors. Because the errors are squared before they are averaged, the MSE penalizes large errors more heavily than other accuracy measures do.
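A minimal sketch of this calculation in Python; the series values and variable names below are illustrative, not taken from the source.

```python
def mean_squared_error(observed, forecast):
    # Forecast error for each observation, squared, then averaged.
    errors = [obs - fc for obs, fc in zip(observed, forecast)]
    return sum(e * e for e in errors) / len(errors)

observed = [100, 105, 110, 120]
forecast = [98, 104, 112, 100]
# Errors are 2, 1, -2, 20; squaring lets the single large error
# dominate the average: (4 + 1 + 4 + 400) / 4 = 102.25
print(mean_squared_error(observed, forecast))  # 102.25
```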


