Root mean square
The root mean square (RMS) is a way of measuring how much a set of observed numbers deviates from a set of expected numbers.
Assume we have a set of observations, and for each observation we also know the expected value. Now suppose we want to know how much these observations differ from their expected values. One approach would be to calculate the individual differences (expected - observed), add them all up, and divide by the number of observations. The problem here is that some of the observations will be larger than their expected values, which results in negative difference values.
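To illustrate why plainly averaging the differences is misleading, here is a minimal sketch; the expected and observed values are made up for the example:

```python
expected = [10, 10, 10, 10]
observed = [8, 12, 7, 13]

# Plain average of the differences: positive and negative values cancel out.
differences = [e - o for e, o in zip(expected, observed)]
mean_difference = sum(differences) / len(differences)

print(differences)      # [2, -2, 3, -3]
print(mean_difference)  # 0.0 -- suggests a perfect fit, which is clearly wrong
```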
To prevent the negative numbers from canceling out the positive ones, we first take the absolute value of each difference. The average of these absolute values is called the Mean Absolute Error (MAE). The MAE treats all differences equally when it calculates the average; but what if we care more about large differences and less about small ones?
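A minimal sketch of the MAE, reusing the made-up values from above:

```python
expected = [10, 10, 10, 10]
observed = [8, 12, 7, 13]

# Mean Absolute Error: average of the absolute differences.
mae = sum(abs(e - o) for e, o in zip(expected, observed)) / len(expected)

print(mae)  # 2.5
```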
This is where the root mean square is helpful. The RMS first squares the difference values, then calculates the average, and finally takes the square root of the result. The square root brings the result back to the same units as the original differences, and the squaring step also neatly solves our problem of negative numbers. Because the differences are squared, larger differences weigh more heavily in the final value.
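A minimal sketch of the root mean square of the differences, using the same made-up values:

```python
expected = [10, 10, 10, 10]
observed = [8, 12, 7, 13]

# Root mean square: square the differences, average them, take the square root.
mean_square = sum((e - o) ** 2 for e, o in zip(expected, observed)) / len(expected)
rms = mean_square ** 0.5

print(rms)  # ~2.55 -- higher than the MAE because the larger differences weigh more heavily
```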
Algorithm
- For each of the observations:
  - Subtract the observed value from the expected value; this is the difference
  - Square the difference
  - Add the squared difference to a running total
- Divide the total by the number of observations
- Take the square root of the result (see the code sketch below)
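A minimal sketch of these steps in Python, assuming the expected and observed values are passed in as two equal-length lists (the function name rms_error is just for illustration):

```python
import math

def rms_error(expected, observed):
    """Root mean square of the differences between expected and observed values."""
    total = 0.0
    for e, o in zip(expected, observed):
        difference = e - o        # subtract the observed value from the expected value
        total += difference ** 2  # square the difference and add it to the total
    mean_square = total / len(expected)  # divide by the number of observations
    return math.sqrt(mean_square)        # take the square root of the result

print(rms_error([10, 10, 10, 10], [8, 12, 7, 13]))  # ~2.55
```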