Why not use modulus for variance?

by Tiago   Last Updated April 28, 2017 00:19 AM

I am trying to wrap my mind around the variance definition.

Given a set of values S and n = #(S), the variance is defined as:

$$ var(S) = \frac{\sum_{i=1}^{n}( S_i - mean(S) )^2}{n} $$

And that measures the average squared distance of the values from the mean.

However, there is a simpler formula that also measures how far away the values are from the mean:

$$ anotherPossibleDefForVar(S) = \frac{\sum_{i=1}^{n}|S_i - mean(S)|}{n} $$

I am trying to understand the reasoning behind squaring the deviations instead of using the simpler modulus (absolute value) function. Is there a real reason why variance was defined the first way and not the second way?
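To make the comparison concrete, here is a small sketch computing both quantities on a made-up sample (the data values are invented for illustration); the second formula is usually called the mean absolute deviation:

```python
# Compare variance with mean absolute deviation on a sample set.
values = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(values)
mean = sum(values) / n

# Variance: average of SQUARED deviations from the mean.
variance = sum((x - mean) ** 2 for x in values) / n

# Mean absolute deviation: average of ABSOLUTE deviations.
mad = sum(abs(x - mean) for x in values) / n

print(mean)      # 5.0
print(variance)  # 4.0
print(mad)       # 1.5
```

Note how the squaring weights large deviations more heavily: the single point at 9.0 contributes 16 to the variance sum but only 4 to the absolute-deviation sum.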
