by Tiago
Last Updated April 28, 2017 00:19 AM

I am trying to wrap my mind around the variance definition.

Given a set of values **S**, the variance is defined as

$$ \operatorname{var}(S) = \frac{\sum_{i=1}^{n} ( S_i - \operatorname{mean}(S) )^2}{n} $$

This measures how far, on average, the values are from the mean (using squared distances).
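For concreteness, here is a minimal sketch of this definition in Python (assuming the population form, i.e. dividing by n rather than n - 1; the function name `var` is mine, not a standard library one):

```python
from statistics import mean

def var(s):
    """Population variance: the average squared distance from the mean."""
    m = mean(s)
    return sum((x - m) ** 2 for x in s) / len(s)

print(var([2, 4, 4, 4, 5, 5, 7, 9]))  # 4.0
```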

However, there is a simpler formula that also measures how far away the values are from the mean:

$$ \operatorname{anotherPossibleDefForVar}(S) = \frac{\sum_{i=1}^{n} |S_i - \operatorname{mean}(S)|}{n} $$

I am trying to understand the reasoning behind the fact that we square the deviations instead of using the simpler modulus (absolute value) function. Is there a real reason why variance was defined the first way and not the second?
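To see that the two definitions genuinely disagree, here is a sketch computing both on the same sample (`mean_abs_dev` is my name for the alternative definition; `statistics.pvariance` is the standard library's population variance, which matches the first formula):

```python
from statistics import mean, pvariance

def mean_abs_dev(s):
    """The alternative definition: average absolute distance from the mean."""
    m = mean(s)
    return sum(abs(x - m) for x in s) / len(s)

s = [2, 4, 4, 4, 5, 5, 7, 9]
print(pvariance(s))     # 4.0  (squared definition)
print(mean_abs_dev(s))  # 1.5  (absolute-value definition)
```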
