
Really really basic statistics question

  • 28-05-2011 2:58pm
    #1
    Closed Accounts Posts: 47


This may have been asked before, but I didn't find it within the last few pages, and it has me embarrassingly confused (I'm one of those people who never really got along well with math).

I used to think that the standard deviation was the same thing as adding up the distance from each sample to the mean and dividing by the number of samples. (For the record, I do know the correct formula; I just thought this was a quicker way of finding the same thing than squaring and then taking the square root purely to make everything positive.) I only recently found out that this is not the case.

My question is this: is there actually a name for the result you get by taking the difference between each sample and the mean, adding those differences up, and dividing by the total number of samples, i.e. taking the mean of the differences?
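For concreteness, here is a quick numerical sketch of the quantity I mean (the data values are just made up for illustration):

[code]
# The quantity described above: signed deviations from the mean,
# summed and divided by the number of samples. Data is invented.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mean = sum(data) / len(data)               # 5.0
mean_signed_dev = sum(x - mean for x in data) / len(data)
print(mean_signed_dev)                     # 0.0 (up to rounding)
[/code]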


Comments

  • Registered Users, Registered Users 2 Posts: 1,845 ✭✭✭2Scoops


It's called the mean absolute deviation, or something along those lines. It's very rarely used.


  • Closed Accounts Posts: 47 Slouch


Exactly right. Thanks a bunch. I was able to find the explanation in the Wikipedia article on absolute deviation.

    The average absolute deviation, or simply average deviation of a data set is the average of the absolute deviations and is a summary statistic of statistical dispersion or variability. It is also called the mean absolute deviation, but this is easily confused with the median absolute deviation.


Since you mention it, why is it so rarely used? Is it a particularly problematic way of measuring dispersion? Is it more easily skewed by outliers than the standard deviation? (Only guessing, because that seems to be the reason behind most things I don't get about statistics :))
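For what it's worth, here's a quick numerical comparison of the two measures when an outlier turns up (a sketch; the values are invented):

[code]
# Compare mean absolute deviation with (population) standard deviation
# on invented data, before and after adding one extreme outlier.
import statistics

def mean_abs_dev(xs):
    m = statistics.mean(xs)
    return sum(abs(x - m) for x in xs) / len(xs)

clean = [4.0, 5.0, 5.0, 6.0, 5.0, 4.0, 6.0, 5.0]
dirty = clean + [50.0]  # one extreme outlier

for label, xs in [("clean", clean), ("with outlier", dirty)]:
    print(label, round(mean_abs_dev(xs), 2), round(statistics.pstdev(xs), 2))

# clean:        MAD 0.5,  std dev 0.71
# with outlier: MAD 8.89, std dev 14.16
# The squaring makes the standard deviation react more strongly to the
# single extreme value than the mean absolute deviation does.
[/code]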


  • Registered Users, Registered Users 2 Posts: 2,481 ✭✭✭Fremen


Turns out that quantity is always zero. By the definition of the mean,
[latex] \frac{1}{n}\sum_{i=1}^n x_i = \mu[/latex],

    so


[latex] \frac{1}{n}\sum_{i=1}^n (x_i - \mu) = \frac{1}{n}\sum_{i=1}^n x_i - \frac{1}{n}\sum_{i=1}^n \mu = \mu - \mu = 0. [/latex]

If you take the absolute value of each term [latex](x_i - \mu)[/latex] inside the sum above, the result is no longer zero; that quantity is related to the median and the L1 norm.

    I think it's called the first absolute central moment.
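Here's a small numerical sketch of that connection (data and search grid invented): the median minimises the sum of absolute deviations (the L1 objective), while the mean minimises the sum of squared deviations.

[code]
# The median minimises the L1 objective sum(|x - c|); the mean
# minimises the L2 objective sum((x - c)^2). Data is invented.
import statistics

data = [1.0, 2.0, 2.0, 3.0, 10.0]
grid = [i / 100 for i in range(0, 1001)]  # candidate centres 0.00 .. 10.00

best_l1 = min(grid, key=lambda c: sum(abs(x - c) for x in data))
best_l2 = min(grid, key=lambda c: sum((x - c) ** 2 for x in data))

print(best_l1, statistics.median(data))  # 2.0 2.0
print(best_l2, statistics.mean(data))    # 3.6 3.6
[/code]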


  • Registered Users, Registered Users 2 Posts: 2,481 ✭✭✭Fremen


It's rarely used because the standard deviation is much better behaved mathematically. The absolute value is not differentiable at zero, whereas x^2 is differentiable everywhere.

    The square root of the sum of squares can be interpreted nicely as a type of Euclidean distance, so it's a very natural quantity.
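As a quick sanity check of that geometric reading (data invented): the population standard deviation is the Euclidean distance from the data vector to the vector whose entries are all the mean, divided by sqrt(n).

[code]
# Standard deviation as a scaled Euclidean (L2) distance. Data invented.
import math
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(data)
mean = statistics.mean(data)               # 5.0

dist = math.sqrt(sum((x - mean) ** 2 for x in data))  # distance to (mean, ..., mean)
print(dist / math.sqrt(n))                 # 2.0
print(statistics.pstdev(data))             # 2.0, the same number
[/code]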


  • Closed Accounts Posts: 47 Slouch


Thanks Fremen. I think I get the gist of what you're saying, though some of it is going a bit over my head. I might just write some of these formulae out on paper and make sure I'm following them properly, and look back here when my head is clear.

