
2 Statistics Problems I am having trouble with.

  • 04-03-2014 1:28pm
    #1
    Registered Users, Registered Users 2 Posts: 225 ✭✭


    Q1
    A method-of-moments estimator of the population variance, based on a sample of
    size n, is S^2 = (1/n) Σ_{i=1}^{n} (X_i − X̄)^2

    (a) Prove that S^2 is a biased estimator of σ^2.
    Can anyone tell me where to begin proving this? My guess is it involves the mean square thing but I can't for the life of me see how!

    (b) How can the estimator be modified to become unbiased?
    Again any ideas here?


    Q2
    Let Xi ∼ Exp(λ), where i = 1, 2, ..., n, and fX(x; λ) = λe^(−λx) is the PDF of X.
    (a) Derive λ̂, the maximum likelihood estimator of λ.
    (b) For the sample {1, 1, 1, 1, 2, 2, 2, 3, 3, 4, 5} calculate the numeric estimate of λ.
    (c) Assuming that the standard error of λ̂ is λ̂/√n, compute the approximate
    95% confidence interval of λ.

    I really am stuck on this one. I tried integrating the PDF and setting it equal to one, but I wasn't sure whether I was solving for lambda or x at that stage, and with two variables I gave up on that approach. I guess there is a simple way of approaching this but I just can't see it.




    If anyone has any input at all, preferably a nudge in the right direction, I would very much appreciate it!!


Comments

  • Registered Users, Registered Users 2 Posts: 1,328 ✭✭✭Sev


    Q1. a)

    For something like this, my first approach would probably be to put an E[] around the whole expression for S^2. We want to show that E[S^2] != the thing it's supposed to estimate, i.e. E[S^2] != σ^2.

    This might be achieved by breaking the expression up into a sum of components whose expectations E[] you can say something about, remembering that E[x+y] = E[x] + E[y].

    To be honest, I suspect this question is just a book proof about what you would call the "sample variance": the divide-by-n version is a biased estimator of the true variance, and dividing by n−1 instead makes it unbiased. Look at:

    http://en.wikipedia.org/wiki/Bias_of_an_estimator#Sample_variance
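As a quick sanity check (my addition, not part of the original question): numpy's `ddof` parameter toggles exactly the 1/n vs 1/(n−1) normalisation this proof is about, so you can see the two estimators disagree on any concrete sample.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])

# Method-of-moments / divide-by-n estimator (the biased S^2 in the question):
s2_biased = np.var(x, ddof=0)    # sum((x - x.mean())**2) / n

# Bessel-corrected divide-by-(n-1) estimator (the unbiased fix for part b):
s2_unbiased = np.var(x, ddof=1)  # sum((x - x.mean())**2) / (n - 1)

print(s2_biased, s2_unbiased)    # 1.25 vs 5/3 ≈ 1.6667
```

The sum of squared deviations here is 5, so dividing by n = 4 gives 1.25 while dividing by n − 1 = 3 gives 5/3; the (n−1)/n ratio between them is exactly the bias factor the proof produces.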

    Q2)

    Again this is more or less a standard book proof. It's asking you to derive the maximum likelihood estimator of the exponential distribution.

    http://en.wikipedia.org/wiki/Exponential_distribution#Maximum_likelihood

    You need to write out an expression, L, for the likelihood of n arbitrary values Xi. The likelihood L is just the product of (the PDF evaluated at Xi) for each value Xi. You would typically take the logarithm of this expression: it turns the product into a sum, which is easier to handle, while preserving the fact that the maximum of the logarithm of a function occurs at the same point as the maximum of the function itself.
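The derivation Sev describes can be checked symbolically. This sketch (my addition, using sympy) writes the log-likelihood in terms of S = Σ x_i and solves dℓ/dλ = 0:

```python
import sympy as sp

n, lam, S = sp.symbols('n lam S', positive=True)  # S stands for sum(x_i)

# log-likelihood of n iid Exp(lam) observations:
# log( prod_i lam * exp(-lam * x_i) ) = n*log(lam) - lam * S
loglik = n * sp.log(lam) - lam * S

# Maximise: set the derivative to zero and solve for lam
lam_hat = sp.solve(sp.diff(loglik, lam), lam)[0]
print(lam_hat)  # n/S, i.e. lambda-hat = n / sum(x_i) = 1 / xbar
```

The derivative is n/λ − S, so the stationary point is λ̂ = n/S, the reciprocal of the sample mean.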

    To get the maximum of this log-likelihood, you then take the derivative and set it equal to zero and solve for lambda (note: the derivative of a sum is just the sum of the derivatives). This will give you an expression for the lambda that provides the best likelihood (product of pdfs evaluated at those points). Part b just wants you to evaluate your expression with a set of points it has given you.
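For part (b), plugging the sample from the question into λ̂ = 1/x̄ (plain Python, just as a sketch):

```python
sample = [1, 1, 1, 1, 2, 2, 2, 3, 3, 4, 5]
n = len(sample)            # 11
xbar = sum(sample) / n     # 25/11 ≈ 2.273
lam_hat = 1 / xbar         # n / sum(x_i) = 11/25 = 0.44
print(lam_hat)             # 0.44
```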

    Q2. c) is not necessarily related to the rest of the question (MLE estimation). It's just asking you for the confidence interval given that you know the standard deviation of the estimate (a.k.a. the standard error). This is just an exercise in looking up Z-tables.
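Putting part (c) together (a sketch, assuming the standard error is λ̂/√n, the usual asymptotic form for the exponential MLE, and z = 1.96 for a two-sided 95% interval):

```python
import math

lam_hat = 0.44                   # from part (b): 11/25
n = 11
se = lam_hat / math.sqrt(n)      # assumed standard error, ≈ 0.1327
z = 1.96                         # 95% two-sided normal quantile
lo, hi = lam_hat - z * se, lam_hat + z * se
print(lo, hi)                    # roughly (0.18, 0.70)
```

So the approximate 95% interval is λ̂ ± 1.96·λ̂/√n; the only table lookup needed is the 1.96.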

