
Maximum Likelihood of a Geometric distribution

  • 28-03-2012 8:08pm
    #1
    Moderators, Education Moderators, Motoring & Transport Moderators Posts: 7,395 Mod ✭✭✭✭**Timbuk2**


    Suppose I have [latex]X_1, X_2, \ldots, X_n[/latex] iid, where each [latex]X_i[/latex] is Geometric(p). (X is the number of trials until the first success; p is the probability of success on each trial.)

    Thus the pmf is [latex]p_{X_i}(x_i) = (1-p)^{x_i-1}p[/latex].

    I am asked to find the maximum likelihood estimator of p, and going through the usual method I get

    [latex]\hat{p}=\displaystyle\frac{1}{\bar{x}}[/latex]
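    (The "usual method" spelled out, in case anyone wants to check: write down the likelihood, take logs, differentiate, and set to zero.)

    [latex]L(p) = \prod_{i=1}^{n}(1-p)^{x_i-1}p = p^n(1-p)^{\sum_i x_i - n}[/latex]

    [latex]\ell(p) = n\ln p + \left(\sum_i x_i - n\right)\ln(1-p), \qquad \ell'(p) = \frac{n}{p} - \frac{\sum_i x_i - n}{1-p} = 0 \implies \hat{p} = \frac{n}{\sum_i x_i} = \frac{1}{\bar{x}}[/latex]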

    I am then asked to find bias, variance and mean squared error of this estimator.
    However,
    [latex]\text{bias}(\hat{p}) = E[\hat{p}]-p[/latex] where E is the expected value.

    The trouble is that [latex]E[\hat{p}] = \displaystyle E\left[\frac{1}{\bar{x}}\right][/latex], which I don't know how to compute! Writing it out as [latex]\displaystyle nE\left[\frac{1}{\sum X_i}\right][/latex] doesn't help either!

    I can't really calculate Variance or MSE of the estimator either without getting past this problem!
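    In the meantime, a quick Monte Carlo check at least shows the bias numerically. This is a throwaway Python sketch, not part of the assignment; p = 0.3 and n = 10 are just values I picked:

    [code]
    import numpy as np

    rng = np.random.default_rng(0)
    p, n, reps = 0.3, 10, 100_000  # illustrative values only

    # Each row is one sample of n Geometric(p) draws; NumPy's geometric
    # counts trials up to and including the first success (support 1, 2, ...).
    samples = rng.geometric(p, size=(reps, n))
    p_hat = 1.0 / samples.mean(axis=1)  # the MLE 1/x-bar for each sample

    print("E[p_hat] ~", p_hat.mean())            # comes out above p
    print("bias     ~", p_hat.mean() - p)
    print("Var      ~", p_hat.var())
    print("MSE      ~", np.mean((p_hat - p) ** 2))
    [/code]

    The estimated [latex]E[\hat{p}][/latex] comes out noticeably above p, which makes sense: 1/x is convex, so Jensen's inequality gives [latex]E[1/\bar{x}] > 1/E[\bar{x}] = p[/latex].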

    Thanks!


Comments

  • Moderators, Education Moderators, Motoring & Transport Moderators Posts: 7,395 Mod ✭✭✭✭**Timbuk2**


    Wait, I think I might have an idea:

    I want [latex]nE\left[\frac{1}{\sum X_i}\right][/latex].

    As each [latex]X_i[/latex] is Geometric(p), and there are n of them, [latex]\sum X_i[/latex] is NegativeBinomial(n, p) (yes?)

    Let [latex]S=\sum{X_i}[/latex]

    Therefore the pmf of S is [latex]P_S(s) = {{s-1}\choose{n-1}}p^n(1-p)^{s-n}[/latex] for [latex]s \ge n[/latex].

    So [latex]E(1/S) = \displaystyle{\sum_{s=n}^{\infty}\frac{1}{s}{{s-1}\choose{n-1}}p^n(1-p)^{s-n}}[/latex], which is pretty difficult to calculate (unless there's some trick?)
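    I can at least evaluate the series numerically to see what it gives. Another throwaway Python sketch (the truncation point and the values n = 10, p = 0.3 are arbitrary):

    [code]
    from math import comb

    def E_inv_S(n, p, s_max=5000):
        # E(1/S) for S ~ NegBin(n, p), truncating the infinite series at
        # s_max; the terms decay geometrically, so the tail is negligible.
        return sum(
            (1 / s) * comb(s - 1, n - 1) * p**n * (1 - p) ** (s - n)
            for s in range(n, s_max + 1)
        )

    n, p = 10, 0.3  # illustrative values only
    e = E_inv_S(n, p)
    print("E(1/S)              ~", e)
    print("E(p_hat) = n*E(1/S) ~", n * e, "vs p =", p)
    [/code]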

    Any other ideas?


  • Registered Users Posts: 3,745 ✭✭✭Eliot Rosewater


    This does seem quite challenging! I think you're on the right track with the negative binomial, but that summation is difficult. The usual tactic would be to incorporate the 1/s into the combinatorial coefficient, but it doesn't work here... sorry!


  • Registered Users Posts: 338 ✭✭ray giraffe


    Probably uses a trick with partial fractions, [latex]\frac{1}{s} = \frac{1}{s-1} - \frac{1}{s(s-1)}[/latex] or similar...
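    Checking the identity:

    [latex]\frac{1}{s-1}-\frac{1}{s(s-1)} = \frac{s}{s(s-1)}-\frac{1}{s(s-1)} = \frac{s-1}{s(s-1)} = \frac{1}{s}[/latex]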


  • Registered Users Posts: 3,745 ✭✭✭Eliot Rosewater


    That 1/(s(s-1)) term is still problematic, though. I tried a few Googles ... but this thread is now the 6th result! :D You could perhaps try http://math.stackexchange.com/ -- you will more than likely get an answer.


  • Moderators, Education Moderators, Motoring & Transport Moderators Posts: 7,395 Mod ✭✭✭✭**Timbuk2**


    Thanks for the answers!

    I found this after a few Google searches: http://statistics.stanford.edu/~ckirby/techreports/ONR/SOL%20ONR%20130.pdf (see page 11 onwards), which seems to solve this problem, but it's very tricky, possibly a bit beyond 2nd year statistics!

    The lecturer has since said that the problem is solvable as posed, but it's more difficult than he anticipated, so he proposes that we instead estimate [latex]\tau = \frac{1}{p}[/latex].

    This certainly seems easier: by the same method (or just by the invariance property of the MLE), [latex]\hat{\tau}=\frac{1}{\hat{p}}=\bar{x}[/latex], and thus

    [latex]E(\hat{\tau}) = \frac{1}{n}E(W) = \frac{1}{n}\cdot\frac{n}{p} = \frac{1}{p}[/latex], where [latex]W = \displaystyle\sum_{i = 1}^n X_i \sim \text{NegBin}(n, p)[/latex]

    Is this valid, though? 1/p is not exactly a parameter - I am estimating the mean, I suppose! Funny how it turns out to be much easier to work out (and in fact [latex]\bar{x}[/latex] is an unbiased estimator of 1/p, whereas the MLE of p doesn't seem to be unbiased).
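    A quick simulation (same sort of throwaway Python sketch as above, illustrative values again) agrees: [latex]\bar{x}[/latex] lands right on 1/p, while [latex]1/\bar{x}[/latex] overshoots p:

    [code]
    import numpy as np

    rng = np.random.default_rng(1)
    p, n, reps = 0.3, 10, 200_000  # illustrative values only

    samples = rng.geometric(p, size=(reps, n))
    tau_hat = samples.mean(axis=1)  # x-bar, the estimator of tau = 1/p
    p_hat = 1.0 / tau_hat           # 1/x-bar, the MLE of p

    print("E[tau_hat] ~", tau_hat.mean(), "vs 1/p =", 1 / p)  # essentially equal
    print("E[p_hat]   ~", p_hat.mean(), "vs p =", p)          # biased upward
    [/code]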

