
Covariance - what is happening here

  • 01-11-2011 6:05pm
    #1
    Registered Users Posts: 246 ✭✭


    I found an example of a covariance proof in a book, but I don't understand one of the steps.

    The model is [latex]E(y_i) = \beta_0 + \beta_1x_i[/latex]
    and I want to find the covariance between [latex]y_i[/latex] and [latex]\hat{\beta_1}[/latex]

    So the book writes:
    [latex]\mathrm{Cov}(y_i, \hat{\beta_1}) = \frac{1}{S_x}\,\mathrm{Cov}(y_i, S_{xy})[/latex], pulling the [latex]S_x[/latex] out of the covariance (since [latex]\hat{\beta_1} = S_{xy}/S_x[/latex]).
    Then [latex]\mathrm{Cov}(y_i, \hat{\beta_1}) = \frac{1}{S_x}\,\mathrm{Cov}\!\left(y_i, \displaystyle\sum_{j}y_j(x_j - \bar{x})\right)[/latex]
    Then [latex]\mathrm{Cov}(y_i, \hat{\beta_1}) = \frac{1}{S_x}\,\mathrm{Cov}(y_i, y_i(x_i - \bar{x}))[/latex]

    What did they do to get to the last line from the previous line - i.e. how did they get rid of the summation sign?

    Thanks,
    Jack.


Comments

  • Registered Users Posts: 168 ✭✭Colours


    Hi Jack,

    I don't recall the underlying relationships between the variables and formulae in any detail (having done some study in this area quite a few moons ago), but it looks to me like the summation sign can be done away with in the last line because all the terms of the form Cov(y_i, y_j(x_j - xbar)) are zero EXCEPT when j = i: the y's are assumed independent (or at least uncorrelated), so Cov(y_i, y_j) = 0 whenever j ≠ i. The sum over j can therefore be replaced by its only non-zero term, Cov(y_i, y_i(x_i - xbar)).
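
    For completeness, here is a short worked version of that step, assuming the y_j are uncorrelated with common variance [latex]\sigma^2[/latex] (the standard simple linear regression assumption, which the book presumably makes). Pulling the constants [latex](x_j - \bar{x})[/latex] out of the covariance and keeping only the [latex]j = i[/latex] term:

    [latex]\mathrm{Cov}\!\left(y_i, \displaystyle\sum_{j}y_j(x_j - \bar{x})\right) = \displaystyle\sum_{j}(x_j - \bar{x})\,\mathrm{Cov}(y_i, y_j) = (x_i - \bar{x})\,\mathrm{Var}(y_i) = (x_i - \bar{x})\,\sigma^2[/latex]

    Dividing by [latex]S_x[/latex] then gives [latex]\mathrm{Cov}(y_i, \hat{\beta_1}) = \dfrac{(x_i - \bar{x})\,\sigma^2}{S_x}[/latex].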

