Deriving the Least Squares Estimators

Part 1: Deriving the Least Squares Estimators
Recall that in linear least squares we estimate the ‘line of best fit’, which minimizes the ‘sum of squared deviations’ of the data. This is equivalent to choosing the parameters (the intercept $\beta_0$ and the slope $\beta_1$) which minimize the following function of two variables (the sum of squared residuals, SSR):
\[
S(\beta_0, \beta_1) = \sum_{i=1}^{n} \bigl(y_i - \beta_0 - \beta_1 x_i\bigr)^2 .
\]
You will use calculus to show that $S(\beta_0, \beta_1)$ is minimized for the choices:
\[
\hat\beta_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2},
\qquad
\hat\beta_0 = \bar{y} - \hat\beta_1 \bar{x},
\]
where $\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$ and $\bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i$.
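As a quick numerical sanity check on these closed-form estimators (not part of the assignment itself), the sketch below compares them against NumPy's built-in degree-1 polynomial fit; the data values and variable names are purely illustrative assumptions.

    import numpy as np

    # Small illustrative data set (made up for this check).
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    xbar, ybar = x.mean(), y.mean()

    # Closed-form least squares estimators from the formulas above.
    beta1_hat = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
    beta0_hat = ybar - beta1_hat * xbar

    # np.polyfit returns the coefficients highest degree first: (slope, intercept).
    slope, intercept = np.polyfit(x, y, 1)
    assert np.isclose(beta1_hat, slope) and np.isclose(beta0_hat, intercept)
    print(beta0_hat, beta1_hat)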
You may assume the following summation properties in your solution (see Lecture 5), with a short illustration after the list:
If $k$ is a constant, then $\sum_{i=1}^{n} k = nk$.
Given $a_i$ and $b_i$ for $i = 1, \dots, n$, then $\sum_{i=1}^{n} (a_i + b_i) = \sum_{i=1}^{n} a_i + \sum_{i=1}^{n} b_i$.
Differentiating a summation (the ‘derivative of the sum’ equals the ‘sum of the derivatives’): $\frac{d}{d\theta} \sum_{i=1}^{n} f_i(\theta) = \sum_{i=1}^{n} \frac{d}{d\theta} f_i(\theta)$.
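For example (an illustration only, combining the constant rule and the additivity rule), the ‘total deviation about the mean’ fact quoted in Q1 below follows directly:
\[
\sum_{i=1}^{n} (y_i - \bar{y}) = \sum_{i=1}^{n} y_i - \sum_{i=1}^{n} \bar{y} = n\bar{y} - n\bar{y} = 0 .
\]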
In addition, in any question in Part 1 you may assume the results of all previous questions, even if you were unable to prove them.
Q1 (1 mark): Recall from lectures that the ‘total deviation about the mean’ is always zero, i.e. $\sum_{i=1}^{n} (y_i - \bar{y}) = 0$. Expand the summation across the brackets and apply this result to the term involving $\bar{x}$ to prove that:
\[
\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) = \sum_{i=1}^{n} x_i (y_i - \bar{y}) .
\]
Ans:
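A possible sketch of the argument (not a model answer): expanding the first bracket,
\[
\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})
= \sum_{i=1}^{n} x_i (y_i - \bar{y}) - \bar{x} \sum_{i=1}^{n} (y_i - \bar{y})
= \sum_{i=1}^{n} x_i (y_i - \bar{y}),
\]
since the second sum is the total deviation of the $y_i$ about their mean, which is zero.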
Q2 (1 mark): Prove that $\sum_{i=1}^{n} (x_i - \bar{x})^2 = \sum_{i=1}^{n} x_i (x_i - \bar{x})$. (Hint: use the result of Q1.)
Ans:
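A possible sketch: applying the Q1 identity with each $y_i$ replaced by $x_i$ (so that $\bar{y}$ becomes $\bar{x}$),
\[
\sum_{i=1}^{n} (x_i - \bar{x})^2 = \sum_{i=1}^{n} (x_i - \bar{x})(x_i - \bar{x}) = \sum_{i=1}^{n} x_i (x_i - \bar{x}) .
\]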
Q3 (2 marks): By differentiating the summation, show that $\dfrac{\partial S}{\partial \beta_0} = -2 \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)$. Then show that $\dfrac{\partial S}{\partial \beta_0} = 0$ when $\beta_0 = \bar{y} - \beta_1 \bar{x}$.
Ans:
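A possible sketch: differentiating term by term with $\beta_1$ held fixed,
\[
\frac{\partial S}{\partial \beta_0}
= \sum_{i=1}^{n} \frac{\partial}{\partial \beta_0} (y_i - \beta_0 - \beta_1 x_i)^2
= -2 \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i).
\]
Setting this to zero gives $\sum_{i=1}^{n} y_i - n\beta_0 - \beta_1 \sum_{i=1}^{n} x_i = 0$, i.e. $n\bar{y} - n\beta_0 - n\beta_1 \bar{x} = 0$, so $\beta_0 = \bar{y} - \beta_1 \bar{x}$.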
Q4 (1 mark): Differentiate the summation with respect to $\beta_1$ to get a summation expression for $\dfrac{\partial S}{\partial \beta_1}$.
Ans:
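A possible sketch: differentiating term by term with $\beta_0$ held fixed,
\[
\frac{\partial S}{\partial \beta_1}
= \sum_{i=1}^{n} \frac{\partial}{\partial \beta_1} (y_i - \beta_0 - \beta_1 x_i)^2
= -2 \sum_{i=1}^{n} x_i (y_i - \beta_0 - \beta_1 x_i).
\]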
Q5 (2 marks): By setting $\dfrac{\partial S}{\partial \beta_1} = 0$, show that you obtain the least squares estimator for the second parameter, namely $\hat\beta_1 = \dfrac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}$. (Hint: you will have to substitute $\beta_0 = \bar{y} - \beta_1 \bar{x}$, and use the previous results.)
Ans:
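A possible sketch: setting the Q4 expression to zero and substituting $\beta_0 = \bar{y} - \beta_1 \bar{x}$,
\[
\sum_{i=1}^{n} x_i \bigl(y_i - \bar{y} + \beta_1 \bar{x} - \beta_1 x_i\bigr) = 0
\quad\Longrightarrow\quad
\sum_{i=1}^{n} x_i (y_i - \bar{y}) = \beta_1 \sum_{i=1}^{n} x_i (x_i - \bar{x}).
\]
By Q1 and Q2 the two sums equal $\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})$ and $\sum_{i=1}^{n} (x_i - \bar{x})^2$ respectively, which gives the stated estimator $\hat\beta_1$.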
Q6 (1 mark): Use the appropriate second-order test from Mathematics to show that the above choices for $(\hat\beta_0, \hat\beta_1)$ give rise to a minimum for $S(\beta_0, \beta_1)$.
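A possible sketch, using the second-derivative (Hessian) test for functions of two variables:
\[
\frac{\partial^2 S}{\partial \beta_0^2} = 2n, \qquad
\frac{\partial^2 S}{\partial \beta_1^2} = 2 \sum_{i=1}^{n} x_i^2, \qquad
\frac{\partial^2 S}{\partial \beta_0 \,\partial \beta_1} = 2 \sum_{i=1}^{n} x_i,
\]
so the discriminant is
\[
D = \frac{\partial^2 S}{\partial \beta_0^2} \frac{\partial^2 S}{\partial \beta_1^2} - \Bigl(\frac{\partial^2 S}{\partial \beta_0 \,\partial \beta_1}\Bigr)^2
= 4n \sum_{i=1}^{n} x_i^2 - 4 \Bigl(\sum_{i=1}^{n} x_i\Bigr)^2
= 4n \sum_{i=1}^{n} (x_i - \bar{x})^2 > 0
\]
(provided the $x_i$ are not all equal). Since $D > 0$ and $\partial^2 S / \partial \beta_0^2 = 2n > 0$, the stationary point $(\hat\beta_0, \hat\beta_1)$ is a minimum of $S$.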

 

"Is this question part of your assignment? We Can Help!"