Answer:

Step-by-step explanation:

Consider a multiple linear regression model given by:

[tex]y = X\beta + e[/tex]

We assume that we have p independent variables, so the coefficient vector (including the intercept) has dimension k = p+1, and the design matrix X has dimension n x k, where n is the number of observations.

For this case we want to minimize the sum of squared errors [tex]\sum_{i=1}^n e^2_i[/tex]

And we have this:

[tex] min e'e = (y-Xb)' (y-Xb)[/tex]

[tex] min e'e= y'y -2b'X'y +b'X'Xb[/tex]

Taking the partial derivative with respect to b and setting it equal to zero, we get:

[tex]\frac{d(e'e)}{db} =-2X'y + 2X'Xb =0[/tex]

[tex] X'X b= X' y[/tex]

We are assuming that the matrix X'X is invertible (non-singular). Multiplying both sides by the inverse of X'X, we get:

[tex]b =\hat \beta =(X'X)^{-1} X'y[/tex]

So the estimator for [tex]\beta[/tex] is given by:

[tex] \hat \beta = (X^T X)^{-1} X^T y[/tex]
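As a quick numerical sketch (using NumPy and synthetic data of my own choosing), the closed-form estimator above can be checked against a library least-squares solver:

```python
import numpy as np

# Synthetic example: p = 2 predictors plus an intercept column
rng = np.random.default_rng(0)
n, p = 50, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])  # n x (p+1) design matrix
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Closed-form OLS estimator: beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y

# np.linalg.lstsq solves the same least-squares problem (more stably in practice)
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(beta_hat, beta_lstsq))  # the two solutions agree
```

In practice one would use a solver rather than forming the explicit inverse, but the formula is the same estimator.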

And the predicted values can be written like this:

[tex] \hat y = X \hat \beta =X(X^T X)^{-1} X^T y= Hy[/tex]

And we can see that [tex] H=X(X^T X)^{-1} X^T[/tex]. For the dimensions of H: since X is n x k, the product X(X^T X)^{-1} X^T is an n x n matrix, and it depends only on the matrix X.

The elements [tex]h_{ii}[/tex] are the diagonal entries of the matrix H (the leverages), and H is used to obtain the fitted values [tex]\hat y[/tex].
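The key properties of the hat matrix can also be verified numerically. A minimal sketch (assuming NumPy and an arbitrary n x k design matrix of my own construction):

```python
import numpy as np

# Build an example n x k design matrix with an intercept column
rng = np.random.default_rng(1)
n, k = 30, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = rng.normal(size=n)

# Hat matrix H = X (X'X)^{-1} X' -- n x n even though X is n x k
H = X @ np.linalg.inv(X.T @ X) @ X.T

y_hat = H @ y            # fitted values obtained via H
leverages = np.diag(H)   # the h_ii diagonal elements

print(H.shape)                  # (30, 30): square, n x n
print(np.allclose(H, H.T))      # H is symmetric
print(np.allclose(H @ H, H))    # H is idempotent (projecting twice changes nothing)
```

The diagonal entries h_ii sum to k (the trace of a projection matrix equals its rank), which is one way leverages are used in regression diagnostics.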