Generalized Least Squares

Consider the full rank parameterization $Y = X\beta + e$, where $X$ is $n \times p$ of rank $p$, $E(e) = 0$, and $\text{Cov}(e) = \sigma^2 V$ with $V$ positive definite.

By the SVD of $V$, write $V = \Gamma \Lambda \Gamma'$ and set $V^{-1/2} = \Gamma \Lambda^{-1/2} \Gamma'$. Transforming to $Z = V^{-1/2}Y$ and $W = V^{-1/2}X$ gives $Z = W\beta + V^{-1/2}e$ with $\text{Cov}(V^{-1/2}e) = \sigma^2 I$.

The projection matrix is $W(W'W)^{-1}W'$, which is symmetric, and hence is an orthogonal projection.

Now all computations have been done in the $Z$ coordinates, so in particular $\hat{\beta} = (W'W)^{-1}W'Z = (X'V^{-1}X)^{-1}X'V^{-1}Y$ estimates $\beta$.

Since linear combinations of Gauss-Markov estimates are Gauss-Markov, it follows immediately that $\hat{\beta}_{GLS} = (X'V^{-1}X)^{-1}X'V^{-1}Y$ is the Gauss-Markov estimate of $\beta$.
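The whitening argument above can be checked numerically. In the following sketch, the dimensions, the random design $X$, and the randomly generated positive definite $V$ are all illustrative assumptions, not part of the notes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy full-rank design X and a positive definite covariance V (illustrative).
n, p = 8, 3
X = rng.standard_normal((n, p))
B = rng.standard_normal((n, n))
V = B @ B.T + n * np.eye(n)            # positive definite by construction
Y = rng.standard_normal(n)

# SVD / eigendecomposition of V: V = G diag(lam) G', giving V^{-1/2}.
lam, G = np.linalg.eigh(V)
V_inv_half = G @ np.diag(lam ** -0.5) @ G.T

# Transformed model Z = W beta + error with covariance sigma^2 I;
# ordinary least squares in the Z coordinates.
Z, W = V_inv_half @ Y, V_inv_half @ X
beta_Z = np.linalg.lstsq(W, Z, rcond=None)[0]

# Direct GLS formula (X' V^{-1} X)^{-1} X' V^{-1} Y.
Vi = np.linalg.inv(V)
beta_gls = np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ Y)

agree = np.allclose(beta_Z, beta_gls)
```

OLS on the whitened coordinates reproduces the direct GLS formula, which is exactly the claim of the transformation argument.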

A direct solution via inner products

We can approach the problem of determining the generalized least squares estimators in a different way, by viewing $V^{-1}$ as determining an inner product.

We do this by returning to first principles, carefully defining means and covariances in a general inner product space.

Let the space be $\mathbb{R}^n$ and let $\langle x, y \rangle = x'y$ be the usual inner product.

Choose a basis $e_1, \dots, e_n$, the usual coordinate vectors. Then a random vector $Y$ has coordinates $Y_i = \langle e_i, Y \rangle$.

  • Definition 1. The mean of $Y$ is the vector $\mu$ with coordinates $\mu_i = E(Y_i)$, where $Y_i = \langle e_i, Y \rangle$. For any $x$,

$\langle x, \mu \rangle = \sum_i x_i E(Y_i) = E\langle x, Y \rangle.$

Thus, another characterization of $\mu$ is: $\mu$ is the unique vector that satisfies $\langle x, \mu \rangle = E\langle x, Y \rangle$ for all $x$.

Now, turn to the covariance. Use the same set-up as above. If $E\|Y\|^2 < \infty$, then $\text{Cov}(\langle x, Y \rangle, \langle y, Y \rangle)$ exists for all $x, y$, and defines a symmetric bilinear function of $x$ and $y$.

For any $x, y$,

$\text{Cov}(\langle x, Y \rangle, \langle y, Y \rangle) = \sum_{i,j} x_i y_j \text{Cov}(Y_i, Y_j) = \langle x, \Sigma y \rangle$, where $\Sigma_{ij} = \text{Cov}(Y_i, Y_j)$.

  • Definition 2

Assume $E\|Y\|^2 < \infty$. The unique non-negative definite linear transformation $\Sigma$ that satisfies $\langle x, \Sigma y \rangle = \text{Cov}(\langle x, Y \rangle, \langle y, Y \rangle)$ for all $x, y$ is called the covariance of $Y$ and is denoted $\text{Cov}(Y)$.

  • Theorem 1

Let $(\mathbb{R}^n, \langle \cdot, \cdot \rangle)$ with inner product $\langle x, y \rangle = x'y$ and $\text{Cov}(Y) = \Sigma$. Define another inner product on $\mathbb{R}^n$ by $[x, y] = \langle x, Ay \rangle$ for some positive definite $A$. Then the covariance of $Y$ in the inner product space $(\mathbb{R}^n, [\cdot,\cdot])$ is $\Sigma A$.

  • Note 1: This shows that if $\text{Cov}(Y)$ exists in one inner product, it exists in all inner products.

If $\text{Cov}(Y) = \sigma^2 V$ in $(\mathbb{R}^n, \langle \cdot, \cdot \rangle)$, then in the inner product $[x, y] = \langle x, V^{-1} y \rangle$, the covariance is $\sigma^2 V V^{-1} = \sigma^2 I$.
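Theorem 1 can be verified numerically against the defining property of the covariance: the candidate $\Sigma A$ must satisfy $[x, \Sigma A\, y] = \text{Cov}([x, Y], [y, Y]) = x'A\Sigma A y$. The random matrices below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Sigma: Cov(Y) in the usual inner product; A: positive definite, [x,y] = <x, Ay>.
M = rng.standard_normal((n, n)); Sigma = M @ M.T
N = rng.standard_normal((n, n)); A = N @ N.T + n * np.eye(n)

Sigma_new = Sigma @ A                     # claimed covariance in [.,.]

# Defining property: [x, Sigma_new y] = Cov([x,Y],[y,Y]) = x' A Sigma A y.
x, y = rng.standard_normal(n), rng.standard_normal(n)
lhs = x @ A @ (Sigma_new @ y)             # [x, Sigma_new y]
rhs = x @ (A @ Sigma @ A) @ y             # covariance of [x,Y] and [y,Y]

ok = np.isclose(lhs, rhs)
```

The check is algebraic, so it holds for any $x, y$; the random pair here is just one instance.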

  • Theorem 2

Suppose $E\|Y\|^2 < \infty$ in $(\mathbb{R}^n, \langle \cdot, \cdot \rangle)$. If $S$ is symmetric on $\mathbb{R}^n$, and $\langle x, Sx \rangle = \text{Var}(\langle x, Y \rangle)$ for all $x$, then $S = \text{Cov}(Y)$. This implies that the covariance is unique.

Consider the inner product space given by $(\mathbb{R}^n, [\cdot,\cdot])$, where $[x, y] = x'V^{-1}y$, and $\text{Cov}(Y) = \sigma^2 V$ in the usual inner product, so that $\text{Cov}(Y) = \sigma^2 I$ in this space.

Let $A$ be the projection on $\mathcal{C}(X)$ in this inner product space, and let $\hat{Y} = AY$, so $\hat{Y} = X\hat{\beta}_{GLS}$.

  • Theorem 3

$A = X(X'V^{-1}X)^{-1}X'V^{-1}$, with $[Ax, y] = [x, Ay]$ for all $x, y$, is an orthogonal projection in $(\mathbb{R}^n, [\cdot,\cdot])$.
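Theorem 3 amounts to two checkable facts: $A$ is idempotent, and self-adjointness in $[\cdot,\cdot]$ reduces to the matrix identity $A'V^{-1} = V^{-1}A$. A sketch with illustrative toy sizes and a random positive definite $V$:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 7, 3
X = rng.standard_normal((n, p))
M = rng.standard_normal((n, n))
V = M @ M.T + n * np.eye(n)            # positive definite by construction
Vi = np.linalg.inv(V)

# A = X (X' V^{-1} X)^{-1} X' V^{-1}
A = X @ np.linalg.solve(X.T @ Vi @ X, X.T @ Vi)

idempotent = np.allclose(A @ A, A)             # A is a projection
self_adjoint = np.allclose(A.T @ Vi, Vi @ A)   # [Ax, y] = [x, Ay] for all x, y
```

Note that $A$ itself is not symmetric in general; it is symmetric only with respect to the $[\cdot,\cdot]$ inner product.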

  • Theorem 4

Let $\hat{\beta}_{OLS} = (X'X)^{-1}X'Y$ denote the OLS estimate and $\hat{\beta}_{GLS} = (X'V^{-1}X)^{-1}X'V^{-1}Y$ the GLS estimate. Then $\hat{\beta}_{OLS} = \hat{\beta}_{GLS}$ for all $Y$ if and only if $\mathcal{C}(VX) = \mathcal{C}(X)$.

  • Corollary 1

Equivalently, $\hat{\beta}_{OLS} = \hat{\beta}_{GLS}$ for all $Y$ if and only if $VX = XB$ for some nonsingular $B$. So $V$ need not be inverted to apply the theory.
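Kruskal's equivalence can be illustrated numerically. In the sketch below, $V = I + XX'$ satisfies $VX = X(I + X'X)$, so each column of $VX$ lies in $\mathcal{C}(X)$ and OLS and GLS coincide, while a generic positive definite $V$ breaks the agreement. All data are toy values chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 9, 3
X = rng.standard_normal((n, p))
Y = rng.standard_normal(n)

def gls(X, V, Y):
    """GLS estimate (X' V^{-1} X)^{-1} X' V^{-1} Y."""
    Vi = np.linalg.inv(V)
    return np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ Y)

beta_ols = np.linalg.lstsq(X, Y, rcond=None)[0]

V_good = np.eye(n) + X @ X.T           # VX = X(I + X'X), so C(VX) = C(X)
M = rng.standard_normal((n, n))
V_generic = M @ M.T + n * np.eye(n)    # no special relation to C(X)

agree = np.allclose(gls(X, V_good, Y), beta_ols)
differ = not np.allclose(gls(X, V_generic, Y), beta_ols)
```

Note that only $V$, never $V^{-1}$, is needed to check the condition $\mathcal{C}(VX) = \mathcal{C}(X)$; the inverse appears only inside the GLS formula itself.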

To use this equivalence theorem (due to W. Kruskal), we usually characterize the $V$'s for a given $X$ for which $\hat{\beta}_{OLS} = \hat{\beta}_{GLS}$.

If $X$ is completely arbitrary, then only $V = \sigma^2 I$ works.

  • Intra-class correlation model:

Let $X$ contain $\mathbf{1}$, the column of 1's. Then any $V$ of the form

$V = (1 - \rho)I + \rho J$, where $J = \mathbf{1}\mathbf{1}'$,

with $-\frac{1}{n-1} < \rho < 1$ will work.

To apply the theorem, we write

$VX = [(1 - \rho)I + \rho \mathbf{1}\mathbf{1}']X,$

so for $i > 1$, the $i$-th column of $VX$ is

$Vx_i = (1 - \rho)x_i + \rho(\mathbf{1}'x_i)\mathbf{1},$

with $x_i$ the $i$-th column of $X$.

Thus, the $i$-th column of $VX$ is a linear combination of the $i$-th column of $X$ and the column of 1's, and hence lies in $\mathcal{C}(X)$.

For the first column of $X$, we compute $J\mathbf{1} = n\mathbf{1}$ and $V\mathbf{1} = [1 + (n - 1)\rho]\mathbf{1}$. So $\mathcal{C}(VX) = \mathcal{C}(X)$ as required, provided that $\rho \neq 1$ and $\rho \neq -\frac{1}{n-1}$ (so that $V$ is nonsingular).
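The intra-class correlation example can be confirmed numerically. In this sketch the sample size, $\rho$, and the extra regressors are illustrative assumptions; the design includes an intercept column as the argument requires:

```python
import numpy as np

rng = np.random.default_rng(4)
n, rho = 10, 0.4
one = np.ones(n)
X = np.column_stack([one, rng.standard_normal((n, 2))])   # intercept included
Y = rng.standard_normal(n)

# Intra-class correlation covariance: V = (1 - rho) I + rho J, J = 11'.
V = (1 - rho) * np.eye(n) + rho * np.outer(one, one)

# First column of VX: V 1 = [1 + (n-1) rho] 1, which stays in C(X).
first_col_ok = np.allclose(V @ one, (1 + (n - 1) * rho) * one)

# OLS and GLS agree, as the equivalence theorem predicts.
Vi = np.linalg.inv(V)
beta_gls = np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ Y)
beta_ols = np.linalg.lstsq(X, Y, rcond=None)[0]
estimates_agree = np.allclose(beta_ols, beta_gls)
```

Dropping the intercept column from $X$ would break the agreement, since $Vx_i$ would then involve $\mathbf{1} \notin \mathcal{C}(X)$.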