Multivariate Multiple Regression (wk6)
Overview
Recall:
Univariate linear regression:
response variable $Y$, predictor variables $z_1, \dots, z_r$.
- model: $Y_j = \beta_0 + \beta_1 z_{j1} + \cdots + \beta_r z_{jr} + \varepsilon_j$, i.e. $\mathbf{y} = \mathbf{Z}\boldsymbol\beta + \boldsymbol\varepsilon$ with $E(\boldsymbol\varepsilon) = \mathbf{0}$, $\mathrm{Cov}(\boldsymbol\varepsilon) = \sigma^2\mathbf{I}$.
- estimation: $\hat{\boldsymbol\beta} = (\mathbf{Z}'\mathbf{Z})^{-1}\mathbf{Z}'\mathbf{y}$, $s^2 = \hat{\boldsymbol\varepsilon}'\hat{\boldsymbol\varepsilon}/(n-r-1)$ with $\hat{\boldsymbol\varepsilon} = \mathbf{y} - \mathbf{Z}\hat{\boldsymbol\beta}$.
- inference:
let $\boldsymbol\varepsilon \sim N_n(\mathbf{0}, \sigma^2\mathbf{I})$. Then $\hat{\boldsymbol\beta} \sim N_{r+1}\big(\boldsymbol\beta, \sigma^2(\mathbf{Z}'\mathbf{Z})^{-1}\big)$ and $(n-r-1)s^2/\sigma^2 \sim \chi^2_{n-r-1}$, independently of $\hat{\boldsymbol\beta}$.
Multivariate Multiple Regression
- now there are $m$ responses $Y_1, \dots, Y_m$, each regressed on the same set of $r$ predictors $z_1, \dots, z_r$, observed over $n$ trials.
Notation
- $\mathbf{Y}_{n \times m} = [\mathbf{Y}_{(1)}, \dots, \mathbf{Y}_{(m)}]$: responses; $\mathbf{Z}_{n \times (r+1)}$: design matrix; $\boldsymbol\beta_{(r+1) \times m} = [\boldsymbol\beta_{(1)}, \dots, \boldsymbol\beta_{(m)}]$: coefficients; $\boldsymbol\varepsilon_{n \times m} = [\boldsymbol\varepsilon_{(1)}, \dots, \boldsymbol\varepsilon_{(m)}]$: errors.
Model
- $\mathbf{Y} = \mathbf{Z}\boldsymbol\beta + \boldsymbol\varepsilon$, with $E(\boldsymbol\varepsilon_{(i)}) = \mathbf{0}$.
- Cov of responses: $\mathrm{Cov}(\boldsymbol\varepsilon_{(i)}, \boldsymbol\varepsilon_{(k)}) = \sigma_{ik}\mathbf{I}_n$, $i, k = 1, \dots, m$.
- equivalently, the rows $\boldsymbol\varepsilon_j'$ of $\boldsymbol\varepsilon$ have $\mathrm{Cov}(\boldsymbol\varepsilon_j) = \boldsymbol\Sigma = (\sigma_{ik})$, and different rows are uncorrelated.
the meaning of $\mathrm{Cov}(\boldsymbol\varepsilon_{(i)}, \boldsymbol\varepsilon_{(k)}) = \sigma_{ik}\mathbf{I}_n$:
- $\mathbf{I}_n$: observations from different trials are uncorrelated.
- $\sigma_{ik}$: errors for different responses on the same trial are correlated.
$i$th response: $\mathbf{Y}_{(i)} = \mathbf{Z}\boldsymbol\beta_{(i)} + \boldsymbol\varepsilon_{(i)}$, i.e. each response follows its own univariate regression model on the same design matrix $\mathbf{Z}$, with $\mathrm{Cov}(\boldsymbol\varepsilon_{(i)}) = \sigma_{ii}\mathbf{I}_n$ (see the simulation sketch below).
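As a concrete illustration (not part of the original notes), here is a minimal numpy sketch that simulates data from this model; the dimensions, $\boldsymbol\beta$, and $\boldsymbol\Sigma$ below are arbitrary choices. The rows of $\boldsymbol\varepsilon$ are drawn i.i.d. $N_m(\mathbf{0}, \boldsymbol\Sigma)$, so trials are uncorrelated while responses on the same trial are correlated:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, m = 100, 3, 4                      # trials, predictors, responses (all made up)

# design matrix with an intercept column
Z = np.column_stack([np.ones(n), rng.normal(size=(n, r))])   # n x (r+1)
beta = rng.normal(size=(r + 1, m))                            # (r+1) x m coefficients
Sigma = np.array([[1.0, 0.5, 0.3, 0.0],
                  [0.5, 1.0, 0.2, 0.1],
                  [0.3, 0.2, 1.0, 0.4],
                  [0.0, 0.1, 0.4, 1.0]])                      # m x m error covariance

# rows of eps are iid N_m(0, Sigma): trials uncorrelated, responses correlated
eps = rng.multivariate_normal(np.zeros(m), Sigma, size=n)     # n x m
Y = Z @ beta + eps                                            # n x m response matrix
```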
Least Squares
- Collecting the univariate least squares estimates (LSE) column by column gives $\hat{\boldsymbol\beta} = [\hat{\boldsymbol\beta}_{(1)}, \dots, \hat{\boldsymbol\beta}_{(m)}] = (\mathbf{Z}'\mathbf{Z})^{-1}\mathbf{Z}'\mathbf{Y}$.
- Errors (residuals):
- $\hat{\boldsymbol\varepsilon} = \mathbf{Y} - \hat{\mathbf{Y}} = \mathbf{Y} - \mathbf{Z}\hat{\boldsymbol\beta} = [\hat{\boldsymbol\varepsilon}_{(1)}, \dots, \hat{\boldsymbol\varepsilon}_{(m)}]$.
Error Sum of Squares (SSE)
- $\mathrm{SSE} = (\mathbf{Y} - \mathbf{Z}\hat{\boldsymbol\beta})'(\mathbf{Y} - \mathbf{Z}\hat{\boldsymbol\beta}) = \hat{\boldsymbol\varepsilon}'\hat{\boldsymbol\varepsilon}$, an $m \times m$ matrix.
- diagonal elements: the $i$th diagonal entry $\hat{\boldsymbol\varepsilon}_{(i)}'\hat{\boldsymbol\varepsilon}_{(i)}$ is the error SS of the $i$th univariate least squares fit, and each is minimized by $\hat{\boldsymbol\beta}$.
- the generalized variance $\big|(\mathbf{Y} - \mathbf{Z}\hat{\boldsymbol\beta})'(\mathbf{Y} - \mathbf{Z}\hat{\boldsymbol\beta})\big|$ is also minimized by $\hat{\boldsymbol\beta}$ (see the sketch below).
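Continuing the simulated example above (again a sketch, not the course's code), the matrix formula for $\hat{\boldsymbol\beta}$ and the error SS matrix can be computed directly:

```python
# least squares estimates, continuing from the simulated Z and Y above
beta_hat = np.linalg.solve(Z.T @ Z, Z.T @ Y)   # (r+1) x m, i.e. (Z'Z)^{-1} Z'Y
Y_hat = Z @ beta_hat                            # fitted values, n x m
resid = Y - Y_hat                               # residual matrix, n x m
SSE = resid.T @ resid                           # m x m error sum-of-squares matrix

# diagonal of SSE holds the univariate error SS of each response;
# det(SSE) is the (minimized) generalized error variance
print(np.diag(SSE))
print(np.linalg.det(SSE))
```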
Properties
- $\hat{\mathbf{Y}} = \mathbf{Z}\hat{\boldsymbol\beta} = \mathbf{Z}(\mathbf{Z}'\mathbf{Z})^{-1}\mathbf{Z}'\mathbf{Y}$ and $\hat{\boldsymbol\varepsilon} = \big[\mathbf{I} - \mathbf{Z}(\mathbf{Z}'\mathbf{Z})^{-1}\mathbf{Z}'\big]\mathbf{Y}$.
- $\mathbf{Z}'\hat{\boldsymbol\varepsilon} = \mathbf{0}$: the residuals are orthogonal to the columns of $\mathbf{Z}$.
- $\hat{\mathbf{Y}}'\hat{\boldsymbol\varepsilon} = \mathbf{0}$: the residuals are orthogonal to the fitted values.
- consequently $\mathbf{Y}'\mathbf{Y} = \hat{\mathbf{Y}}'\hat{\mathbf{Y}} + \hat{\boldsymbol\varepsilon}'\hat{\boldsymbol\varepsilon}$ (total SS = predicted SS + error SS, now as $m \times m$ matrices).
Error Sum of Squares
Result 1
- $E(\hat{\boldsymbol\beta}) = \boldsymbol\beta$ and $\mathrm{Cov}(\hat{\boldsymbol\beta}_{(i)}, \hat{\boldsymbol\beta}_{(k)}) = \sigma_{ik}(\mathbf{Z}'\mathbf{Z})^{-1}$, $i, k = 1, \dots, m$.
- here, $\hat{\boldsymbol\beta}_{(i)}$ and $\hat{\boldsymbol\beta}_{(k)}$ are correlated (unless $\sigma_{ik} = 0$).
- $E(\hat{\boldsymbol\varepsilon}) = \mathbf{0}$ and $E(\hat{\boldsymbol\varepsilon}'\hat{\boldsymbol\varepsilon}) = (n - r - 1)\boldsymbol\Sigma$, so $\hat{\boldsymbol\varepsilon}'\hat{\boldsymbol\varepsilon}/(n - r - 1)$ is an unbiased estimator of $\boldsymbol\Sigma$.
Result 2
- If $\boldsymbol\varepsilon$ has a multivariate normal distribution (rows i.i.d. $N_m(\mathbf{0}, \boldsymbol\Sigma)$), then $\hat{\boldsymbol\beta} = (\mathbf{Z}'\mathbf{Z})^{-1}\mathbf{Z}'\mathbf{Y}$ is the MLE of $\boldsymbol\beta$.
- $\hat{\boldsymbol\beta}$ is normally distributed with $E(\hat{\boldsymbol\beta}) = \boldsymbol\beta$ and $\mathrm{Cov}(\hat{\boldsymbol\beta}_{(i)}, \hat{\boldsymbol\beta}_{(k)}) = \sigma_{ik}(\mathbf{Z}'\mathbf{Z})^{-1}$.
- $\hat{\boldsymbol\Sigma} = \dfrac{1}{n}\hat{\boldsymbol\varepsilon}'\hat{\boldsymbol\varepsilon} = \dfrac{1}{n}(\mathbf{Y} - \mathbf{Z}\hat{\boldsymbol\beta})'(\mathbf{Y} - \mathbf{Z}\hat{\boldsymbol\beta})$ is the MLE of $\boldsymbol\Sigma$.
- $n\hat{\boldsymbol\Sigma} \sim W_{n-r-1}(\boldsymbol\Sigma)$ (Wishart with $n - r - 1$ d.f.), independently of $\hat{\boldsymbol\beta}$ (see the Monte Carlo sketch below).
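A small Monte Carlo sketch (my own illustration, reusing the simulated $\mathbf{Z}$, $\boldsymbol\beta$, $\boldsymbol\Sigma$ from above) of the point made in Results 1 and 2: $\hat{\boldsymbol\varepsilon}'\hat{\boldsymbol\varepsilon}/(n-r-1)$ is unbiased for $\boldsymbol\Sigma$, while the MLE divides by $n$ and is biased downward:

```python
# Monte Carlo comparison of the two Sigma estimators (hypothetical simulation)
reps = 2000
S_unbiased = np.zeros((m, m))
S_mle = np.zeros((m, m))
for _ in range(reps):
    eps_r = rng.multivariate_normal(np.zeros(m), Sigma, size=n)
    Y_r = Z @ beta + eps_r
    B_r = np.linalg.solve(Z.T @ Z, Z.T @ Y_r)
    E_r = (Y_r - Z @ B_r).T @ (Y_r - Z @ B_r)     # error SS matrix for this replicate
    S_unbiased += E_r / (n - r - 1) / reps
    S_mle += E_r / n / reps

# S_unbiased should be close to Sigma; S_mle is smaller by the factor (n - r - 1) / n
print(np.round(S_unbiased - Sigma, 3))
print(np.round(S_mle - Sigma, 3))
```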
Comment
- Multivariate regression introduces no new computational problems.
- Univariate least squares estimates are computed individually for each response variable.
- Diagnostic checks must be done as in univariate regression.
- Residual vectors can be examined for multivariate normality (see the sketch below).
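A sketch of both comments (assuming scipy and the objects from the earlier snippets; the normality check shown is one common informal diagnostic, not necessarily the one used in class): the columns of $\hat{\boldsymbol\beta}$ coincide with the per-response univariate fits, and the residual rows can be screened for multivariate normality through their squared Mahalanobis distances:

```python
from scipy import stats

# each column of beta_hat is just the univariate OLS fit of that response on Z
for i in range(m):
    b_i, *_ = np.linalg.lstsq(Z, Y[:, i], rcond=None)
    assert np.allclose(beta_hat[:, i], b_i)

# rough multivariate-normality check on the residual vectors:
# squared Mahalanobis distances should look approximately chi-square(m)
S = resid.T @ resid / (n - r - 1)                       # unbiased estimate of Sigma
d2 = np.sum(resid @ np.linalg.inv(S) * resid, axis=1)   # row-wise quadratic forms
chi2_q = stats.chi2.ppf((np.arange(1, n + 1) - 0.5) / n, df=m)
print(np.corrcoef(np.sort(d2), chi2_q)[0, 1])           # near 1 if roughly chi-square
```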
Hypothesis Testing
- Note: to test whether a subset of predictors is needed for any of the responses, partition $\mathbf{Z} = [\mathbf{Z}_1 \;\; \mathbf{Z}_2]$ ($\mathbf{Z}_1: n \times (q+1)$, $\mathbf{Z}_2: n \times (r-q)$) and $\boldsymbol\beta = \begin{bmatrix}\boldsymbol\beta_1 \\ \boldsymbol\beta_2\end{bmatrix}$, and test $H_0: \boldsymbol\beta_2 = \mathbf{0}$ (the last $r-q$ predictors have no effect on any response).
Full Model vs. Reduced Model
let FM (full model): $\mathbf{Y} = \mathbf{Z}\boldsymbol\beta + \boldsymbol\varepsilon = \mathbf{Z}_1\boldsymbol\beta_1 + \mathbf{Z}_2\boldsymbol\beta_2 + \boldsymbol\varepsilon$, then $n\hat{\boldsymbol\Sigma} = (\mathbf{Y} - \mathbf{Z}\hat{\boldsymbol\beta})'(\mathbf{Y} - \mathbf{Z}\hat{\boldsymbol\beta})$.
under $H_0: \boldsymbol\beta_2 = \mathbf{0}$, RM (reduced model): $\mathbf{Y} = \mathbf{Z}_1\boldsymbol\beta_1 + \boldsymbol\varepsilon$, with $n\hat{\boldsymbol\Sigma}_1 = (\mathbf{Y} - \mathbf{Z}_1\hat{\boldsymbol\beta}_1)'(\mathbf{Y} - \mathbf{Z}_1\hat{\boldsymbol\beta}_1)$.
let $\mathbf{E} = n\hat{\boldsymbol\Sigma}$ (error matrix of the FM) and $\mathbf{H} = n(\hat{\boldsymbol\Sigma}_1 - \hat{\boldsymbol\Sigma})$ (extra error SS due to $H_0$).
- Here $\mathbf{E}$ is the error matrix: the fit is just the univariate regression repeated once per response (four times in the example below), and collecting those errors gives exactly this matrix $\mathbf{E}$.
let $\eta_1 \ge \eta_2 \ge \cdots \ge \eta_s > 0$ be the non-zero eigenvalues of $\mathbf{H}\mathbf{E}^{-1}$, $s = \min(m, r - q)$.
- Four test statistics (see the helper sketch after this list):
- Wilks' Lambda: $\Lambda = \dfrac{|\mathbf{E}|}{|\mathbf{E} + \mathbf{H}|} = \prod_{i=1}^{s} \dfrac{1}{1 + \eta_i}$
- Pillai Trace: $\mathrm{tr}\big[\mathbf{H}(\mathbf{H} + \mathbf{E})^{-1}\big] = \sum_{i=1}^{s} \dfrac{\eta_i}{1 + \eta_i}$
- Lawley-Hotelling Trace: $\mathrm{tr}\big[\mathbf{H}\mathbf{E}^{-1}\big] = \sum_{i=1}^{s} \eta_i$
- Roy's Largest Root: the maximum eigenvalue of $\mathbf{H}(\mathbf{H} + \mathbf{E})^{-1}$, i.e. $\dfrac{\eta_1}{1 + \eta_1}$.
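A small helper (hypothetical, not from the notes) that computes all four statistics from $\mathbf{E}$ and $\mathbf{H}$ via the eigenvalues $\eta_i$:

```python
import numpy as np

def manova_stats(E, H):
    """Four MANOVA-type test statistics from error matrix E and hypothesis matrix H."""
    eta = np.linalg.eigvals(H @ np.linalg.inv(E)).real   # eigenvalues of H E^{-1}
    eta = np.clip(eta, 0.0, None)                        # guard tiny negative round-off
    return {
        "wilks_lambda":     float(np.prod(1.0 / (1.0 + eta))),      # |E| / |E + H|
        "pillai_trace":     float(np.sum(eta / (1.0 + eta))),
        "lawley_hotelling": float(np.sum(eta)),
        "roys_largest":     float(np.max(eta) / (1.0 + np.max(eta))),
    }
```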
Example)
fit FM: $\mathbf{Y} = \mathbf{Z}\boldsymbol\beta + \boldsymbol\varepsilon$ ($m = 4$ responses).
fit the FM, then we acquire $\mathbf{E} = n\hat{\boldsymbol\Sigma} = (\mathbf{Y} - \mathbf{Z}\hat{\boldsymbol\beta})'(\mathbf{Y} - \mathbf{Z}\hat{\boldsymbol\beta})$.
1. $H_0: \begin{bmatrix} \beta_{31} & \beta_{32} & \beta_{33} & \beta_{34} \end{bmatrix} = \mathbf{0}$,
under $H_0$, the reduced model is $\mathbf{Y} = \mathbf{Z}_1\boldsymbol\beta_1 + \boldsymbol\varepsilon$.
now, fit the RM ($X_2, X_3$ excluded), then we acquire $\mathbf{E}_1 = n\hat{\boldsymbol\Sigma}_1 = (\mathbf{Y} - \mathbf{Z}_1\hat{\boldsymbol\beta}_1)'(\mathbf{Y} - \mathbf{Z}_1\hat{\boldsymbol\beta}_1)$.
let's calculate the eigenvalues of $\mathbf{H}\mathbf{E}^{-1}$, where $\mathbf{H} = \mathbf{E}_1 - \mathbf{E}$, and compute Wilks' Lambda $\Lambda = \prod_{i}\frac{1}{1+\eta_i} = \frac{|\mathbf{E}|}{|\mathbf{E}_1|}$ (see the sketch below).
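A hedged sketch of the same recipe on the simulated data from earlier (the actual example uses the course data, which is not reproduced here; the columns dropped below are arbitrary):

```python
# hypothetical illustration: test whether the last two predictor columns of Z are
# needed, using the simulated Z and Y from the earlier sketches
q_cols = [0, 1]                                   # columns kept under H0 (intercept, z_1)
Z1 = Z[:, q_cols]

B_full = np.linalg.solve(Z.T @ Z, Z.T @ Y)        # full-model LSE
B_red = np.linalg.solve(Z1.T @ Z1, Z1.T @ Y)      # reduced-model LSE

E = (Y - Z @ B_full).T @ (Y - Z @ B_full)         # n * Sigma_hat   (full model)
E1 = (Y - Z1 @ B_red).T @ (Y - Z1 @ B_red)        # n * Sigma_hat_1 (reduced model)
H = E1 - E                                        # extra error SS due to H0

stats_ = manova_stats(E, H)                       # helper defined above
wilks = stats_["wilks_lambda"]                    # equals det(E) / det(E1)
```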
Sampling Distribution of Wilks' Lambda
let $\mathbf{Z}$ be of full rank $r + 1$, and $(r + 1) + m \le n$.
let the errors $\boldsymbol\varepsilon$ be normally distributed.
under $H_0$, for large $n$, $-\left[n - r - 1 - \tfrac{1}{2}(m - r + q + 1)\right]\ln\Lambda = -\left[n - r - 1 - \tfrac{1}{2}(m - r + q + 1)\right]\ln\!\left(\dfrac{|n\hat{\boldsymbol\Sigma}|}{|n\hat{\boldsymbol\Sigma}_1|}\right)$ is approximately $\chi^2_{m(r-q)}$ (see the sketch below).
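Continuing the sketch, the approximate chi-square test could be carried out as follows (the values of $r$, $q$, $m$ match the simulated example above, not the course data):

```python
from scipy.stats import chi2

# Bartlett-corrected Wilks' Lambda and its approximate chi-square p-value
r_, q_, m_ = 3, 1, 4                                             # hypothetical dimensions
bartlett = -(n - r_ - 1 - 0.5 * (m_ - r_ + q_ + 1)) * np.log(wilks)
p_value = chi2.sf(bartlett, df=m_ * (r_ - q_))
print(bartlett, p_value)
```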
Prediction
assume fixed values $\mathbf{z}_0' = [1, z_{01}, \dots, z_{0r}]$ of the predictor variables. then $\mathbf{z}_0'\hat{\boldsymbol\beta}$ estimates the mean responses $\mathbf{z}_0'\boldsymbol\beta$ and forecasts a new observation $\mathbf{Y}_0'$.
- simultaneous $100(1-\alpha)\%$ CIs for the mean responses $\mathbf{z}_0'\boldsymbol\beta_{(i)}$:
- $\mathbf{z}_0'\hat{\boldsymbol\beta}_{(i)} \pm \sqrt{\dfrac{m(n-r-1)}{n-r-m}F_{m,\,n-r-m}(\alpha)}\;\sqrt{\mathbf{z}_0'(\mathbf{Z}'\mathbf{Z})^{-1}\mathbf{z}_0\left(\dfrac{n}{n-r-1}\hat\sigma_{ii}\right)}$, $i = 1, \dots, m$, with $F_{m,\,n-r-m}(\alpha)$ the upper $\alpha$ quantile,
- where $\hat{\boldsymbol\beta}_{(i)}$ is the $i$th column of $\hat{\boldsymbol\beta}$,
- and $\hat\sigma_{ii}$ is the $i$th diagonal element of $\hat{\boldsymbol\Sigma}$.
- simultaneous $100(1-\alpha)\%$ prediction intervals for the individual responses $Y_{0i}$:
- $\mathbf{z}_0'\hat{\boldsymbol\beta}_{(i)} \pm \sqrt{\dfrac{m(n-r-1)}{n-r-m}F_{m,\,n-r-m}(\alpha)}\;\sqrt{\left(1 + \mathbf{z}_0'(\mathbf{Z}'\mathbf{Z})^{-1}\mathbf{z}_0\right)\left(\dfrac{n}{n-r-1}\hat\sigma_{ii}\right)}$, $i = 1, \dots, m$ (see the sketch below).
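A sketch of both interval formulas (assuming scipy and the simulated objects from the earlier snippets; `z0` is an arbitrary new predictor vector):

```python
from scipy.stats import f as f_dist

# simultaneous intervals at a new predictor value z0, reusing beta_hat, resid, Z, n, r, m
alpha = 0.05
z0 = np.array([1.0, 0.5, -1.0, 2.0])                     # includes the intercept 1

Sigma_hat = resid.T @ resid / n                           # MLE of Sigma
c2 = (m * (n - r - 1) / (n - r - m)) * f_dist.ppf(1 - alpha, m, n - r - m)
leverage = z0 @ np.linalg.solve(Z.T @ Z, z0)              # z0'(Z'Z)^{-1} z0
center = z0 @ beta_hat                                    # length-m point estimates
scale = (n / (n - r - 1)) * np.diag(Sigma_hat)            # (n/(n-r-1)) * sigma_hat_ii

mean_half = np.sqrt(c2) * np.sqrt(leverage * scale)       # CI half-widths, mean responses
pred_half = np.sqrt(c2) * np.sqrt((1 + leverage) * scale) # PI half-widths, new responses
mean_ci = np.stack([center - mean_half, center + mean_half])
pred_pi = np.stack([center - pred_half, center + pred_half])
```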