Multivariate Analysis
Regression: Matrix Formulation


Matrix Formulation of Multiple Regression

Y: Vector of Criterion Scores

X: Augmented Raw Score Matrix (the raw predictor scores preceded by a column of ones to carry the intercept)

b: Vector of Regression Coefficients (its first element is the intercept)

e: Vector of Residuals

Computing b

The least-squares solution for b is relatively easy to derive:

Y = Xb + e [The matrix form of the regression equation]

X'Y = X'(Xb + e) [Premultiply each side by X']

X'Y = X'Xb + X'e [Distribute X']

It can be shown that X'e is always 0, because the least-squares residuals are orthogonal to (uncorrelated with) each column of X, thus:

X'Y = X'Xb

(X'X)^-1X'Y = (X'X)^-1X'Xb [Now premultiply both sides by (X'X)^-1]

(X'X)^-1X'Y = Ib [Since (X'X)^-1X'X = I]

(X'X)^-1X'Y = b [Simplify]
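
As a numerical check on this result, here is a minimal NumPy sketch (the data values below are made up purely for illustration) that builds the augmented X and solves the normal equations X'Xb = X'Y for b:

import numpy as np

# Hypothetical data: 6 cases, 2 predictors (values made up for illustration)
y = np.array([3., 7., 8., 12., 11., 14.])
X = np.column_stack([np.ones(6),                 # column of ones for the intercept
                     [2., 4., 5., 7., 8., 10.],  # predictor 1
                     [1., 3., 2., 5., 4., 6.]])  # predictor 2

# b = (X'X)^-1 X'Y, computed by solving the linear system X'Xb = X'Y
b = np.linalg.solve(X.T @ X, X.T @ y)
print(b)  # [intercept, b1, b2]

Solving the linear system directly is numerically preferable to forming (X'X)^-1 explicitly, but both give the same b.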

Computing Predicted Scores

Y-hat = Xb

Computing e

e = Y - Y-hat = Y - Xb

Computing the Squared Multiple Correlation

R^2 = b'X'Y / Y'Y

Note: Do not use the augmented X; x's and y's must be in deviation score form.
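
Continuing the same made-up example, the sketch below computes the predicted scores, the residuals, and R^2 from deviation scores (the unit column is dropped before centering, per the note above):

import numpy as np

# Same hypothetical data as in the earlier sketch
y = np.array([3., 7., 8., 12., 11., 14.])
X = np.column_stack([np.ones(6), [2., 4., 5., 7., 8., 10.], [1., 3., 2., 5., 4., 6.]])
b = np.linalg.solve(X.T @ X, X.T @ y)

yhat = X @ b   # predicted scores: Y-hat = Xb
e = y - yhat   # residuals: e = Y - Xb

# R^2 from deviation scores: drop the unit column, center X and Y
Xd = X[:, 1:] - X[:, 1:].mean(axis=0)
yd = y - y.mean()
bd = np.linalg.solve(Xd.T @ Xd, Xd.T @ yd)
R2 = (bd @ Xd.T @ yd) / (yd @ yd)  # R^2 = b'X'Y / Y'Y in deviation form
print(R2)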

Partitioning the Sums of Squares

Total Sum of Squares

SS_total = Y'Y - (sum Y)^2/n

Regression Sum of Squares

SS_regression = b'X'Y - (sum Y)^2/n

Residual Sum of Squares

SS_residual = e'e = Y'Y - b'X'Y

SS_total = SS_regression + SS_residual
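
With the same made-up numbers, the partition can be verified directly; the names below follow the formulas above:

import numpy as np

y = np.array([3., 7., 8., 12., 11., 14.])
X = np.column_stack([np.ones(6), [2., 4., 5., 7., 8., 10.], [1., 3., 2., 5., 4., 6.]])
b = np.linalg.solve(X.T @ X, X.T @ y)
n = len(y)
e = y - X @ b

ss_total = y @ y - y.sum()**2 / n        # Y'Y - (sum Y)^2 / n
ss_reg   = b @ X.T @ y - y.sum()**2 / n  # b'X'Y - (sum Y)^2 / n
ss_res   = e @ e                         # e'e = Y'Y - b'X'Y
print(np.isclose(ss_total, ss_reg + ss_res))  # True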

Covariance Matrix of the Regression Coefficients

C = MSE(X'X)^-1, where MSE = SS_residual/(n - k - 1) and k is the number of predictors

Standard Errors of the Regression Coefficients

The standard error of each regression coefficient is the square root of the corresponding diagonal element of C.
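
A sketch of the corresponding computation, again with the made-up data (n = 6 cases, k = 2 predictors):

import numpy as np

y = np.array([3., 7., 8., 12., 11., 14.])
X = np.column_stack([np.ones(6), [2., 4., 5., 7., 8., 10.], [1., 3., 2., 5., 4., 6.]])
n, k = X.shape[0], X.shape[1] - 1   # k = number of predictors
b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b

mse = (e @ e) / (n - k - 1)         # mean square error
C = mse * np.linalg.inv(X.T @ X)    # covariance matrix of the coefficients
se = np.sqrt(np.diag(C))            # standard errors of [intercept, b1, b2]
print(se)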

Standardized Regression Coefficients

b*_j = b_j(s_xj/s_y), where s_xj and s_y are the standard deviations of predictor j and of Y
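
In code, the standardization amounts to rescaling each slope by the ratio of standard deviations; the intercept drops out of the standardized solution:

import numpy as np

y = np.array([3., 7., 8., 12., 11., 14.])
X = np.column_stack([np.ones(6), [2., 4., 5., 7., 8., 10.], [1., 3., 2., 5., 4., 6.]])
b = np.linalg.solve(X.T @ X, X.T @ y)

s_x = X[:, 1:].std(ddof=1, axis=0)  # sd of each predictor
s_y = y.std(ddof=1)                 # sd of the criterion
beta = b[1:] * s_x / s_y            # b*_j = b_j (s_xj / s_y)
print(beta)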

Computing R-squared

R^2 = SS_regression/SS_total
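
Finally, the ratio form of R^2 can be checked against the deviation-score computation earlier; with the same made-up data, both give the same value:

import numpy as np

y = np.array([3., 7., 8., 12., 11., 14.])
X = np.column_stack([np.ones(6), [2., 4., 5., 7., 8., 10.], [1., 3., 2., 5., 4., 6.]])
b = np.linalg.solve(X.T @ X, X.T @ y)
n = len(y)

ss_total = y @ y - y.sum()**2 / n
ss_reg   = b @ X.T @ y - y.sum()**2 / n
print(ss_reg / ss_total)  # R^2 = SS_regression / SS_total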



Phil Ender, 3Oct07, 30Jun98