Readings: Christensen Chapter 15 and Hoff Chapter 9
In this lecture we look at ridge regression as an approach to deal with multicollinearity and the instability of OLS/MLEs when eigenvalues of $X^TX$ approach zero. Using a rotation of the data, we will show how this relates to regression on Principal Components. Finally, we will describe how ridge regression can be motivated as a constrained optimization problem or as a Bayesian estimator.
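As a quick numerical illustration (a sketch not taken from the readings), the snippet below builds two nearly collinear predictors, so the smallest eigenvalue of $X^TX$ is close to zero, and compares the OLS coefficients with ridge coefficients computed from $(X^TX + \lambda I)^{-1}X^Ty$ for an arbitrarily chosen penalty $\lambda = 1$; the data, sample size, and penalty value are all hypothetical choices made only for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two nearly collinear predictors: x2 is x1 plus tiny noise,
# so the smallest eigenvalue of X^T X is close to zero.
n = 50
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=1e-3, size=n)
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(scale=0.5, size=n)

# Eigenvalues of X^T X: one is near zero, signalling instability of OLS.
print("eigenvalues:", np.linalg.eigvalsh(X.T @ X))

# OLS estimate: (X^T X)^{-1} X^T y -- coefficients blow up under collinearity.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge estimate with penalty lambda: (X^T X + lambda I)^{-1} X^T y.
lam = 1.0  # hypothetical penalty chosen for illustration
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

print("OLS:  ", beta_ols)
print("Ridge:", beta_ridge)
```

Running this, the OLS coefficients are typically very large in magnitude and nearly cancel each other, while the ridge coefficients stay near the values used to generate the data, previewing the shrinkage behavior developed later in the lecture.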