STA 211: The Mathematics of Regression

STA 211 presents the mathematics underpinning linear and generalized linear models, some of the most common techniques in applied statistics. We will see how matrix algebra enables a practical representation of regression models, helps us understand how and why regression works, and makes maximum likelihood estimation of model parameters convenient.
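As a rough preview of the matrix formulation (a sketch using standard notation that we will define carefully in class, and assuming the design matrix has full column rank), the linear model and its ordinary least squares estimator can be written as

\[
y = X\beta + \varepsilon, \qquad \mathbb{E}[\varepsilon] = 0, \qquad \operatorname{Var}(\varepsilon) = \sigma^{2} I_n,
\]
\[
\hat{\beta} = (X^{\top} X)^{-1} X^{\top} y.
\]

Under the additional assumption of normally distributed errors, this same \(\hat{\beta}\) is also the maximum likelihood estimator, which is one reason the matrix formulation makes estimation so convenient.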

In STA 211, we emphasize mathematics and theory over applied data analysis, although we continually connect the theory to topics in data analysis.

This course meets once per week, on Mondays from 3:05 to 4:20 pm, in Old Chemistry 116.

Please check: syllabus, schedule, course support, and important dates.

Other tools we will use: Gradescope for homework submission, Ed Discussions for course questions, and Canvas for checking important announcements and grades.

Miscellaneous

The image is from a first edition printing of Carl Friedrich Gauss' work Theoria combinationis observationum erroribus minimis obnoxiae. In particular, it displays one of the concluding sections of the first half of the work, in which Gauss establishes the conditions under which the ordinary least squares estimator achieves the minimum variance among all possible linear unbiased estimators. This is the Gauss-Markov theorem, which, coincidentally, we will derive and study as the conclusion to the first half of our semester.
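As a preview, one standard statement of the theorem (under assumptions we will develop in class, including a full-column-rank design matrix) is: if

\[
y = X\beta + \varepsilon, \qquad \mathbb{E}[\varepsilon] = 0, \qquad \operatorname{Var}(\varepsilon) = \sigma^{2} I_n,
\]

then for any other linear unbiased estimator \(\tilde{\beta} = Ay\) of \(\beta\), the difference \(\operatorname{Var}(\tilde{\beta}) - \operatorname{Var}(\hat{\beta}_{\mathrm{OLS}})\) is positive semidefinite; in this sense, ordinary least squares is the best linear unbiased estimator (BLUE).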
