This is a rough schedule for the course and will be updated regularly, so please check it frequently for adjustments. Announcements will be posted here and made in class; you are responsible for keeping up with all class and web announcements for the course. Read along in Hoff before coming to class.
Slides and notes for this class are based on a variety of references as well as notes that I have written.
The course syllabus can be found here: Syllabus (things you need to know about the course!)
Your lab schedule and homeworks will all be posted on Sakai (and submissions should be made on Sakai as well). Expect one homework per week. Yes, we have lab the first week of class.
Supplementary reading
I have written both undergraduate and graduate level notes. Please feel free to use these to complement Hoff as needed. Please do watch out for typos!
Some of Bayesian Methods: The Essential Parts (Graduate Level), Author: Rebecca C. Steorts
Note: Chapter 5 has typos that I have not had time to fix, and some parts are not as clear as I would like. Nevertheless, it should give you some extra examples and explanations different from Hoff.
Baby Bayes using R, Author: Rebecca C. Steorts
This material was meant for undergraduate students as a cross-disciplinary introduction to Bayesian methods, without assuming any knowledge of calculus beyond the fact that a density integrates to 1. If you're having trouble with Hoff, either as an undergraduate or graduate student, consider reading parts of this. It also contains an introduction to probability and statistics (akin to Ch 2 in Hoff). I will assume that you know this material; it is all fair game for exams.
Lecture notes
Module 0: Course Expectations and an Introduction to R
Module 1: An introduction to Bayesian methods
- Module 1 Slides
- Read Ch 1 and Ch 2.1--2.6 (Hoff).
- Read Ch 1.1, 2.5--2.7, and 2.9 of "Some of Bayesian Methods".
- Read Ch 4 for predictive inference (Hoff).
Module 2: An introduction to Decision Theory
- Module 2 Slides
- Read Ch 2.1--2.4 of "Some of Bayesian Methods". This is not covered in Hoff.
Module 3: An introduction to the Normal-Normal Model
- Module 3 Slides
- Read Ch 2, Examples 2.7 and 2.8 (for the variance derivations), of "Some of Bayesian Methods".
Module 4: An introduction to the Normal-Gamma Model
Module 5: An introduction to Monte Carlo
- Module 5 Slides
- Read Hoff, Chapter 4.
- Read Ch 5.1 and 5.3 of "Some of Bayesian Methods".
Remark: The slides cover some examples that are not in Hoff or the notes. A short Monte Carlo sketch in R follows this module's readings.
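To make the Monte Carlo idea concrete, here is a minimal R sketch (my own illustration, not taken from Hoff or the slides): it approximates posterior summaries for a Beta-Binomial model by simulating from the posterior and compares the estimated posterior mean to the exact value. The data values y, n and the prior parameters a, b are made up for illustration.

```r
# Minimal Monte Carlo sketch (illustration only): approximate posterior
# summaries for a Beta-Binomial model by simulating from the posterior,
# then compare against the exact values.
set.seed(42)

# Assumed toy data: y successes out of n trials, with a Beta(a, b) prior.
y <- 12; n <- 20
a <- 1;  b <- 1

S <- 10000                              # number of Monte Carlo draws
theta <- rbeta(S, a + y, b + n - y)     # draws from the Beta posterior

mean(theta)                             # Monte Carlo estimate of E(theta | y)
(a + y) / (a + b + n)                   # exact posterior mean, for comparison
quantile(theta, c(0.025, 0.975))        # Monte Carlo 95% credible interval
mean(theta > 0.5)                       # Pr(theta > 0.5 | y), estimated by simulation
```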
Module 6: An introduction to the Metropolis Algorithm
- Module 6 Slides -- Metropolis
- The readings below cover both the Metropolis algorithm and Gibbs sampling (a short R sketch of the Metropolis algorithm follows these readings).
- Read Hoff, Ch 6.
- Read Ch 5.2 of "Some of Bayesian Methods".
- For the Metropolis algorithm, read Hoff 10.2.
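As a rough companion to the Hoff 10.2 reading, here is a minimal random-walk Metropolis sketch in R (an illustration, not the course's code): it samples the mean of a normal likelihood under a normal prior, using a symmetric normal proposal and a log acceptance ratio. The data, prior parameters, and proposal standard deviation delta are all assumed for illustration.

```r
# Minimal random-walk Metropolis sketch (illustration only): sample the mean
# theta of a Normal(theta, 1) likelihood with a Normal(0, 10^2) prior.
set.seed(1)

y <- rnorm(30, mean = 2, sd = 1)     # assumed toy data

# Log posterior up to a constant: log likelihood + log prior
log_post <- function(theta) {
  sum(dnorm(y, mean = theta, sd = 1, log = TRUE)) +
    dnorm(theta, mean = 0, sd = 10, log = TRUE)
}

S <- 5000
theta <- numeric(S)
theta[1] <- 0                        # starting value
delta <- 0.5                         # proposal standard deviation (tuning parameter)

for (s in 2:S) {
  prop  <- rnorm(1, mean = theta[s - 1], sd = delta)     # symmetric proposal
  log_r <- log_post(prop) - log_post(theta[s - 1])       # log acceptance ratio
  theta[s] <- if (log(runif(1)) < log_r) prop else theta[s - 1]
}

mean(theta[-(1:1000)])               # posterior mean estimate after burn-in
```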
Module 7: An introduction to Gibbs Sampling
Module 8: Gibbs Sampling and Data Augmentation
- Module 8 Slides -- Gibbs Sampling and Data Augmentation
- For the most part, the material covered in class is not in "Some of Bayesian Methods".
- Gibbs reading: you should have already read Ch 6 (Hoff), so review as needed. A short Gibbs sampling sketch in R follows these readings.
- Metropolis-Hastings: 10.4 and 10.5 (Hoff)
- Latent variable allocation: Chapter 12 (Hoff)
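For a concrete reference point, here is a minimal Gibbs sampling sketch in R (an illustration, not the course's code), loosely following the semiconjugate normal model in Hoff Ch 6: it alternates draws from the full conditionals of the mean and the variance. The toy data and prior parameters (mu0, tau20, nu0, s20) are assumed for illustration.

```r
# Minimal Gibbs sampling sketch (illustration only): semiconjugate normal model
# with unknown mean theta and variance sigma2, alternating draws from the two
# full conditionals.
set.seed(7)

y <- rnorm(25, mean = 5, sd = 2)     # assumed toy data
n <- length(y); ybar <- mean(y)

# Assumed priors: theta ~ N(mu0, tau20), 1/sigma2 ~ Gamma(nu0/2, nu0*s20/2)
mu0 <- 0; tau20 <- 100
nu0 <- 1; s20   <- 1

S <- 5000
theta <- numeric(S); sigma2 <- numeric(S)
sigma2[1] <- var(y)                  # starting value

for (s in 2:S) {
  # full conditional of theta given sigma2
  tau2n <- 1 / (1 / tau20 + n / sigma2[s - 1])
  mun   <- tau2n * (mu0 / tau20 + n * ybar / sigma2[s - 1])
  theta[s] <- rnorm(1, mun, sqrt(tau2n))

  # full conditional of 1/sigma2 given theta
  nun <- nu0 + n
  s2n <- (nu0 * s20 + sum((y - theta[s])^2)) / nun
  sigma2[s] <- 1 / rgamma(1, nun / 2, nun * s2n / 2)
}

mean(theta[-(1:1000)]); mean(sigma2[-(1:1000)])   # posterior means after burn-in
```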
Module 9: The Multivariate Normal Distribution
Module 10: The Multinomial-Dirichlet Distribution
Module 11: Linear Regression
Additional readings:
- Credible intervals: covered on pages 52 and 267 of Hoff.
- Read Ch 4.1 (credible intervals) in "Some of Bayesian Methods".