STA-215 Statistical Inference
Spring 98
STA 215 STATISTICAL INFERENCE. Credits: 1.00, Hours: 3.0. Areas: QR.
Section 01. Call No. 138935. Limit: 18. Instructor: MUELLER, P.
MWF, 11:50 AM - 12:40 PM in Old Chemistry, room 025.
STA 213 or equivalent is a prerequisite for this class.
If you are in doubt, please take the self test.
Instructor
Peter Müller;
office: 219 Old Chemistry Bldg.;
office hours: Friday from 2:00 to 3:00 and by appointment;
e-mail: pm@stat.duke.edu; telephone: 684-3437.
TA:
Heidi Ashih;
office: 222 Old Chemistry Bldg.;
e-mail: heidi@stat.duke.edu;
phone: 684-8840.
Midterms and Problems
Two midterms. Midterms will be open book and open notes.
Weekly problem sets: around 10 problems. Group work is ok and encouraged.
Solutions and hints will be posted regularly.
Exercises: occasional exercises to prepare class discussion for the
next lecture. Exercises will not be collected.
Computing
Some problems will require statistical computing.
Splus
is recommended. Any other software (Gauss, Xlisp-Stat, etc.) is ok.
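To give a sense of the kind of computing involved, here is a minimal sketch in Splus syntax (also valid in R) that simulates draws from the posterior of a binomial success probability under a Beta(1,1) prior; the data values are made up purely for illustration and are not taken from any course problem.

  y <- 7; n <- 20                          # made-up data: 7 successes in 20 trials
  a <- 1; b <- 1                           # Beta(1,1) prior, i.e. uniform on (0,1)
  theta <- rbeta(5000, a + y, b + n - y)   # posterior is Beta(a+y, b+n-y); draw 5000 samples
  mean(theta)                              # Monte Carlo estimate of the posterior mean
  quantile(theta, c(0.025, 0.975))         # approximate 95% posterior interval

Any comparable simulation in Gauss or Xlisp-Stat would serve the same purpose.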
Grading
Grading will be based on homework problems (30%), midterms (30% each) and
class participation (10%).
Textbooks
- GCSR: Gelman, Carlin, Stern and Rubin, Bayesian Data Analysis.
Main course text. Most problem sets will come from GCSR.
- Spector: An Introduction to S and S-Plus. Reference for Splus,
optional course text.
Reading List
- GCSR: Gelman, Carlin, Stern and Rubin, Bayesian Data Analysis.
Main course text. Most problem sets will come from GCSR.
- BS: Bernardo and Smith, Bayesian Theory.
Alternative to GCSR. More formal presentation.
We will use Sections 4.2-4.5 and Chapter 6.
- OH: O'Hagan, Bayesian Inference. Alternative to GCSR.
- B: Berger, Statistical Decision Theory and Bayesian Analysis.
Some decision theoretic foundations (Chapters 1 and 2).
- R: Robert, The Bayesian Choice.
Alternative to GCSR. More decision theoretic.
- BD: Bickel and Doksum, Mathematical Statistics.
Frequentist optimality criteria (Ch 4), hypothesis testing (Ch 6).
- Casella and Berger, Statistical Inference.
Alternative to BD.
- Berger and Wolpert, The Likelihood Principle.
If you want to read more on statistics beyond this course, see the
Duke Statistics reading list.
Part I: Basic Concepts
- Introduction:
Prior, likelihood, posterior (GCSR Ch 1 and 2); see the short reminder after this list.
- Foundations: The Bayesian paradigm (B Ch 1).
Likelihood principle and sufficiency principle.
- Multiparameter models:
Marginal posteriors, mv normal, multinomial (GCSR Ch 3).
- Analytic posterior approximations (GCSR Ch 4 and 9).
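As a quick reminder of the relation behind the Introduction topic above (an illustration, not a course formula), writing y for the data and theta for the parameter:

  p(\theta \mid y) \propto p(y \mid \theta)\, p(\theta)    (posterior ∝ likelihood × prior)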
Part II: More Modelling
- Hierarchical models (GCSR Ch 5).
Empirical Bayes (B Ch. 4.5).
- Model checking and model comparison (GCSR Ch 6; BS 6).
- Regression (GCSR Ch 8 and 12).
- Generalized linear models (GCSR Ch 14).
- General principles:
Exchangeability (BS 4.2); invariance (BS 4.3); sufficiency (BS 4.4);
nonidentifiability; sequential decisions.
Part III: Comparison of classical, likelihood and Bayesian approaches
- Decision theoretic foundations (R Ch 2, 6).
Utility and loss; Admissibility; Admissibility of Bayes estimators.
- P-values and Bayes Factors
Significance testing. p-values. Contrast with Bayes factors.
Testing a point null hypothesis. Bayesian vs. classical
perspectives.
- Neyman Pearson Testing Concepts (BD Ch 6).
Power. Neyman-Pearson Lemma. Implications.
Nuisance parameters. Unbiased testing.
pm@stat.duke.edu