Lecture plan

Chapters 1-10 from George Casella and Roger L. Berger, 2002: Statistical Inference, Second Edition, Duxbury.

Self-study: Chapters 1-5. The content is assumed to be essentially known, but some parts will be covered on a 'need to know' basis in the lectures.

Chapter 1 of Keener (2010), Theoretical Statistics: Topics for a Core Course, is recommended as a replacement for Chapter 1 in Casella-Berger. Keener writes at a level similar to Casella-Berger, but he is mathematically more precise. Keener uses more standard notation and definitions, and these will also be used in the lectures. The text by Keener was used as the textbook in Theoretical Statistics (STAT 210A) at UC Berkeley, a course taken by one of my master's students in 2018. It should also be accessible to other students at NTNU. The course STATS 300A at Stanford is more advanced, but it has many good notes that you may find useful. The same can be said about the notes for the STAT 210A course.

Week | Book | Topic | Notes/Supplementary reading
34 | Ch. 6 | Introduction. Statistical models. Principles of data reduction. Sufficiency principle. Factorization theorem (stated after the table). | Videos in Norwegian: Sverdrup interview; Kolmogorov; Events. Encyclopedia of Mathematics: Factorization theorem
35 | Ch. 6 | Minimal sufficiency. Complete statistics and ancillary statistics: Basu theorem. Exponential family. Likelihood. | Wikipedia: Sufficient (including minimal); ancillary; Exponential family; Complete statistic; Basu theorem; Ancillary meaning
36 | Ch. 6 | The likelihood principle. | Likelihood principle and Birnbaum's argument; Conditionality principle; Statistical problem of the Nile, and the real Nile problem
37 | Ch. 6 | Symmetry in statistics: the equivariance principle. Summary of Ch. 6: S-L-C-E and Basu and Halmos-Savage. | Equivariance; Inference; Group action; Alternative methods for data reduction
38 | Ch. 7 | Parameter and statistic: axioms for events and random quantities. Estimation: Point, Set, Distribution. Methods: Empirical-Moments-MLE-Bayes. Maximum likelihood estimation in exponential families. Evaluation: Distribution-Loss-Bias-Equivariance. Examples: Location/Scale, Binomial, Exponential family, Gamma. | Parameter; Statistics; Statistics also!; Point estimation; Estimation; Estimation again :-); Exponential family
39 | Ch. 7 | Evaluation: Distribution-Loss-Bias-Equivariance. Sufficiency. Equivariance and ancillaries. The Cramér-Rao inequality (stated after the table). Bayes estimation. Convex loss and Jensen's inequality. | Conditional probability; Conditional expectation; Bayesian inference; Loss; Convex function; Jensen's inequality; Convex optimization
40 | Ch. 7 | Discussion of the Cramér-Rao inequality. Sufficiency and unbiasedness. The Rao-Blackwell theorem. Conditional distribution. Hilbert space arguments. | Lehmann-Scheffé theorem; Rao-Blackwell theorem
41 | Ch. 7 | Construction and evaluation of estimators. Examples.
42 | Ch. 8 | Hypothesis testing. Likelihood ratio tests. Methods of evaluating tests. Unbiased tests. Most powerful tests: UMP. Neyman-Pearson. MLR. UMPU. UIT. IUT. p(x). Risk and power. Reject an unlikely hypothesis! | Hypothesis H; test R; power β; likelihood (ratio) λ(x); level and size α; best test; valid p(x)
43 | Ch. 8 | Monotone likelihood ratio; p-value; Stochastic ordering; Neyman-Pearson; Karlin-Rubin theorem
44 | Ch. 9 | Interval estimation. Test-, point-, set-, and distribution-estimators. Loss and risk. Power. Unbiased. Inversion. Bayes. Pivot. Pratt and Kolmogorov theorems. Coverage coefficient and level. Accuracy. Size- or Neyman-precision. Trueness. Equivariance. Credibility-, confidence-, prediction-, and tolerance-sets. | GUM; Accuracy and precision
45 | Ch. 9 | Examples: construction and evaluation. Bayes, fiducial, and confidence distributions, intervals, and estimators.
46 | Ch. 10 + 5.5 | Examples continued. Asymptotic evaluations. | Asymptopia
47 | | Review
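
For quick reference, the two results flagged above (weeks 34 and 39) are stated below in LaTeX. This is only a sketch following the notation of Casella-Berger; the exact regularity conditions are given in the book.

% Factorization theorem (Casella-Berger, Ch. 6): T(X) is sufficient for theta
% if and only if the joint density (or pmf) factors through T(x):
\[
  f(x \mid \theta) = g\bigl(T(x) \mid \theta\bigr)\, h(x)
  \qquad \text{for all } x \text{ and all } \theta .
\]

% Cramér-Rao inequality (Casella-Berger, Ch. 7): if W(X) is an estimator with
% mean E_theta W(X) = tau(theta) and the usual regularity conditions hold, then
\[
  \operatorname{Var}_\theta W(X) \;\ge\;
  \frac{\bigl(\tau'(\theta)\bigr)^{2}}
       {\mathrm{E}_\theta\!\left[\left(\frac{\partial}{\partial\theta}
         \log f(X \mid \theta)\right)^{2}\right]} .
\]
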
2021-11-17, Gunnar Taraldsen