Lecture Plan and Progress
Week | Chapter | Topic | Download | Remark |
---|---|---|---|---|
2 | Ch. 1 | Introductory meeting | | Decision on lecture times |
3 | Ch. 2 | Overview of supervised learning | Slides Ch. 2 | 4 hrs lecture (no exercises) |
4 | Ch. 3.1-3.4 | Linear methods for regression | Slides Ch. 3, Maximization result | |
5 | Rest of Ch. 3, except 3.7-3.8 | Linear methods for regression | Multivariate prediction and regression | |
6 | Ch. 4, except 4.4.3 and 4.5 | Linear methods for classification | Slides Ch. 4 | |
7 | Ch. 5, except 5.8, 5.9 and Appendix | Basis expansions and regularization | Slides Ch. 5 | |
8 | Ch. 6 | Kernel smoothing methods | Kernel density estimation | No lecture on Thursday 23 February. Trial exam. |
9 | Ch. 7 (7.7, 7.8, 7.9 and 7.12 are not in curriculum) | Model assessment and selection | Slides Ch. 7 | |
10 | Ch. 7.10-7.11, remaining issues from Ch. 7.1-7.6, and Ch. 8.1, 8.2, 8.7 | Model selection and inference | | |
11 | Ch. 9.1 (not all details), 9.2. INTRODUCTION: Ch. 8.1, 8.2.1, 8.2.2 | Additive models, trees, and related methods (bagging, random forests) | Book chapter (B. Ripley: Pattern Recognition and Neural Networks), Example: Pruning and cross-validation | Additional reading: Ch. 3 in "Berk: Statistical Learning". |
12 | Ch. 12.1, 12.2 (skim 12.3). INTRODUCTION: Ch. 9.1-9.3 | Support Vector Machines | | |
13 | Ch. 10.1-10.5. INTRODUCTION: Ch. 8.2.3 | Boosting and additive trees | | Additional reading: Ch. 6 in "Berk: Statistical Learning". |
14 | Ch. 11 | Neural Networks | | Additional reading: Ch. 8.1 in "Berk: Statistical Learning". |
15 | | Easter vacation | | |
16 | | Final meeting on Thursday April 20: summing-up | | No lecture on Tuesday 18 April (NTNU no teaching day) |