TMA4300 Computer Intensive Statistical Methods, Spring 2024
Messages
August 13: I have just submitted the grades for the re-sit exam, so they should appear on studweb shortly. The grades were good, and significantly better than on the ordinary exam!
August 10: Today's exam and a preliminary solution and grading guide are available at https://www.math.ntnu.no/emner/TMA4300/eksamen/?C=M;O=D
July 10: The re-sit exam in August will be written, not oral; see https://www.ntnu.no/studier/emner/TMA4300/2023#tab=omEksamen and https://i.ntnu.no/wiki/-/wiki/English/Re-sit+examination for further information.
June 20: The final grades should become available on studweb shortly. The letter grades on the final exam had the following distribution:
| Grade | Count |
|-------|-------|
| A | 1 |
| B | 5 |
| C | 4 |
| D | 5 |
| E | 5 |
| F | 7 |
The solution has been updated and now includes the points given for answering the different questions of each subproblem correctly.
June 4: Today's exam and a preliminary solution are available here.
May 8: I will be available to answer questions before the exam on Monday June 3, from 9:00-13:00 in KJL21. Send me an email if I'm not there.
May 8: I found an error in the Gibbs sampler for the renewal process example in the summary lecture on April 30; it has been corrected.
You can work on the projects in groups of one to three students (preferably two). If you want to work in a group but don't know any other students, post a message in the EdStem forum below or send an e-mail to Guillermina and she will try to put you into a group with someone else.
Practical information
Lecturer: Jarle Tufto
Teaching assistant: Guillermina Senn
Lectures: Tuesdays and Thursdays 12:15-14:00 in B3.
Project supervision: Tuesdays 12:15-14:00 in B3 (tentatively) in weeks without lectures.
EdStem forum (sign up using the link; an NTNU email address is required). Use this when you have questions about any part of the course.
Reference group: NN and NN.
Curriculum
The curriculum is what is covered in the lectures and the exercises. It is based on sections (see below) of G.H. Givens and J.A. Hoeting (2013), Computational Statistics, 2nd edition (NTNU access through this link) (GH), and D. Gamerman and H.F. Lopes (2006), Markov chain Monte Carlo - Stochastic Simulation for Bayesian Inference, 2nd edition (sold by Akademika) (GL). You may be able to buy a second-hand physical copy of GL or get hold of a digital PDF version.
Projects
Part 1
Stochastic simulation and an introduction to Bayesian inference. Givens and Hoeting: 1-1.8 (mostly repetition), 6.1, 6.2-6.2.3.2, 6.3.1, 6.4.1. Gamerman and Lopes: 1-1.5.
Week 2.1: Introduction, pseudorandom number generators, inversion sampling. GL: 1.1-1.3.1, GH: 1 (review)
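As a flavour of what inversion sampling looks like in R, here is a minimal sketch; the exponential target, the rate and the sample size are illustrative choices, not taken from the lectures:

```r
# Inversion sampling: if U ~ Unif(0,1), then F^{-1}(U) has cdf F.
# Illustrative target: Exp(rate = lambda), with F^{-1}(u) = -log(1 - u)/lambda.
lambda <- 2
n <- 1e5
u <- runif(n)
x <- -log(1 - u) / lambda  # inverse cdf applied to uniforms
c(mean(x), 1 / lambda)     # sample mean should be close to 1/lambda
```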
Week 2.2: Transformation formula for joint densities, ratio of uniforms method, Box-Muller algorithm. GL: 1.3.2, GH: 6-6.2.2
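A minimal R sketch of the Box-Muller algorithm (the sample size is arbitrary):

```r
# Box-Muller: two iid Unif(0,1) variates mapped to two iid N(0,1) variates.
n <- 1e5
u1 <- runif(n)
u2 <- runif(n)
r <- sqrt(-2 * log(u1))     # radius from the first uniform
z1 <- r * cos(2 * pi * u2)  # angle from the second uniform
z2 <- r * sin(2 * pi * u2)
c(mean(z1), var(z1))        # should be close to 0 and 1
```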
Week 3.1: Simulation via mixtures, the multivariate normal, rejection sampling. GL: 1.4, 1.5.1, GH: 6.2.2, 6.2.3
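A minimal R sketch of rejection sampling; the Beta(2,2) target, the uniform proposal and the bound c = 1.5 are illustrative choices, not from GL or GH:

```r
# Rejection sampling from a Beta(2,2) target using a Unif(0,1) proposal.
# The Beta(2,2) density 6x(1-x) is bounded by c = 1.5 on (0,1).
rbeta22 <- function(n) {
  out <- numeric(0)
  c_bound <- 1.5
  while (length(out) < n) {
    x <- runif(n)                                # proposals from g
    accept <- runif(n) < 6 * x * (1 - x) / c_bound  # accept w.p. f(x)/(c g(x))
    out <- c(out, x[accept])
  }
  out[1:n]
}
x <- rbeta22(1e4)
c(mean(x), 0.5)  # Beta(2,2) has mean 1/2
```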
Week 3.2: Rejection sampling variants, Monte Carlo, importance sampling. GL: 1.5 (all), GH: 6.2.3 (all), 6.3.1, 6.4.1. Alias method.
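A minimal R sketch of importance sampling; estimating the tail probability P(X > 4) for a standard normal X with a shifted N(4,1) proposal is an illustrative choice:

```r
# Importance sampling estimate of P(X > 4) for X ~ N(0,1),
# using a shifted N(4,1) proposal to put mass in the tail.
n <- 1e5
y <- rnorm(n, mean = 4)               # draws from the proposal g
w <- dnorm(y) / dnorm(y, mean = 4)    # importance weights f/g
est <- mean((y > 4) * w)
c(est, pnorm(4, lower.tail = FALSE))  # compare with the exact value
```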
Week 4.1: Introduction to Bayesian inference. GL: 1.5 (all), GH: 6.2.3 (all), 6.3.1, 6.4.1
Week 4.2: More Bayes, antithetic sampling (GH: 6.4.2), Summary part 1.
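A minimal R sketch of antithetic sampling; the integrand exp(u) on (0,1) is an illustrative choice:

```r
# Antithetic variates: pair each uniform u with 1 - u to induce negative
# correlation between the two halves of the estimator and reduce variance.
n <- 1e4
h <- function(u) exp(u)                     # illustrative integrand on (0,1)
u <- runif(n)
theta_plain <- mean(h(runif(2 * n)))        # plain MC with 2n draws
theta_anti  <- mean((h(u) + h(1 - u)) / 2)  # antithetic with n pairs
c(theta_plain, theta_anti, exp(1) - 1)      # true value is e - 1
```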
Notes from the lectures part 1 (updated on May 24, see corrections in red on pages 11 and 13)
Part 2
Bayesian inference and Markov chain Monte Carlo. Maybe some INLA and/or TMB. Givens and Hoeting: 7-7.3.3. Gamerman and Lopes: 2.1, 2.2-2.2.2, 2.3.1, 2.3.3, 2.4, 4.1-4.6, 5.1-5.2, 6.1-6.4.4.
Week 7.1: Hierarchical Bayesian models, review of Markov chains, time reversibility, the detailed balance equation, the Metropolis-Hastings algorithm.
Week 7.2: Tuning of the random-walk Metropolis algorithm, comparison with the independence sampler, numerical issues, M-H with iterative conditioning, Gibbs sampling.
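A minimal R sketch of a random-walk Metropolis sampler; the standard normal target, the helper name rwm and the tuning values are illustrative assumptions, not the samplers used in the lectures:

```r
# Random-walk Metropolis for a generic log-target.
rwm <- function(logpost, x0, n, sd_prop) {
  x <- numeric(n)
  x[1] <- x0
  acc <- 0
  for (i in 2:n) {
    prop <- x[i - 1] + rnorm(1, sd = sd_prop)       # symmetric proposal
    log_alpha <- logpost(prop) - logpost(x[i - 1])  # M-H log acceptance ratio
    if (log(runif(1)) < log_alpha) {
      x[i] <- prop
      acc <- acc + 1
    } else {
      x[i] <- x[i - 1]
    }
  }
  list(x = x, rate = acc / (n - 1))
}
out <- rwm(function(x) dnorm(x, log = TRUE), x0 = 0, n = 1e4, sd_prop = 2.4)
out$rate  # tune sd_prop towards an acceptance rate of roughly 0.2-0.5
```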
Week 8.1: Gibbs sampling examples, Metropolis within Gibbs (hybrid sampler), blocking.
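A minimal R sketch of Gibbs sampling for a standard bivariate normal with correlation rho, a common textbook illustration (the value of rho and the chain length are arbitrary):

```r
# Gibbs sampler alternating between the two full conditionals of a
# standard bivariate normal: x | y ~ N(rho*y, 1 - rho^2) and vice versa.
rho <- 0.8
n <- 5000
x <- y <- numeric(n)
for (i in 2:n) {
  x[i] <- rnorm(1, mean = rho * y[i - 1], sd = sqrt(1 - rho^2))  # x | y
  y[i] <- rnorm(1, mean = rho * x[i],     sd = sqrt(1 - rho^2))  # y | x
}
cor(x[-(1:500)], y[-(1:500)])  # should be close to rho after burn-in
```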
Week 8.2: Single-site vs. block updating example continued. Estimating autocorrelation and effective sample size. More about DAGs. Gibbs sampling for the random intercept LMM. Project 2, problem A.
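A rough R sketch of estimating the effective sample size from the empirical autocorrelations, reusing the chain x from the Gibbs sketch above; the helper name ess and the truncation rule (stop at the first non-positive autocorrelation) are illustrative conventions among several:

```r
# Rough effective sample size n / (1 + 2 * sum of autocorrelations),
# truncating the sum at the first non-positive estimated autocorrelation.
ess <- function(chain) {
  rho <- acf(chain, lag.max = 100, plot = FALSE)$acf[-1]  # drop lag 0
  k <- which(rho <= 0)[1]               # first non-positive autocorrelation
  if (is.na(k)) k <- length(rho) + 1
  length(chain) / (1 + 2 * sum(rho[seq_len(k - 1)]))
}
ess(x[-(1:500)])  # compare with coda::effectiveSize
```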
Week 9.1: The Laplace approximation in hierarchical latent variable models. Computation of the Laplace approximation in INLA vs. via automatic differentiation (AD) in TMB and RTMB. INLA in some more detail. For even more details, see Rue, Martino & Chopin (2009), which also has an interesting "Discussion of the paper" by other authors, discussing for instance alternative MCMC strategies using eq. (3) to construct a one-block proposal.
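A minimal R sketch of a one-dimensional Laplace approximation; the integrand (the density of log Y for Y ~ Gamma(3,1), so the true integral is 1) is an illustrative choice, and the latent-field setting in INLA/TMB is of course higher-dimensional:

```r
# Laplace approximation of int exp(h(u)) du: expand h to second order
# around its mode and integrate the resulting Gaussian analytically,
# giving exp(h(u_hat)) * sqrt(2*pi / (-h''(u_hat))).
h <- function(u) dgamma(exp(u), shape = 3, rate = 1, log = TRUE) + u
opt <- optim(0, function(u) -h(u), method = "BFGS", hessian = TRUE)
hess <- drop(opt$hessian)  # equals -h''(u) at the mode
log_laplace <- h(opt$par) + 0.5 * log(2 * pi) - 0.5 * log(hess)
exp(log_laplace)  # close to 1, since exp(h) is a density in u
```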
Week 9.2: RTMB and INLA examples.
Notes from the lectures part 2
Part 3
Bootstrapping and the EM-algorithm. Givens and Hoeting: 4.1, 4.2-4.2.1, 9.1, 9.2-9.2.4, 9.3, 9.3.1, 9.3.2.1-2, 9.5.1, 9.8.
Week 12.1: Non-parametric bootstrap via simulation and via an exact method. Parametric bootstrap. Examples. Bootstrap estimation of bias and bias correction. Bootstrap confidence intervals via the percentile method. Validity of the percentile method.
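A minimal R sketch of the non-parametric bootstrap; the exponential data, the median as the statistic and B = 2000 are illustrative choices:

```r
# Non-parametric bootstrap of the standard error of the sample median,
# resampling the data with replacement B times.
set.seed(1)
x <- rexp(50)
B <- 2000
med_boot <- replicate(B, median(sample(x, replace = TRUE)))
sd(med_boot)                         # bootstrap standard error
quantile(med_boot, c(0.025, 0.975))  # percentile confidence interval
```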
Week 12.2: Regression and bootstrapping residuals versus bootstrapping pairs. The accelerated bias-corrected percentile method BC\(_a\) with an R implementation. Bootstrap t confidence intervals. Residual bootstrapping for time series models.
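A minimal R sketch of residual bootstrapping in simple linear regression; the simulated data and the number of replicates are illustrative:

```r
# Residual bootstrap: keep the design fixed, resample centred residuals,
# add them to the fitted values, and refit the model.
set.seed(1)
n <- 100
x <- runif(n)
y <- 1 + 2 * x + rnorm(n, sd = 0.5)
fit <- lm(y ~ x)
res <- residuals(fit) - mean(residuals(fit))
B <- 2000
slope_boot <- replicate(B, {
  y_star <- fitted(fit) + sample(res, replace = TRUE)
  coef(lm(y_star ~ x))[2]
})
sd(slope_boot)  # compare with summary(fit)$coefficients[2, 2]
```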
Week 13: Easter holidays
Week 14.1 (Thursday, April 4): The EM-algorithm. ABO blood types example. Gaussian mixture example. Convergence. Proof that \(f_X(x|\theta^{(t+1)})\ge f_X(x|\theta^{(t)})\).
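A minimal R sketch of the EM-algorithm for a two-component Gaussian mixture with known unit variances; the simulated data and starting values are illustrative, and the lecture examples may differ in the details:

```r
# EM for a mixture (1-p)*N(mu1, 1) + p*N(mu2, 1), estimating p and the means.
set.seed(1)
x <- c(rnorm(150, 0), rnorm(50, 4))  # true mixing weight of component 2 is 0.25
p <- 0.5
mu <- c(-1, 1)                       # starting values
for (t in 1:200) {
  # E-step: posterior probability that each observation is from component 2
  w <- p * dnorm(x, mu[2]) /
       ((1 - p) * dnorm(x, mu[1]) + p * dnorm(x, mu[2]))
  # M-step: weighted updates of the mixing weight and the two means
  p  <- mean(w)
  mu <- c(sum((1 - w) * x) / sum(1 - w), sum(w * x) / sum(w))
}
c(p, mu)  # should approach roughly 0.25 and means near 0 and 4
```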
Project 3 (10% towards final grade). Deadline Sunday April 28, 23:59.
Plan for the oral presentations of project 3
Week 18 (Tuesday, April 30): Summary of the course. More examples from parts 1, 2 and 3.
Final exam
June 4, 15:00 (70% towards final grade)
Previous exams can be found in this folder.