# TMA4205 Numerical Linear Algebra, Autumn 2013

## News

| Date | Message |
|-------|---------|
| 20.12 | The grades for the semester project are now available on itslearning. |
| 05.12 | Here is the exam and solution: [nor] [eng] [eng sol] |
| 29.11 | Solution 6 and a temporary version of Solution 5 have been posted. |
| 28.11 | **Important:** All students who are taking the course as a specialization course (fordypningsemne), please send me an e-mail confirming this, even if you have informed me about it before. |
| 27.11 | K. Rottmann: Matematisk formelsamling was added to the list of allowed support material. |
| 26.11 | Information for those of you who are taking the course as a specialization course has been added to the Exam information. |
| 26.11 | See Exam information for a list of permitted examination support material for the written exam. |

## Schedule

| Time | Mo | Tu | We | Th | Fr |
|-------|----|----|----|----|----|
| 10:15 | Lecture (F3) | | | | |
| 13:15 | | | Lecture (F4) | | |
| 14:15 | Office hour (1301) | | | | |
| 15:15 | | | Exercise (F4) | | |

## Exam information

The exam will test a selection of the learning outcomes listed below.

Permitted examination support material, code C: specified written and handwritten examination support materials are permitted, as well as a specified, simple calculator (either Citizen SR-270X or Hewlett Packard HP30S). The permitted support materials are:

• Y. Saad: Iterative Methods for Sparse Linear Systems. 2nd ed. SIAM, 2003 (book or printout)
• L. N. Trefethen and D. Bau: Numerical Linear Algebra, SIAM, 1997 (book or photocopy)
• G. Golub and C. Van Loan: Matrix Computations. 3rd ed. The Johns Hopkins University Press, 1996 (book or photocopy)
• E. Rønquist: Note on The Poisson problem in $\mathbb{R}^2$: diagonalization methods (printout)
• K. Rottmann: Matematisk formelsamling
• Your own lecture notes from the course (handwritten)

For those who are taking the course as a specialization course: At the oral exam, you will be presented with a list of three topics and will then get 5 minutes of preparation time. You may bring the support material listed above to the preparation, but not to the exam itself. During the preparation, you may take notes on a supplied sheet of paper and bring it with you to the exam. You will then get 45 minutes to present the three topics.

All candidates taking the oral exam must meet on 06.12 at 09:00 in room 822, Central Building 2, where we will agree on the order of the candidates.

## Learning outcome

Here is a list of the goals that we will achieve in this course.

1. Knowledge
• L1.1: The student has knowledge of the basic theory of equation solving.
• L1.2: The student has knowledge of modern methods for solving large sparse systems.
• L1.3: The student has detailed knowledge of techniques for calculating eigenvalues of matrices.
• L1.4: The student understands the mechanisms underlying projection methods in general.
• L1.5: The student understands the mechanisms underlying Krylov methods in general.
• L1.6: The student has detailed knowledge about a selection of projection and Krylov algorithms.
• L1.7: The student understands the principle of preconditioning.
• L1.8: The student understands selected preconditioning techniques in detail.
• L1.9: The student is familiar with the practical use of matrix factorization techniques.
2. Skills
• L2.1: The student is able to implement selected algorithms for a given model problem.
• L2.2: The student can assess the performance and limitations of the various methods.
• L2.3: The student can make qualified choices of linear equation solvers/eigenvalue algorithms for specific types of systems.
• L2.4: The student can assess the complexity and accuracy of the algorithms used.
3. General competence
• L3.1: The student can describe a chosen scientific method and communicate his or her findings in a written report using precise language.
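Outcome L2.1 asks for implementing selected algorithms for a model problem. As a rough illustration of what such an implementation might look like, here is a minimal conjugate gradient sketch in Python with NumPy (not part of the official course material; the function name, tolerance, and iteration cap are illustrative choices):

```python
import numpy as np

def cg(A, b, tol=1e-10, max_iter=None):
    """Plain conjugate gradient for a symmetric positive definite A."""
    n = len(b)
    if max_iter is None:
        max_iter = n
    x = np.zeros(n)
    r = b - A @ x          # initial residual
    p = r.copy()           # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)   # step length minimizing the A-norm of the error
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # A-conjugate update of the search direction
        rs = rs_new
    return x

# Demo: 1D Poisson matrix tridiag(-1, 2, -1), which is SPD
n = 10
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
x = cg(A, np.ones(n))
print(np.linalg.norm(A @ x - np.ones(n)) < 1e-8)  # True
```

In exact arithmetic CG terminates in at most as many iterations as there are distinct eigenvalues of A, which is why the demo converges well within `n` steps.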

## Lecture log and plan

S refers to Saad, TB to Trefethen & Bau, GVL to Golub & Van Loan, and R to Rønquist.

| Date | Topic | Reference | Learning outcome |
|-------|-------|-----------|------------------|
| 28.10 | QR algorithm with shifts, Wilkinson shift | TB29 | L1.3, L2.4 |
| 23.10 | QR algorithm without shifts, simultaneous iteration | TB28 | L1.3 |
| 21.10 | Power iteration, inverse iteration, Rayleigh quotient iteration | TB27 | L1.3 |
| 16.10 | Eigenvalue problems, eigenvalue-revealing factorizations, eigenvalue algorithms, Rayleigh quotient | TB24–25, TB26 (self-study), TB27 | L1.3, L1.9 |
| 14.10 | Matrix properties via the SVD | TB5 | L1.9 |
| 09.10 | DD as a preconditioner, the singular value decomposition (SVD) | GVL2.5, TB4 | L1.9 |
| 07.10 | MG as a preconditioner in PCG, domain decomposition (DD) methods, Schwarz' alternating procedure, multiplicative and additive overlapping Schwarz | S14.1, S14.3–14.3.3 | L1.8, L2.3 |
| 02.10 | Intergrid operators, two-grid cycles, V-cycles and W-cycles, red-black Gauss–Seidel | S13.3–13.4.3, S12.4.1 | L1.2, L1.8, L2.3 |
| 30.09 | Preconditioned CG (PCG), introduction to multigrid (MG) methods, weighted Jacobi iteration | S9.2.1, S13.1–13.2.2 | L1.2, L1.8 |
| 25.09 | Convergence of GMRES, complex Chebyshev polynomials, introduction to preconditioning, left-preconditioned GMRES | S6.11.2 (self-study), S6.11.4, S9.1, S9.3.1 | L1.2, L1.7, L1.8 |
| 23.09 | Convergence of CG, Chebyshev polynomials | S6.11.1, S6.11.3 | L1.6, L2.2 |
| 18.09 | The D-Lanczos algorithm, the conjugate gradient method (CG) | S6.7.1 | L1.2, L1.6, L2.2, L2.4 |
| 16.09 | Givens rotations, the Lanczos algorithm, the Lanczos method | S6.5.3, S6.6.1, S6.7.1 | L1.2, L1.6, L1.9 |
| 11.09 | More on Arnoldi's algorithm, the full orthogonalization method (FOM) and variations of FOM, the generalized minimal residual method (GMRES) | S6.3.1–6.5.1 | L1.2, L1.5, L1.6, L2.2 |
| 09.09 | Convergence of steepest descent, minimal residual (MR) iteration, convergence of MR, Krylov subspace methods, Arnoldi's algorithm | S5.3.1–5.3.2, S6.1, S6.3–6.3.1 | L1.4, L1.5, L1.6, L2.2 |
| 04.09 | Projection methods, error projection, residual projection, the steepest descent method | S5.1.1–5.2.2, S5.3–5.3.1 | L1.4 |
| 02.09 | Orthogonal projections, Jacobi and Gauss–Seidel iteration, convergence of splitting methods, Gershgorin's theorem, the Petrov–Galerkin framework | S1.12.3–1.12.4, S4.1, S4.2–4.2.1, S1.8.4, S4.2.3, S5.1 | L1.1, L1.4 |
| 28.08 | Perturbation analysis, finite difference methods, diagonalization methods, projection operators | S1.13.1 (self-study), S1.13.2, S2.2–2.2.5 (self-study), R, S1.12.1 | L1.2, L1.4 |
| 26.08 | Similarity transformations; normal, Hermitian, and positive definite matrices | S1.8–1.8.1, S1.8.2 (self-study), S1.8.3, S1.9, S1.11 | L1.9 |
| 21.08 | QR factorizations | S1.7, TB10 | L1.9 |
| 19.08 | Background in linear algebra | S1.1–1.3 (self-study), S1.4–1.6 | |
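Several of the lectures above (21.10, 23.10, 28.10) concern eigenvalue iterations from TB27–29. As a small taste of the simplest of these, here is a power iteration sketch in Python with NumPy (an illustrative sketch, not material from the course; the tolerance and starting vector are arbitrary choices):

```python
import numpy as np

def power_iteration(A, tol=1e-10, max_iter=1000):
    """Estimate the dominant eigenpair of A by repeated multiplication."""
    n = A.shape[0]
    v = np.ones(n) / np.sqrt(n)    # arbitrary normalized starting vector
    lam = 0.0
    for _ in range(max_iter):
        w = A @ v
        v = w / np.linalg.norm(w)  # renormalize to avoid overflow/underflow
        lam_new = v @ A @ v        # Rayleigh quotient estimate
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    return lam_new, v

# Demo: symmetric 2x2 matrix with eigenvalues 3 and 1
A = np.array([[2.0, 1.0], [1.0, 2.0]])
lam, v = power_iteration(A)
print(round(lam, 6))  # 3.0, the dominant eigenvalue
```

Inverse iteration and Rayleigh quotient iteration (21.10) refine this idea by applying the same loop to a shifted and inverted matrix, trading a linear solve per step for much faster convergence.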

## Curriculum

The curriculum consists of all topics that have been covered in the lectures, all self-study topics, the semester project, and the exercises with solutions. The lectures and self-study topics are based on the following material. The list is not yet final.

• Saad: 1.1–1.13, 2.2, 4.1, 4.2.1–4.2.3, 5.1–5.3, 6.1–6.11, 9.1–9.3, 10.1–10.2, 13.1–13.5, 14.1–14.3
• Trefethen & Bau: Lectures 4, 5, 10, 25, 26, 27, 28, 29
• Golub & Van Loan: 2.5
• Rønquist: Note on The Poisson problem in $\mathbb{R}^2$: diagonalization methods. [pdf]

Of the reading material listed below, you should buy the book by Saad, but not necessarily any of the other ones. Saad's book is also available online for free through NTNU's network at SIAM.

• Y. Saad: Iterative Methods for Sparse Linear Systems. 2nd ed. SIAM, 2003. (main text)
• L. N. Trefethen and D. Bau: Numerical Linear Algebra, SIAM, 1997. Photocopies are available at itslearning.
• G. Golub and C. Van Loan: Matrix Computations. 3rd ed. The Johns Hopkins University Press, 1996. Photocopies are available at itslearning.
• W. L. Briggs, V. E. Henson, S. F. McCormick: A Multigrid Tutorial, SIAM, 2000. Available online at SIAM.
• J. W. Demmel: Applied Numerical Linear Algebra, SIAM, 1997. Available online at SIAM.

## Exercises

The exercises are optional and are not to be handed in. Nevertheless, they are part of the curriculum, and you are encouraged to do them. A new exercise will be posted every one or two weeks.

## Solutions

Solutions will be published one or two weeks after the relevant exercises were published.

• Temporary Solution 5. This version does not contain figures, but will be replaced by the final version soon.

## Semester project

The semester project consists of two parts. The first part, which counts for 10 % of the final grade, will be given in September; the second part, which counts for 20 %, will be given at the end of October or the beginning of November. The project is to be done individually or – preferably – in groups of two. Hand in your reports in PDF format and your code by e-mail. You may choose which programming language to use. Please do not write your names on the report, only your candidate numbers. If your candidate number is not yet available, you may use your student number.

Students who are taking the course as a specialization course (fordypningsemne) are not required to do the semester project.

The project will cover learning outcomes L2.1–2.4, and L3.1.

You will need the note Eigenvalues of tridiagonal Toeplitz matrices in the first part of the semester project.
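For orientation, the central fact in that note is the closed-form expression for the eigenvalues of a tridiagonal Toeplitz matrix. For the symmetric case tridiag(−1, 2, −1), the standard result is $\lambda_k = 2 - 2\cos\bigl(k\pi/(n+1)\bigr)$, $k = 1, \dots, n$. A quick numerical check of this formula, assuming Python with NumPy:

```python
import numpy as np

n = 8
# 1D Poisson matrix: tridiagonal Toeplitz with diagonal 2, off-diagonals -1
T = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

# Closed form: lambda_k = 2 - 2*cos(k*pi/(n+1)), k = 1..n (already ascending)
k = np.arange(1, n + 1)
exact = 2 - 2 * np.cos(k * np.pi / (n + 1))

computed = np.linalg.eigvalsh(T)  # ascending eigenvalues of the symmetric T
print(np.allclose(exact, computed))  # True
```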