Lecture plan

Introduction, notion and existence of solutions

Week 2
  Overview
  Formulation of a minimisation problem
  Definitions of minima
  Lower semi-continuity and coercivity
  Existence theorems for minima
  Reading: N&W, Chapters 1, 2.1
  Note: Minimisers of Optimisation Problems
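
For reference, the two properties behind the existence theorems: a lower semi-continuous and coercive function f: R^n -> R attains its minimum. In LaTeX notation,

\begin{align*}
  &\text{lower semi-continuity:} && f(x) \le \liminf_{k \to \infty} f(x_k) \quad \text{whenever } x_k \to x,\\
  &\text{coercivity:} && f(x) \to +\infty \quad \text{as } \lVert x \rVert \to \infty.
\end{align*}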

Theory and methods for unconstrained optimisation

Week 2 (continued)
  Taylor's theorem in higher dimensions
  First-order necessary conditions
  Reading: N&W, Chapter 2.1

Week 3
  Second-order necessary and sufficient conditions
  Convex sets and functions
  Reading: N&W, Chapter 2.1
  Note: Basic Properties of Convex Functions

  Line search methods
  Armijo (backtracking) line search
  The Wolfe conditions
  Reading: N&W, Chapter 3.1
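
Two facts from this week worth keeping at hand: the second-order sufficient condition (N&W, Theorem 2.4) and the definition of a convex function,

\begin{align*}
  &\nabla f(x^*) = 0 \ \text{ and } \ \nabla^2 f(x^*) \text{ positive definite}
    \ \Longrightarrow\ x^* \text{ is a strict local minimiser},\\
  &f \text{ convex:} \quad f(\lambda x + (1 - \lambda) y) \le \lambda f(x) + (1 - \lambda) f(y)
    \quad \text{for all } x, y \text{ and } \lambda \in [0, 1].
\end{align*}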

Week 4
  Convergence of backtracking methods
  Goldstein conditions
  Wolfe conditions
  Algorithmic approach to the Wolfe conditions
  Zoutendijk's result
  Convergence rate of line search methods
  Reading: N&W, Chapters 3.1-3.3

For the lecture on Friday, I suggest reading through Chapter 3.2, including the proof of Theorem 3.2.
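
To make the backtracking idea concrete, here is a minimal Python sketch of a line search enforcing the Armijo (sufficient decrease) condition; the names f and grad_f (taking and returning NumPy arrays) and the default parameter values are illustrative assumptions, not part of the course material.

def backtracking_line_search(f, grad_f, x, p, alpha=1.0, rho=0.5, c1=1e-4):
    # Shrink alpha until the Armijo condition holds:
    #   f(x + alpha p) <= f(x) + c1 alpha grad_f(x)^T p,
    # assuming p is a descent direction, i.e. grad_f(x)^T p < 0.
    fx = f(x)
    slope = grad_f(x) @ p
    while f(x + alpha * p) > fx + c1 * alpha * slope:
        alpha *= rho
    return alpha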

Week 5
  Conjugate Gradient methods
  Nonlinear CG methods
  Fletcher-Reeves and Polak-Ribière
  Reading: N&W, Chapters 5.1, 5.2
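
A minimal sketch of the nonlinear CG iteration with the Fletcher-Reeves and Polak-Ribière formulas for beta; line_search stands for an assumed step-length routine (the convergence theory requires the strong Wolfe conditions), and all names are illustrative.

import numpy as np

def nonlinear_cg(f, grad_f, x0, line_search, variant="PR", tol=1e-8, max_iter=1000):
    # Nonlinear conjugate gradient iteration (sketch).
    x = np.asarray(x0, dtype=float)
    g = grad_f(x)
    p = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad_f, x, p)
        x = x + alpha * p
        g_new = grad_f(x)
        if variant == "FR":
            beta = (g_new @ g_new) / (g @ g)        # Fletcher-Reeves
        else:
            beta = (g_new @ (g_new - g)) / (g @ g)  # Polak-Ribiere
            beta = max(beta, 0.0)                   # PR+ restart safeguard
        p = -g_new + beta * p
        g = g_new
    return x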

Week 6
  Polak-Ribière and Hestenes-Stiefel methods
  Reading: N&W, Chapter 5.2

  Quasi-Newton methods
  SR1 method
  DFP and BFGS methods
  Superlinear convergence of the BFGS method
  Reading: N&W, Chapters 6.1, 6.2, 6.4

We won't discuss the proof of the superlinear convergence of BFGS, but feel free to read the details yourself. In that case, you might also want to read Theorems 3.6 and 3.7 in Chapter 3.
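
For orientation, a minimal sketch of the BFGS update of the inverse Hessian approximation, in the notation of Chapter 6.1; the function name and calling convention are illustrative assumptions.

import numpy as np

def bfgs_update(H, s, y):
    # One BFGS update of the inverse Hessian approximation H, where
    # s = x_{k+1} - x_k and y = grad_f(x_{k+1}) - grad_f(x_k).
    # Requires the curvature condition y^T s > 0 (guaranteed by a
    # line search satisfying the Wolfe conditions).
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)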

Week 7
  Summary of line search methods
  Convergence speed
  Discussion of computational complexity
  Idea of trust region methods
  Cauchy point and dogleg method
  Reading: N&W, Chapter 4.1
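
A minimal sketch of the dogleg step for the trust-region subproblem, assuming a positive definite model Hessian B; the interface (g, B, delta) is an illustrative choice.

import numpy as np

def dogleg_step(g, B, delta):
    # Approximately solve  min_p g^T p + 0.5 p^T B p  s.t. ||p|| <= delta
    # along the dogleg path, assuming B is positive definite.
    p_b = -np.linalg.solve(B, g)               # full quasi-Newton step
    if np.linalg.norm(p_b) <= delta:
        return p_b
    p_u = -(g @ g) / (g @ B @ g) * g           # unconstrained Cauchy point
    if np.linalg.norm(p_u) >= delta:
        return -(delta / np.linalg.norm(g)) * g  # steepest descent to the boundary
    # Otherwise find tau in [0, 1] with ||p_u + tau (p_b - p_u)|| = delta.
    d = p_b - p_u
    a, b, c = d @ d, 2.0 * (p_u @ d), p_u @ p_u - delta**2
    tau = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return p_u + tau * d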

Week 8
  Two-dimensional subspace minimisation
  Exact solution of the trust region problem
  Reading: N&W, Chapter 4.1

  Non-linear least squares
  Gauss-Newton method
  Levenberg-Marquardt method
  Reading: N&W, Chapter 10.3
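
A minimal sketch of one Gauss-Newton / Levenberg-Marquardt step for nonlinear least squares min 0.5*||r(x)||^2, where r is the residual vector and J its Jacobian at the current iterate. Treating lam as a given parameter is a simplification: in practice it is adjusted adaptively from step to step.

import numpy as np

def levenberg_marquardt_step(r, J, lam):
    # Solve (J^T J + lam I) p = -J^T r for the step p;
    # lam = 0 recovers the Gauss-Newton step.
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + lam * np.eye(n), -J.T @ r)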

Theory and methods for constrained optimisation

Week 8 (continued)
  Notation and basic ideas
  Reading: N&W, Chapters 12.1, 12.2

Week 9
  Tangent cones
  Feasible and linearised feasible directions
  Constraint qualifications
  Necessary conditions
  KKT conditions
  Second-order necessary and sufficient conditions
  Reading: N&W, Chapters 12.1-12.4
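
For reference, the KKT conditions in the notation of N&W (Theorem 12.1): for min f(x) subject to c_i(x) = 0 for i in E and c_i(x) >= 0 for i in I, with Lagrangian L(x, lambda) = f(x) - sum_i lambda_i c_i(x), a local solution x* satisfies, for suitable multipliers lambda*,

\begin{align*}
  \nabla_x \mathcal{L}(x^*, \lambda^*) &= 0,\\
  c_i(x^*) &= 0 \quad \text{for all } i \in \mathcal{E},\\
  c_i(x^*) \ge 0, \quad \lambda_i^* \ge 0, \quad \lambda_i^* c_i(x^*) &= 0 \quad \text{for all } i \in \mathcal{I},
\end{align*}

provided a constraint qualification (for instance LICQ) holds at x*.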

Week 10
  Quadratic penalty method for equality constrained optimisation
  Nonsmooth penalty methods
  Augmented Lagrangian method
  Reading: N&W, Chapters 17.1-17.3
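
A minimal sketch of the quadratic penalty iteration for equality constraints; minimize stands for any assumed unconstrained solver (for instance one of the line search methods from earlier weeks), and the update schedule for mu is an illustrative choice.

import numpy as np

def quadratic_penalty(f, c, x0, minimize, mu=1.0, factor=10.0, n_outer=10):
    # Solve min f(x) s.t. c(x) = 0 by minimising
    # Q(x; mu) = f(x) + (mu/2) ||c(x)||^2 for increasing mu,
    # warm-starting each subproblem at the previous solution.
    x = x0
    for _ in range(n_outer):
        Q = lambda z, mu=mu: f(z) + 0.5 * mu * np.sum(c(z) ** 2)
        x = minimize(Q, x)
        mu *= factor
    return x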

Week 11
  Augmented Lagrangian method
  Penalty methods for inequality constraints
  Barrier methods for inequality constraints
  Sequential quadratic programming
  Merit functions
  Reading: N&W, Chapters 17.3, 19.1, 19.6, 18.1-18.4

In contrast to the approach taken by Nocedal & Wright, we will regard barrier methods only as an alternative to penalty methods and will not consider their relation to linear interior point methods (which we will discuss in the following weeks).
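
For comparison with the penalty approach, a minimal sketch of the logarithmic barrier method for inequality constraints; again, minimize and the schedule for mu are illustrative assumptions.

import numpy as np

def log_barrier(f, c, x0, minimize, mu=1.0, factor=0.1, n_outer=10):
    # Solve min f(x) s.t. c(x) >= 0 (componentwise) by minimising
    # B(x; mu) = f(x) - mu * sum(log c(x)) for decreasing mu > 0,
    # starting from a strictly feasible x0.
    x = x0
    for _ in range(n_outer):
        B = lambda z, mu=mu: f(z) - mu * np.sum(np.log(c(z)))
        x = minimize(B, x)   # iterates must stay strictly feasible
        mu *= factor
    return x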

Week 12
  Non-smooth merit functions for Newton's method
  Sequential quadratic programming with inequality constraints
  Maratos effect
  Reading: N&W, Chapters 18.1-18.4, 15.5

  Duality in constrained optimisation
  Weak duality
  Reading: N&W, Chapter 12.9

Our initial discussion of primal and dual problems was somewhat more detailed than what you will find in Nocedal & Wright. However, a short summary of the main results (without proofs) will appear on a future exercise sheet.

  Basics of linear optimisation
  Problems in standard form
  Reading: N&W, Chapter 13.1
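
For reference, a linear programme in standard form and its dual (N&W, Chapter 13):

\begin{align*}
  \text{(P)} \quad & \min_x \; c^T x \quad \text{subject to } Ax = b, \; x \ge 0,\\
  \text{(D)} \quad & \max_\lambda \; b^T \lambda \quad \text{subject to } A^T \lambda \le c.
\end{align*}

Weak duality states that b^T lambda <= c^T x for every primal feasible x and dual feasible lambda.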

Week 13
  Linear optimisation
  Primal and dual problems
  Geometry of linear optimisation problems
  Interior point methods for linear programming
  Reading: N&W, Chapters 13.1, 13.2, 14.1

Week 14
  There will be no lectures this week. Instead, I will be available in Nullrommet at the usual lecture times for questions about the second project.

Week 15
  Easter holidays

Week 16
  Basics of quadratic programming
  Interior point methods for quadratic programming
  The gradient projection method
  Reading: N&W, Chapters 16.1, 16.6, 16.7

There is no lecture on Tuesday this week.
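
A deliberately simplified, fixed-step sketch of the gradient projection idea for box constraints; the method of Chapter 16.7 instead searches along the piecewise-linear projected path, which this snippet does not attempt.

import numpy as np

def projected_gradient(grad_f, x0, lower, upper, alpha=1e-2, n_iter=1000):
    # Fixed-step gradient projection for bound constraints l <= x <= u:
    # take a gradient step, then project back onto the box with np.clip.
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    for _ in range(n_iter):
        x = np.clip(x - alpha * grad_f(x), lower, upper)
    return x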

Week 17
  Review and questions

The last lecture will be held on Tuesday, April 25th.

N&W always refers to the textbook J. Nocedal and S. Wright, Numerical Optimization, 2nd edition, Springer, 2006.

2017-04-20, Markus Grasmair