Lecture plan

This is a tentative lecture plan and still subject to change. "N&W" always refers to the second edition of Nocedal & Wright, Numerical Optimization, Springer, 2006.

Most of the code below requires the file TMA4180_definitions.py, which contains implementations of various test functions for optimisation. This file will be updated continuously throughout the course, so make sure that you have an up-to-date version of it (and of all other necessary files) in your working directory before running any of the Jupyter notebooks below.

For each week, the plan lists the topics covered, the suggested reading, and the corresponding slides and code.

Introduction, notion and existence of solutions
Week 2
Topics:
- Overview
- Formulation of a minimisation problem
- Definitions of minima
- Lower semi-continuity and coercivity
- Existence theorems for minima
- First order necessary conditions
- Second order necessary and sufficient conditions (see the summary sketch below)
Reading:
- Note on unconstrained optimisation, Sec. 1 and 2 (preliminary version, updated 22.02.)
- N&W 1, 2.1
Slides and code:
- Introduction
- Lecture 2

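For quick reference, the optimality conditions from this week can be summarised as follows. This is an editorial sketch, assuming f is twice continuously differentiable (cf. N&W, Theorems 2.2-2.4); it is not part of the course notes.

```latex
% Necessary conditions: if x* is a local minimiser of f, then
\[
  \nabla f(x^*) = 0
  \qquad \text{and} \qquad
  \nabla^2 f(x^*) \succeq 0 \quad \text{(positive semidefinite)}.
\]
% Sufficient conditions: if instead
\[
  \nabla f(x^*) = 0
  \qquad \text{and} \qquad
  \nabla^2 f(x^*) \succ 0 \quad \text{(positive definite)},
\]
% then x* is a strict local minimiser of f.
```
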
Week 3
Topics:
- Second order necessary and sufficient conditions
- Convex functions
- First and second order characterisations (see the summary sketch below)
Reading:
- Note on unconstrained optimisation, Sec. 2 and 3 (preliminary version, updated 22.02.)
Slides and code:
- Lecture 3

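The two characterisations of convexity listed above can be summarised as follows; again an editorial LaTeX sketch for a function f defined on a convex set, not part of the course notes.

```latex
% First order characterisation: a differentiable f is convex if and only if
\[
  f(y) \ge f(x) + \nabla f(x)^T (y - x) \qquad \text{for all } x,\, y.
\]
% Second order characterisation: a twice differentiable f is convex
% if and only if
\[
  \nabla^2 f(x) \succeq 0 \qquad \text{for all } x.
\]
```
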
Basic methods for free optimisation
Topics:
- Derivative free methods
- Gradient descent method
- Exact line search
- Armijo condition and backtracking (see the code sketch below)
Reading:
- N&W 9.5 (be aware of the unusually large number of misprints in this chapter)
- N&W 3.1
Slides and code:
- Lecture 4
- Nelder-Mead notebook
- Gradient Descent notebook
- NelderMead.py
- LineSearchMethods.py
- TMA4180_definitions.py

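As a concrete illustration of gradient descent with Armijo backtracking (cf. N&W 3.1), here is a minimal, self-contained editorial sketch. The Rosenbrock test function and all parameter values are illustrative assumptions; this is not the implementation from LineSearchMethods.py or TMA4180_definitions.py.

```python
import numpy as np

def rosenbrock(x):
    # Classical 2d test function with minimiser (1, 1).
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

def gradient_descent(f, grad, x0, c1=1e-4, rho=0.5, tol=1e-6, max_iter=10000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = -g                      # steepest descent direction
        alpha, fx = 1.0, f(x)
        # Backtrack until the Armijo (sufficient decrease) condition
        # f(x + alpha p) <= f(x) + c1 alpha grad(x)^T p holds.
        while f(x + alpha * p) > fx + c1 * alpha * (g @ p):
            alpha *= rho
        x = x + alpha * p
    return x

# Gradient descent zigzags and converges slowly in the Rosenbrock valley.
print(gradient_descent(rosenbrock, rosenbrock_grad, [-1.2, 1.0]))
```
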
Week 4
Topics:
- Convergence of backtracking
- Goldstein conditions
- Wolfe conditions
- Algorithmic approach to Wolfe conditions
- Convergence rate of the gradient descent method
- Newton's method
- Damped Newton method
- Regularisation of the damped Newton method (see the code sketch below)
Reading:
- N&W 3.1-3.3
Slides and code:
- Lecture 5
- Lecture 6
- Gradient Descent and Newton method
- LineSearchMethods.py
- TMA4180_definitions.py

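The damped Newton method with regularisation can be sketched as follows. This is an editorial illustration assuming a simple identity-shift regularisation, one of several Hessian modification strategies (cf. N&W 3.4); it is not the course implementation, and all parameters are invented.

```python
import numpy as np

def damped_newton(f, grad, hess, x0, c1=1e-4, rho=0.5, tol=1e-8, max_iter=100):
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H, tau = hess(x), 0.0
        # Regularisation: shift the Hessian by tau * I until it is
        # (numerically) positive definite, detected via Cholesky.
        while True:
            try:
                L = np.linalg.cholesky(H + tau * np.eye(n))
                break
            except np.linalg.LinAlgError:
                tau = max(2 * tau, 1e-3)
        # Newton step: solve (H + tau I) p = -g using the Cholesky factor.
        p = np.linalg.solve(L.T, np.linalg.solve(L, -g))
        alpha, fx = 1.0, f(x)       # damping: Armijo backtracking
        while f(x + alpha * p) > fx + c1 * alpha * (g @ p):
            alpha *= rho
        x = x + alpha * p
    return x
```
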
Week 5
Topics:
- Linear Conjugate Gradient method
- Fletcher–Reeves method (see the code sketch below)
- Polak–Ribière method
- Summary of free optimisation
Reading:
- N&W 5.1, 5.2
Slides and code:
- Lecture 7
- Gradient Descent for a quadratic problem
- Nonlinear CG methods
- LineSearchMethods.py
- TMA4180_definitions.py

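A minimal editorial sketch of nonlinear CG with the Fletcher–Reeves formula. Note that the theory (N&W 5.2) requires a strong Wolfe line search; the plain backtracking used here is an illustrative simplification, guarded by a restart whenever the CG direction fails to be a descent direction. The quadratic test problem is invented.

```python
import numpy as np

def cg_fletcher_reeves(f, grad, x0, c1=1e-4, rho=0.5, tol=1e-6, max_iter=5000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    p = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ p >= 0:
            p = -g                  # restart: p was not a descent direction
        alpha, fx = 1.0, f(x)
        while f(x + alpha * p) > fx + c1 * alpha * (g @ p):
            alpha *= rho
        x = x + alpha * p
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves formula
        p = -g_new + beta * p
        g = g_new
    return x

# Strongly convex quadratic f(x) = 0.5 x^T A x - b^T x, minimiser A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(cg_fletcher_reeves(f, grad, [0.0, 0.0]))   # approaches (0.8, -1.4)
```
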
Theory and basic methods for constrained optimisation
Week 6
Topics:
- Constrained optimisation over convex sets
- Feasible directions
- First order optimality conditions for convex sets
- Projections on convex sets
- Gradient projection method (see the code sketch below)
- Normal cones
- Conditions for linear constraints
- Lagrange multipliers
- Tangent cones for nonconvex sets
- Necessary conditions for nonconvex sets
- Linearised feasible directions
Reading:
- Note on optimisation with convex constraints
- N&W 12.1, 12.2
Slides and code:
- Lecture 9
- Lecture 10

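The gradient projection method admits a particularly short sketch on a box constraint, since the Euclidean projection onto a box is a componentwise clip. This is an editorial illustration with an invented quadratic objective and a fixed step size; it is not the course implementation.

```python
import numpy as np

def project_box(x, lo, up):
    # Euclidean projection onto the box {x : lo <= x <= up}.
    return np.clip(x, lo, up)

def projected_gradient(grad, x0, lo, up, step, tol=1e-10, max_iter=10000):
    x = project_box(np.asarray(x0, dtype=float), lo, up)
    for _ in range(max_iter):
        x_new = project_box(x - step * grad(x), lo, up)
        if np.linalg.norm(x_new - x) < tol:
            break                   # fixed point of the projected step
        x = x_new
    return x

# Example: min 0.5 x^T A x - b^T x over the box [0, 1]^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -2.0])
grad = lambda x: A @ x - b
step = 1.0 / np.linalg.norm(A, 2)   # 1/L, with L the gradient's Lipschitz constant
print(projected_gradient(grad, [0.5, 0.5], 0.0, 1.0, step))  # approaches (1/3, 0)
```

The stopping test uses the first order characterisation from this week: for a convex problem, x solves it if and only if x is a fixed point of the projected gradient step.
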
Week 7
Topics:
- Constraint qualifications
- LICQ
- Farkas' Lemma
- KKT conditions (see the summary sketch below)
- Second order necessary and sufficient conditions
Reading:
- N&W 12.1-12.4
Slides and code:
- Lecture 11
- Lecture 12

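For reference, the KKT conditions in the notation of N&W (Theorem 12.1), for the problem min f(x) subject to c_i(x) = 0 for i in E and c_i(x) >= 0 for i in I, with Lagrangian L(x, lambda) = f(x) - sum_i lambda_i c_i(x). This is an editorial summary, not a substitute for the lecture notes.

```latex
\begin{align*}
  \nabla_x \mathcal{L}(x^*, \lambda^*) &= 0, \\
  c_i(x^*) &= 0, \quad i \in \mathcal{E}, \\
  c_i(x^*) &\ge 0, \quad i \in \mathcal{I}, \\
  \lambda_i^* &\ge 0, \quad i \in \mathcal{I}, \\
  \lambda_i^* \, c_i(x^*) &= 0, \quad i \in \mathcal{E} \cup \mathcal{I}.
\end{align*}
```
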
Week 8
Topics:
- Slater's constraint qualification
- KKT conditions for convex problems
- Quadratic penalty method (see the code sketch below)
- Logarithmic barrier methods
- Augmented Lagrangian method
Reading:
- Note on optimisation with convex constraints, Section 4
- N&W 17.1, 17.3, 17.4, 19.1
Slides and code:
- Lecture 13
- Lecture 14

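A minimal editorial sketch of the quadratic penalty method for equality constraints (cf. N&W 17.1): minimise Q(x; mu) = f(x) + (mu/2) ||c(x)||^2 for an increasing sequence of penalty parameters mu, warm starting each inner solve. Using scipy's BFGS as a generic inner solver is an illustrative choice, and the test problem is invented; neither is taken from the course materials.

```python
import numpy as np
from scipy.optimize import minimize

def quadratic_penalty(f, c, x0, mu0=1.0, factor=10.0, outer_iter=6):
    x, mu = np.asarray(x0, dtype=float), mu0
    for _ in range(outer_iter):
        # Penalised objective for the current value of mu.
        Q = lambda x, mu=mu: f(x) + 0.5 * mu * np.sum(np.asarray(c(x))**2)
        x = minimize(Q, x, method="BFGS").x   # approximate inner minimiser
        mu *= factor                          # tighten the penalty
    return x

# Example: minimise x1 + x2 subject to x1^2 + x2^2 = 2; solution (-1, -1).
f = lambda x: x[0] + x[1]
c = lambda x: [x[0]**2 + x[1]**2 - 2.0]
print(quadratic_penalty(f, c, [0.0, 0.0]))    # approaches (-1, -1)
```

The constraint violation of the returned point is of order 1/mu, which is why mu must be driven towards infinity, at the price of increasingly ill-conditioned inner problems.
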
Advanced methods for free optimisation
Week 9
Topics:
- Quasi-Newton methods
- SR1 method
- DFP and BFGS methods (see the code sketch below)
- Trust region methods
Reading:
- N&W 6.1, 6.2 (see N&W 7.2 for limited memory methods)
- N&W 4.0
Slides and code:
- Lecture 15
- Lecture 16
- BFGS Method
- LineSearchMethods.py
- TMA4180_definitions.py

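A minimal editorial sketch of BFGS with the inverse-Hessian update (N&W, Section 6.1), combined with Armijo backtracking. A proper implementation would use a Wolfe line search to guarantee the curvature condition; here it is checked explicitly instead, and the update is skipped when it fails. Test function and parameters are illustrative assumptions.

```python
import numpy as np

def bfgs(f, grad, x0, c1=1e-4, bt=0.5, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    n = x.size
    I = np.eye(n)
    H = I                           # initial inverse Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                  # quasi-Newton direction
        alpha, fx = 1.0, f(x)
        while f(x + alpha * p) > fx + c1 * alpha * (g @ p):
            alpha *= bt
        s = alpha * p
        g_new = grad(x + s)
        y = g_new - g
        sy = s @ y
        if sy > 1e-10:              # curvature condition s^T y > 0
            rho = 1.0 / sy
            # BFGS update of the inverse Hessian approximation:
            # H <- (I - rho s y^T) H (I - rho y s^T) + rho s s^T
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x + s, g_new
    return x

rosenbrock = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(bfgs(rosenbrock, rosen_grad, [-1.2, 1.0]))   # approaches (1, 1)
```
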
Week 10
Topics:
- Exact solution of the trust region problem
- Cauchy point
- Convergence of trust region methods
- Dogleg method
- Nonlinear least squares problems
- Gauss–Newton method (see the code sketch below)
- Levenberg–Marquardt method
Reading:
- N&W 4.3, 4.1
- N&W 10.3
Slides and code:
- Lecture 17
- Trust Region Method
- LineSearchMethods.py
- TMA4180_definitions.py
- Lecture 18

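A minimal editorial sketch of the Gauss–Newton method for nonlinear least squares, min 0.5 ||r(x)||^2 (cf. N&W 10.3): each iteration solves the linearised least squares problem J(x) p ≈ -r(x) and takes the full step. The exponential fitting model and the synthetic, noise-free data are invented for illustration.

```python
import numpy as np

def gauss_newton(r, J, x0, tol=1e-10, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Gauss-Newton step: least squares solution of J(x) p = -r(x).
        p, *_ = np.linalg.lstsq(J(x), -r(x), rcond=None)
        x = x + p
        if np.linalg.norm(p) < tol:
            break
    return x

# Example: fit y = a * exp(b * t) to synthetic data with a = 2, b = -1.5.
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)
r = lambda x: x[0] * np.exp(x[1] * t) - y                     # residuals
J = lambda x: np.column_stack([np.exp(x[1] * t),              # d r / d a
                               x[0] * t * np.exp(x[1] * t)])  # d r / d b
print(gauss_newton(r, J, [1.0, 0.0]))   # approaches (2, -1.5)
```

The Levenberg–Marquardt method replaces the plain least squares solve by a regularised (trust region) version, which makes the iteration robust far from the solution.
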
Advanced methods for constrained optimisation
Topics:
- Linear constraints
- Basics of linear programming
Reading:
- N&W 13.0, 13.2

Week 11
Topics:
- Quadratic programming
- Active set method
- SQP method for equality constraints (see the code sketch below)
Reading:
- N&W 16.1, 16.5
- N&W 18.1, 18.3
Slides and code:
- Lecture 19
- Lecture 20

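The local SQP iteration for equality constraints (cf. N&W 18.1) is Newton's method applied to the KKT system: with W_k the Hessian of the Lagrangian and A_k the constraint Jacobian, each step solves a linear KKT system for the step p and the new multipliers. The sketch and its test problem are editorial illustrations with invented data; the Lagrangian convention L = f - lambda^T c follows N&W.

```python
import numpy as np

def sqp_equality(grad_f, hess_L, c, A, x0, lam0, tol=1e-10, max_iter=30):
    x, lam = np.asarray(x0, dtype=float), np.asarray(lam0, dtype=float)
    for _ in range(max_iter):
        g, W = grad_f(x), hess_L(x, lam)
        cx, Ax = np.atleast_1d(c(x)), np.atleast_2d(A(x))
        n, m = x.size, cx.size
        # Newton-KKT system:  [W  -A^T] [p      ]   [-grad f]
        #                     [A    0 ] [lam_new] = [-c     ]
        KKT = np.block([[W, -Ax.T], [Ax, np.zeros((m, m))]])
        sol = np.linalg.solve(KKT, np.concatenate([-g, -cx]))
        p, lam = sol[:n], sol[n:]
        x = x + p
        if np.linalg.norm(p) < tol:
            break
    return x, lam

# Example: min x1 + x2 s.t. x1^2 + x2^2 = 2; solution (-1, -1), lambda = -1/2.
grad_f = lambda x: np.array([1.0, 1.0])
c      = lambda x: np.array([x[0]**2 + x[1]**2 - 2.0])
A      = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]]])
hess_L = lambda x, lam: -2.0 * lam[0] * np.eye(2)   # f itself has zero Hessian
print(sqp_equality(grad_f, hess_L, c, A, [-1.5, -0.5], [1.0]))
```
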
Week 12
Topics:
- SQP method for inequality constraints
- Vector optimisation
- Partial orders
- Ordered vector spaces and cones
- Pareto optimal solutions
- Weighted sum method (see the summary sketch below)
Reading:
- N&W 18.1, 18.3
- Multicriteria optimisation (updated on April 04)
Slides and code:
- Lecture 21
- Lecture 22

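The weighted sum method can be summarised in one line; an editorial LaTeX sketch for objectives f_1, ..., f_k, not part of the course notes.

```latex
% Weighted sum scalarisation: for weights w_i > 0, solve
\[
  \min_x \; \sum_{i=1}^{k} w_i f_i(x).
\]
% Every minimiser of this problem is Pareto optimal; conversely, for convex
% problems every Pareto optimal point minimises the weighted sum for some
% w >= 0, w != 0.
```
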
Week 13 No lectures
Week 14 Easter holidays
Duality theory
Week 15
Topics:
- Lagrangian duality
- Primal and dual linear programmes
- Weak duality (see the summary sketch below)
- Saddle points
- Strong duality for convex problems
Reading:
- Lagrangian duality (revised on April 13)

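Weak duality follows in one line from the definitions; an editorial sketch, assuming the convention min f(x) subject to c(x) >= 0 with Lagrangian L(x, lambda) = f(x) - lambda^T c(x) and dual function q(lambda) = inf_x L(x, lambda), following the sign conventions of N&W.

```latex
% For every feasible x and every lambda >= 0,
\[
  q(\lambda) \;=\; \inf_{\bar{x}} L(\bar{x}, \lambda)
  \;\le\; L(x, \lambda)
  \;=\; f(x) - \lambda^T c(x)
  \;\le\; f(x),
\]
% since lambda >= 0 and c(x) >= 0 imply lambda^T c(x) >= 0.  Hence the
% optimal dual value never exceeds the optimal primal value.
```
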
Week 16
Topics:
- Dual projected gradient method
- Primal-dual interior point method for linear programming
- Central path (see the summary sketch below)
- Path following methods
Reading:
- Lagrangian duality
- N&W 14.1
Slides and code:
- Lecture 24
- Lecture 25

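For reference, the central path of a standard-form linear programme min c^T x subject to Ax = b, x >= 0, in the notation of N&W 14.1; an editorial summary.

```latex
% The central path consists of the points (x_tau, lambda_tau, s_tau),
% tau > 0, solving the perturbed KKT system
\begin{align*}
  A^T \lambda + s &= c, \\
  A x &= b, \\
  x_i s_i &= \tau, \quad i = 1, \dots, n, \\
  (x, s) &> 0.
\end{align*}
% Path following methods drive tau -> 0, recovering the KKT conditions
% of the linear programme in the limit.
```
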
Summary
Week 17
Topics:
- Summary (last lecture on Monday, April 24)
Slides and code:
- Summary

Week 18 Question session for the exam on Friday, May 05
2023-04-24, Markus Grasmair