Convex Optimisation: Fall 2019

Course information

Synopsis

The objective of this course is to introduce the basic ideas of convex analysis and to enable students to apply convex-analytic techniques to the solution of convex (and sometimes non-convex) optimisation problems.

References

Lecture Plan

Week 37
  Introduction
  Backtracking gradient descent
  Failure of gradient descent

Week 38
  Review of smooth optimisation
  Line search and trust region approaches
  Gradient descent
  Preconditioning of gradient descent
  Newton and quasi-Newton methods
  Reading: N&W, Chaps. 1–7

Week 39
  Convex sets
  Projections on convex sets
  Convex cones
  Polar and dual cones
  Reading: B&C, Chaps. 3.1, 3.2, 6.1, 6.3, 6.4

Week 40
  Tangent and normal cones
  Convex functions
  Infimal convolution
  Lipschitz continuity
  Reading: B&C, Chaps. 6.4, 8.1, 8.2, 8.3; Cl, Chap. 3

Week 41
  Convex conjugation
  Young's inequality
  Properties of convex conjugates
  Convex subdifferentials and subgradients
  Properties of subgradients
  Reading: B&C, Chaps. 13, 15.1, 16.1, 16.2, 16.3; Cl, Chaps. 5, 4

Week 42
  Characterisation of subdifferentials
  Subdifferentials of sums
  Linear chain rule
  Reading: Cl, Chaps. 4, 5; B&C, Chap. 16.4

Week 44
  Fenchel–Rockafellar Theorem
  Primal and dual problems
  Dual projected gradient descent
  Reading: Cl, Chap. 5; B&C, Chap. 19.1

Week 45
  Saddle point formulation of duality
  Proximal points
  Forward–backward splitting
  Reading: Cl, Chaps. 6.2, 7.1, 7.2; B&C, Chap. 28.5

Week 46
  Douglas–Rachford splitting
  Arrow–Hurwicz and Chambolle–Pock methods
  Reading: Cl, Chaps. 7.3, 7.4; B&C, Chap. 28.3

Week 47
  Lagrangian method and dual gradient descent
  Augmented Lagrangian methods
  ADMM
  Reading: ADMM, Secs. 1, 2, 7
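As a concrete taste of the material from the first weeks (backtracking gradient descent), here is a minimal sketch of gradient descent with an Armijo-type backtracking line search. The function names, parameter values, and example problem are illustrative and not taken from the course notes.

```python
import numpy as np

def backtracking_gradient_descent(f, grad, x0, alpha=0.25, beta=0.5,
                                  tol=1e-8, max_iter=5000):
    """Gradient descent with backtracking (Armijo) line search.

    At each iterate x, start with step size t = 1 and shrink t by the
    factor beta until the sufficient-decrease condition
        f(x - t*g) <= f(x) - alpha * t * ||g||^2
    holds for g = grad(x); then take the step x <- x - t*g.
    Stop once ||grad(x)|| < tol.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t = 1.0
        while f(x - t * g) > f(x) - alpha * t * np.dot(g, g):
            t *= beta
    # take the accepted step
        x = x - t * g
    return x

# Example: minimise an ill-conditioned quadratic f(x) = 0.5 x^T A x,
# whose unique minimiser is the origin.
A = np.diag([1.0, 50.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_star = backtracking_gradient_descent(f, grad, np.array([3.0, 1.0]))
```

On this quadratic the backtracking condition guarantees monotone decrease of f, but the poor conditioning of A (condition number 50) makes the convergence slow; this is exactly the kind of behaviour that motivates the preconditioning and Newton-type methods of week 38.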

Exercises

Here you can find exercises to try your luck on. Currently, the collection contains exercises related to the topics discussed in weeks 38–40; additional exercises will be added later. Do not expect solutions.

2019-11-20, Markus Grasmair