Pensum (Curriculum)
Final pensum for the course in spring 2019:
First of all, the pensum includes all the material, results, and approaches discussed in the lectures. Summaries can be found here.
Lecture notes:
- Minimisers of Optimisation Problems, in particular the results concerning the existence of minimisers.
- Basic Properties of Convex Functions, in particular, the different characterisations of convexity.
- Convergence of descent methods with backtracking, in particular, the main idea of the convergence proof for backtracking methods (a minimal code sketch follows this list).
- Optimisation with convex constraints, in particular, the necessary and sufficient conditions for optimisation with convex constraints, the idea of projections, and Slater's constraint qualification.
- Lagrangian duality, in particular, the notion of dual optimisation problems, weak duality, the notion of saddle points, Slater's constraint qualification, and strong duality for convex programmes.
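To illustrate the backtracking item above, here is a minimal Python sketch of gradient descent with Armijo backtracking (for illustration only, not part of the pensum; the parameters and the test function are arbitrary choices):

```python
import numpy as np

def backtracking_descent(f, grad, x0, alpha0=1.0, rho=0.5, c=1e-4,
                         tol=1e-8, max_iter=1000):
    """Gradient descent with Armijo backtracking: shrink the step until
    f(x + a*p) <= f(x) + c*a*grad(x)^T p holds for p = -grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = -g
        alpha = alpha0
        while f(x + alpha * p) > f(x) + c * alpha * (g @ p):
            alpha *= rho  # backtrack until sufficient decrease holds
        x = x + alpha * p
    return x

# Example: a convex quadratic with minimiser (1, -2).
f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
print(backtracking_descent(f, grad, [0.0, 0.0]))  # approx [1, -2]
```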
In addition, the pensum includes all the exercises.
The following chapters in Nocedal & Wright, Numerical Optimization, Second Edition, 2006 (the most important points are listed explicitly for each chapter):
- Basics of optimization: Chap. 1, 2.
- Most important notions/results:
- Notions of (strict) global/local solutions of optimisation problems.
- First-order necessary conditions.
- Second-order sufficient and necessary conditions (both stated after this list for reference).
- Basic ideas of trust-region and line-search methods.
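For reference, the optimality conditions for unconstrained smooth problems mentioned above (as in Chapter 2 of Nocedal & Wright):

```latex
% First-order necessary: if x* is a local minimiser and f is continuously
% differentiable near x*, then
\[ \nabla f(x^*) = 0. \]
% Second-order necessary: in addition, \nabla^2 f(x^*) is positive semidefinite.
% Second-order sufficient: if
\[ \nabla f(x^*) = 0 \quad\text{and}\quad \nabla^2 f(x^*) \text{ is positive definite}, \]
% then x* is a strict local minimiser.
```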
- Line Search Methods: Chap. 3.1-3.3.
- Most important notions/results:
- Definitions of the Wolfe conditions (a code sketch of the check follows this list).
- "Well-posedness" of Wolfe conditions (Lemma 3.1).
- Zoutendijk's result (Theorem 3.2).
- Idea of backtracking line search.
- Linear convergence of the gradient descent method, importance of the condition number of the Hessian.
- Quadratic convergence of Newton's method.
- Less important results:
- Superlinear convergence of Quasi-Newton methods.
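As a concrete illustration of the Wolfe conditions, a minimal Python check (for illustration only; the constants and the test problem are arbitrary choices):

```python
import numpy as np

def wolfe_conditions(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    """Check the (weak) Wolfe conditions for a step alpha along p:
    sufficient decrease: f(x + a*p) <= f(x) + c1*a*grad(x)^T p
    curvature:           grad(x + a*p)^T p >= c2*grad(x)^T p
    with 0 < c1 < c2 < 1."""
    gTp = grad(x) @ p
    armijo = f(x + alpha * p) <= f(x) + c1 * alpha * gTp
    curvature = grad(x + alpha * p) @ p >= c2 * gTp
    return armijo and curvature

# Example: f(x) = 0.5 ||x||^2 with the steepest-descent direction.
f = lambda x: 0.5 * x @ x
grad = lambda x: x
x = np.array([1.0, 2.0])
print(wolfe_conditions(f, grad, x, -grad(x), alpha=0.5))  # True
```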
- Trust-Region Methods: Chap. 4.1-4.3.
- Most important notions/results:
- Basic trust-region algorithm (Algorithm 4.1).
- Characterisation of trust-region steps (Theorem 4.1).
- Cauchy point and dogleg method (a code sketch follows this list).
- Proof of Theorem 4.1.
- Less important results:
- All the details about the convergence results in Section 4.2.
- Details of the numerical solution of the trust-region subproblem in 4.3.
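To make the Cauchy point and dogleg items concrete, a minimal Python sketch for the trust-region subproblem with a positive definite model Hessian (illustration only; names are hypothetical):

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Dogleg step for min_p g^T p + 0.5 p^T B p s.t. ||p|| <= delta,
    assuming B is positive definite."""
    pB = -np.linalg.solve(B, g)        # full (Newton) step
    if np.linalg.norm(pB) <= delta:
        return pB
    # Minimiser along the steepest-descent direction (unconstrained Cauchy point).
    pU = -(g @ g) / (g @ B @ g) * g
    if np.linalg.norm(pU) >= delta:
        return delta * pU / np.linalg.norm(pU)
    # Otherwise: intersect the segment from pU to pB with the boundary,
    # i.e. solve ||pU + tau*(pB - pU)|| = delta for tau in [0, 1].
    d = pB - pU
    a, b, c = d @ d, 2 * (pU @ d), pU @ pU - delta ** 2
    tau = (-b + np.sqrt(b ** 2 - 4 * a * c)) / (2 * a)
    return pU + tau * d

g = np.array([1.0, 1.0])
B = np.diag([1.0, 10.0])
print(dogleg_step(g, B, delta=0.5))
```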
- Conjugate Gradient Methods: Chap. 5.
- Most important notions/results:
- Idea of linear CG, conjugacy (a code sketch follows this list).
- Theorem 5.1.
- Linear CG method.
- Convergence rate in (5.36) and comparison to (3.29).
- Fletcher-Reeves and Polak-Ribière methods.
- Requirements on the line search (Lemma 5.6), convergence results (Theorems 5.7 and 5.8).
- Less important results:
- Theorems 5.2 and 5.3.
- Details of the convergence rates, in particular Theorems 5.4 and 5.5.
- Please ignore:
- Everything concerning preconditioning.
- Explicit forms of the algorithms (linear CG, different non-linear CG methods).
- Proofs of the convergence results.
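The explicit algorithm forms are excluded above, but as a pointer for the idea of conjugacy, here is a minimal Python sketch of linear CG for Ax = b with A symmetric positive definite (illustration only, not part of the pensum):

```python
import numpy as np

def linear_cg(A, b, x0, tol=1e-10):
    """Minimal linear CG for Ax = b, A symmetric positive definite.
    The search directions p_k are A-conjugate: p_i^T A p_j = 0 for i != j."""
    x = x0.copy()
    r = A @ x - b              # residual = gradient of 0.5 x^T A x - b^T x
    p = -r
    for _ in range(len(b)):    # at most n steps in exact arithmetic
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)   # exact minimiser along p
        x = x + alpha * p
        r_new = r + alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p        # new direction, conjugate to the previous ones
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(linear_cg(A, b, np.zeros(2)))
```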
- Quasi-Newton Methods: Chap. 6.1, 6.4.
- Most important notions/results:
- Idea of Quasi-Newton methods (the underlying secant equation is stated after this list).
- Necessity of the Wolfe conditions.
- Main convergence results (Theorems 6.5 and 6.6).
- Less important results:
- Proof of the convergence results.
- SR1 method.
- Please ignore:
- Definitions of the DFP and BFGS updates.
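Independently of the DFP/BFGS update formulas (excluded above), the core idea is the secant equation: the new Hessian approximation must map the last step onto the last change in gradient:

```latex
\[
  B_{k+1} s_k = y_k, \qquad
  s_k = x_{k+1} - x_k, \qquad
  y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
\]
```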
- Least-Squares Problems: Chap. 10.1-10.3.
- Most important notions/results:
- Linear least squares and the normal equations (10.14); a code sketch follows this list.
- Idea of the Gauss-Newton method and relation to trust-region methods (Levenberg-Marquardt method).
- Convergence of Gauss-Newton (Theorem 10.1).
- Less important results:
- Methods for the solution of linear least-squares problems (most of Section 10.2).
- Please ignore:
- Implementation details, convergence of Levenberg-Marquardt, methods for large residual problems.
- Proof of Theorem 10.1.
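A minimal Python sketch of the Gauss-Newton idea (illustration only; the toy model and names are arbitrary choices). Each step solves the linearised least-squares problem via the normal equations J^T J p = -J^T r:

```python
import numpy as np

def gauss_newton(r, J, x0, n_iter=20):
    """Gauss-Newton for min 0.5 ||r(x)||^2: at each iterate, linearise r
    and solve the normal equations J^T J p = -J^T r for the step p."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        Jx, rx = J(x), r(x)
        p = np.linalg.solve(Jx.T @ Jx, -Jx.T @ rx)
        x = x + p
    return x

# Toy problem: fit y = exp(a*t) to data with unknown a (true value 0.5).
t = np.array([0.0, 1.0, 2.0])
y = np.exp(0.5 * t)
r = lambda x: np.exp(x[0] * t) - y
J = lambda x: (t * np.exp(x[0] * t)).reshape(-1, 1)
print(gauss_newton(r, J, [0.0]))  # approx [0.5]
```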
- Constrained Optimization: Chap. 12.1-12.5, 12.9.
- Most important notions/results:
- Notions of local and global solutions.
- Active set.
- Lagrangian of a constrained optimisation problem.
- Tangent cone and cone of linearised feasible directions.
- Linear independence constraint qualification (LICQ).
- KKT conditions (stated after this list for reference).
- Farkas' Lemma.
- Second order sufficient and necessary conditions.
- Dual of a constrained optimisation problem.
- Weak duality.
- Less important results:
- Nothing, really.
- Please ignore:
- Projected Hessians.
- Wolfe dual.
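For reference, the KKT conditions for min f(x) subject to c_i(x) = 0 (i ∈ E) and c_i(x) ≥ 0 (i ∈ I), with Lagrangian L(x, λ) = f(x) − Σ_i λ_i c_i(x) (notation as in Chapter 12 of Nocedal & Wright):

```latex
\begin{aligned}
  \nabla_x \mathcal{L}(x^*, \lambda^*) &= 0, \\
  c_i(x^*) &= 0, && i \in \mathcal{E}, \\
  c_i(x^*) &\ge 0, && i \in \mathcal{I}, \\
  \lambda_i^* &\ge 0, && i \in \mathcal{I}, \\
  \lambda_i^* \, c_i(x^*) &= 0, && i \in \mathcal{E} \cup \mathcal{I}.
\end{aligned}
```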
- Linear Programming: Chap. 13.1-13.2, 14.1.
- Most important notions/results:
- Standard form of linear programmes (stated after this list for reference).
- Optimality conditions for linear programmes.
- Dual problems and strong duality (Theorem 13.1).
- Less important results:
- Idea of interior point methods.
- Primal-dual path-following and the central path.
- Long-step path-following (Algorithm 14.2).
- Convergence of Algorithm 14.2 (Theorem 14.4).
- Details of the proofs in Section 13.2.
- All the details necessary for the derivation of Theorem 14.4.
- Please ignore:
- The simplex method.
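For reference, the standard form of a linear programme and its dual (Chapter 13 notation):

```latex
\begin{aligned}
  \text{(primal)} \quad & \min_x \; c^T x
    && \text{s.t. } A x = b, \; x \ge 0, \\
  \text{(dual)}   \quad & \max_\lambda \; b^T \lambda
    && \text{s.t. } A^T \lambda \le c.
\end{aligned}
```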
- Quadratic Programming: Chap. 16.1, 16.4, 16.5, 16.7.
- Most important notions/results:
- KKT conditions for quadratic programmes (equations (16.5) and (16.37)).
- Differences between linear programmes and quadratic programmes (in particular Figures 16.1, 16.2).
- Idea of active set methods for quadratic programmes.
- Gradient projection method (a code sketch follows this list).
- Less important results:
- Subspace minimization.
- Please ignore:
- Implementation details for the active set method.
- Everything containing reduced Hessians.
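A minimal Python sketch of the gradient projection idea for a box-constrained quadratic programme (illustration only: a fixed step size stands in for the piecewise Cauchy-point search of Section 16.7):

```python
import numpy as np

def gradient_projection(G, c, l, u, x0, step, n_iter=500):
    """Projected gradient steps for min 0.5 x^T G x + c^T x, l <= x <= u.
    For box constraints the projection is a simple componentwise clip."""
    x = np.clip(np.asarray(x0, dtype=float), l, u)
    for _ in range(n_iter):
        g = G @ x + c                      # gradient of the quadratic
        x = np.clip(x - step * g, l, u)    # step, then project onto the box
    return x

G = np.diag([2.0, 2.0])
c = np.array([-2.0, -5.0])
l, u = np.zeros(2), np.ones(2)
print(gradient_projection(G, c, l, u, np.zeros(2), step=0.1))  # approx [1, 1]
```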
- Penalty Methods: Chap. 17.1-17.3.
- Most important notions/results:
- Quadratic penalty method and Framework 17.1 (a code sketch follows this list).
- Convergence of the quadratic penalty method (Theorem 17.1).
- Accuracy of the quadratic penalty method and asymptotic ill-conditioning (Theorem 17.2 and p. 505).
- Idea of non-smooth penalty functions.
- Augmented Lagrangian method (Framework 17.3).
- Exactness and convergence of the Augmented Lagrangian method (Theorems 17.5 and 17.6).
- Less important results:
- Well-conditioned reformulation of the quadratic penalty method (p. 506).
- All details in Section 17.2.
- Details of the proofs concerning the Augmented Lagrangian method.
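A minimal Python sketch of the quadratic penalty method in the spirit of Framework 17.1 (illustration only; scipy's general-purpose minimize stands in for the inner unconstrained solver, and the toy problem is an arbitrary choice with solution (-1, -1)):

```python
import numpy as np
from scipy.optimize import minimize

def quadratic_penalty(f, c, x0, mus=(1.0, 10.0, 100.0, 1000.0)):
    """Quadratic penalty method for min f(x) s.t. c(x) = 0:
    minimise Q(x; mu) = f(x) + (mu/2) ||c(x)||^2 for increasing mu,
    warm-starting each inner solve at the previous minimiser."""
    x = np.asarray(x0, dtype=float)
    for mu in mus:
        Q = lambda z, mu=mu: f(z) + 0.5 * mu * np.sum(c(z) ** 2)
        x = minimize(Q, x).x
    return x

# Toy problem: min x1 + x2  s.t.  x1^2 + x2^2 - 2 = 0.
f = lambda x: x[0] + x[1]
c = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 2.0])
print(quadratic_penalty(f, c, x0=[-1.5, -1.5]))  # approx [-1, -1]
```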
- Sequential Quadratic Programming: Chap. 18.1-18.3 (and 15.5).
- Most important notions/results:
- SQP for equality-constrained problems and Newton's method applied to the KKT conditions (the resulting linear system is stated after this list).
- Construction of the SQP-method for inequality-constrained problems.
- Merit functions for enforcing convergence.
- Properties of the ℓ1-merit function (mainly Theorem 18.2).
- Problems caused by the Maratos effect.
- Please ignore:
- Quasi-Newton methods for SQP.
- Reduced-Hessian Quasi-Newton methods.
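For the equality-constrained case min f(x) s.t. c(x) = 0, a Newton step on the KKT conditions amounts to solving a linear system of the following form (A_k denotes the constraint Jacobian; see the beginning of Chapter 18):

```latex
\[
  \begin{bmatrix}
    \nabla_{xx}^2 \mathcal{L}_k & -A_k^T \\
    A_k & 0
  \end{bmatrix}
  \begin{bmatrix}
    p_k \\ \lambda_{k+1}
  \end{bmatrix}
  =
  \begin{bmatrix}
    -\nabla f_k \\ -c_k
  \end{bmatrix}.
\]
```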
- Interior-Point Methods for Nonlinear Programming: Chap. 19.1, 19.6.
- Most important notions/results:
- Logarithmic barrier problem (equation (19.4); stated after this list for reference).
- Please ignore:
- Everything else.
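For reference, the logarithmic barrier problem of equation (19.4): the inequality constraints c_i(x) ≥ 0, i ∈ I, are replaced by slack variables s and a log barrier with parameter μ > 0:

```latex
\[
  \min_{x,\,s} \; f(x) - \mu \sum_{i \in \mathcal{I}} \log s_i
  \quad \text{s.t.} \quad
  c_{\mathcal{E}}(x) = 0, \qquad
  c_{\mathcal{I}}(x) - s = 0.
\]
```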
Note: Programming is not part of the exam; this part of the course has already been covered by the project.