Pensum (Curriculum)
Pensum for the spring 2023 edition of the course:
The core of the pensum consists of all the material, results, and approaches discussed in the lectures, as well as the exercises. Summaries of the lectures can be found here. The slides from the summary lecture on April 24 are here.
More detailed expositions of the material discussed in class can be found in the following lecture notes, as well as in the book by Nocedal & Wright, Numerical Optimization, Second Edition, 2006 (most important points in boldface):
Lecture notes:
- Note on unconstrained optimisation, in particular the results concerning the existence of minimisers, the necessary and sufficient optimality conditions, and the different characterisations of convexity.
- Note on optimisation with convex constraints, in particular the necessary and sufficient optimality conditions, the idea of projections, and Slater's constraint qualification.
- Note on multicriteria optimisation, in particular the notion of Pareto optimality and the weighted sum method (illustrated after this list).
- Note on Lagrangian duality, in particular the notion of dual optimisation problems, weak duality, the notion of saddle points, Slater's constraint qualification, and strong duality for convex programmes.
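To make the weighted sum method concrete: a multicriteria problem with objectives f_1, ..., f_k is scalarised into a single-objective problem over nonnegative weights (a sketch; the notation in the note may differ slightly):

```latex
\min_{x \in \mathbb{R}^n} \; \sum_{i=1}^{k} w_i \, f_i(x),
\qquad w_i \ge 0, \quad \sum_{i=1}^{k} w_i = 1.
```

Any minimiser for strictly positive weights is Pareto optimal; for convex problems, every Pareto optimal point can be obtained this way for some choice of weights w ≥ 0.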
Relevant chapters in Nocedal & Wright:
- Basics of optimization: Chap. 1, 2.
- Most important notions/results:
- Notions of (strict) global/local solutions of optimisation problems.
- First order necessary conditions.
- Second order sufficient and necessary conditions (summarised after this list).
- Basic idea of trust region/line search.
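For quick reference, the optimality conditions listed above take the following standard form for a twice continuously differentiable function f (a sketch of the statements in Chap. 2):

```latex
% First order necessary: if x^* is a local minimiser, then
\nabla f(x^*) = 0.
% Second order necessary: in addition,
\nabla^2 f(x^*) \succeq 0 \quad \text{(positive semidefinite)}.
% Second order sufficient: conversely, if
\nabla f(x^*) = 0 \quad \text{and} \quad \nabla^2 f(x^*) \succ 0 \text{ (positive definite)},
% then x^* is a strict local minimiser.
```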
- Line Search Methods: Chap. 3.1-3.3.
- Most important notions/results:
- Definitions of Wolfe conditions.
- "Well-posedness" of Wolfe conditions (Lemma 3.1).
- Zoutendijk's result (Theorem 3.2).
- Idea of backtracking line search (see the sketch after this list).
- Linear convergence of the gradient descent method, importance of the condition of the Hessian.
- Quadratic convergence of Newton's method.
- Less important results:
- Superlinear convergence of Quasi-Newton methods.
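As an illustration of backtracking line search and the linear convergence of gradient descent, here is a minimal sketch in Python (NumPy only; the function names are illustrative and not from the book). The backtracking loop enforces the sufficient decrease (Armijo) condition, which is the first of the two Wolfe conditions:

```python
import numpy as np

def backtracking_line_search(f, grad, x, p, alpha0=1.0, rho=0.5, c1=1e-4):
    """Shrink alpha until the sufficient decrease (Armijo) condition
    f(x + alpha*p) <= f(x) + c1*alpha*grad(x)^T p holds."""
    alpha = alpha0
    fx = f(x)
    slope = grad(x) @ p        # directional derivative; negative for a descent direction
    while f(x + alpha * p) > fx + c1 * alpha * slope:
        alpha *= rho
    return alpha

def gradient_descent(f, grad, x0, tol=1e-8, max_iter=10_000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = -g                                  # steepest descent direction
        x = x + backtracking_line_search(f, grad, x, p) * p
    return x

# Example: an ill-conditioned quadratic; the condition number of the Hessian
# (here 100) governs the linear rate, cf. (3.29).
A = np.diag([1.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
print(gradient_descent(f, grad, [1.0, 1.0]))    # converges (slowly) to the origin
```

On this example the large condition number of the Hessian makes the linear convergence visibly slow, which is exactly the phenomenon behind the rate estimate (3.29).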
- Trust-Region Methods: Chap. 4.1-4.3.
- Most important notions/results:
- Basic trust-region algorithm (Algorithm 4.1).
- Characterisation of trust-region steps (Theorem 4.1).
- Cauchy point and dogleg method (see the sketch after this list).
- Proof of Theorem 4.1.
- Less important results:
- All the details about the convergence results in Section 4.2.
- Details of the numerical solution of the trust-region subproblem in 4.3.
- Two-dimensional subspace minimization.
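To make the dogleg idea concrete, here is a minimal Python sketch of a single dogleg step (a hypothetical helper, assuming B is positive definite; in practice this step would be embedded in the trust-region loop of Algorithm 4.1 with the usual ratio test):

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Approximate minimiser of m(p) = g^T p + 0.5 p^T B p over ||p|| <= delta
    along the dogleg path, assuming B is positive definite."""
    pB = -np.linalg.solve(B, g)                  # full (Newton) step
    if np.linalg.norm(pB) <= delta:
        return pB
    pU = -(g @ g) / (g @ B @ g) * g              # unconstrained minimiser along -g
    if np.linalg.norm(pU) >= delta:
        return delta * pU / np.linalg.norm(pU)   # scaled steepest-descent step
    # Otherwise intersect the segment pU + tau*(pB - pU) with the boundary:
    d = pB - pU
    a, b, c = d @ d, 2 * pU @ d, pU @ pU - delta**2
    tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return pU + tau * d
```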
- Conjugate Gradient Methods: Chap. 5.
- Most important notions/results:
- Idea of linear CG, conjugacy.
- Theorem 5.1.
- Linear CG method (a sketch follows this list).
- Convergence rate in (5.36) and comparison to (3.29).
- Fletcher-Reeves and Polak-Ribière methods.
- Requirements for the line-search (Lemma 5.6), convergence results (Theorems 5.7 and 5.8).
- Less important results:
- Theorems 5.2 and 5.3.
- Details of the convergence rates, in particular Theorems 5.4 and 5.5.
- Please ignore:
- Everything concerning preconditioning.
- Explicit forms of the linear CG method.
- Proofs of the convergence results.
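A minimal Python sketch of the linear CG method (intended to mirror Algorithm 5.2; variable names follow the usual notation, and no preconditioning is included, in line with the list above):

```python
import numpy as np

def linear_cg(A, b, x0=None, tol=1e-10, max_iter=None):
    """Linear conjugate gradient for Ax = b with A symmetric positive definite."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = A @ x - b                   # residual = gradient of the quadratic
    p = -r
    for _ in range(max_iter or n):
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)  # exact minimiser along p
        x = x + alpha * p
        r_new = r + alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p       # new direction, A-conjugate to the previous ones
        r = r_new
    return x
```

In exact arithmetic the method terminates after at most n steps (Theorem 5.1), and the convergence rate depends on the eigenvalue distribution of A, cf. (5.36).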
- Quasi-Newton Methods: Chap. 6.1, 6.2, 6.4.
- Most important notions/results:
- Idea of Quasi-Newton methods (see the sketch after this list).
- Idea of SR1 method.
- Necessity of Wolfe conditions for BFGS and DFP.
- Main convergence results (Theorems 6.5 and 6.6).
- Less important results:
- Proof of the convergence results.
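A minimal Python sketch of BFGS with the inverse-Hessian update. One hedge: a plain backtracking search is used here for brevity, whereas the convergence theory requires the Wolfe conditions to guarantee the curvature condition s^T y > 0; the code simply skips the update when that condition fails.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """Sketch of BFGS with the inverse-Hessian update H."""
    n = len(x0)
    x, H = np.asarray(x0, dtype=float), np.eye(n)
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g
        # Backtracking to sufficient decrease (illustrative; Wolfe is the theory's choice).
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * p) > f(x) + c1 * alpha * (g @ p):
            alpha *= 0.5
        x_new = x + alpha * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:                        # curvature condition s^T y > 0
            rho = 1.0 / (s @ y)
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: Rosenbrock function; the minimiser is (1, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(bfgs(f, grad, np.array([-1.2, 1.0])))
```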
- Least-Squares Problems: Chap. 10.1-10.3.
- Most important notions/results:
- Linear least squares and normal equations (10.14).
- Idea of the Gauss-Newton method and relation to trust-region methods (Levenberg-Marquardt method); a sketch of Gauss-Newton follows this list.
- Less important results:
- Convergence of Gauss-Newton (Theorem 10.1).
- Please ignore:
- Implementation details, convergence of Levenberg-Marquardt, methods for large residual problems.
- Proof of Theorem 10.1.
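A minimal Gauss-Newton sketch in Python (full steps, no line search or trust region, so this is the purely local method; the data in the example are made up for illustration):

```python
import numpy as np

def gauss_newton(residual, jac, x0, tol=1e-10, max_iter=50):
    """Gauss-Newton for min 0.5*||r(x)||^2: each step solves the linear
    least-squares problem min ||J p + r||, i.e. the normal equations
    J^T J p = -J^T r (cf. (10.14) for the linear case)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r, J = residual(x), jac(x)
        # lstsq is the numerically preferable way to solve the normal equations.
        p, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + p
        if np.linalg.norm(p) < tol:
            break
    return x

# Example: fit y = exp(t * x) to synthetic data generated with x = 0.7.
t = np.linspace(0, 1, 20)
y = np.exp(0.7 * t)
residual = lambda x: np.exp(t * x[0]) - y
jac = lambda x: (t * np.exp(t * x[0])).reshape(-1, 1)
print(gauss_newton(residual, jac, np.array([0.0])))   # should approach 0.7
```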
- Constrained Optimization: Chap. 12.1-12.5, 12.9.
- Most important notions/results:
- Notions of local and global solutions.
- Active set.
- Lagrangian of a constrained optimisation problem.
- Tangent cone and cone of linearised feasible directions.
- Linear independence constraint qualification (LICQ).
- KKT-conditions (summarised after this list).
- Farkas' Lemma.
- Second order sufficient and necessary conditions.
- Dual of a constrained optimisation problem, weak duality.
- Less important results:
- Nothing, really.
- Please ignore:
- Projected Hessians.
- Wolfe dual.
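As a compact reminder of the KKT-conditions: for a problem min f(x) subject to c_i(x) = 0 for i in E and c_i(x) >= 0 for i in I, they read as follows (a sketch of the statement in Chap. 12; see the book for the precise assumptions, in particular LICQ):

```latex
\begin{aligned}
&\nabla_x \mathcal{L}(x^*,\lambda^*) = \nabla f(x^*)
   - \sum_{i \in \mathcal{E} \cup \mathcal{I}} \lambda_i^* \,\nabla c_i(x^*) = 0, \\
&c_i(x^*) = 0 \quad (i \in \mathcal{E}), \qquad
 c_i(x^*) \ge 0 \quad (i \in \mathcal{I}), \\
&\lambda_i^* \ge 0 \quad (i \in \mathcal{I}), \qquad
 \lambda_i^* \, c_i(x^*) = 0 \quad (i \in \mathcal{E} \cup \mathcal{I}).
\end{aligned}
```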
- Linear Programming: Chap. 13.1-13.2, 14.1.
- Most important notions/results:
- Standard form of linear programmes.
- Optimality conditions for linear programmes, dual problems, and strong duality (Theorem 13.1); a small numerical check follows this list.
- Less important results:
- Idea of interior point methods.
- Primal-dual path-following and the central path.
- Long-step path-following (Algorithm 14.2).
- Convergence of Algorithm 14.2 (Theorem 14.4).
- Details of the proofs in Section 13.2.
- All the details necessary for the derivation of Theorem 14.4.
- Please ignore:
- The simplex method.
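A small sanity check of strong duality for linear programmes, sketched in Python. Note that SciPy is not part of the course material; its general-purpose LP solver is used here purely for illustration:

```python
import numpy as np
from scipy.optimize import linprog

# Primal in standard form: min c^T x  s.t.  A x = b, x >= 0.
c = np.array([1.0, 2.0, 0.0])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])
primal = linprog(c, A_eq=A, b_eq=b, bounds=[(0, None)] * 3)

# Dual: max b^T lam  s.t.  A^T lam <= c  (linprog minimises, so negate b).
dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=[(None, None)])

# Strong duality (Theorem 13.1): the optimal values coincide.
print(primal.fun, -dual.fun)
```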
- Quadratic Programming: Chap. 16.1, 16.4, 16.5.
- Most important notions/results:
- KKT-conditions for quadratic programmes (equations (16.5) and (16.37)); a sketch for the equality-constrained case follows this list.
- Differences between linear programmes and quadratic programmes (in particular Figures 16.1, 16.2).
- Idea of active set methods for quadratic programmes.
- Please ignore:
- Subspace minimization.
- Gradient projection method.
- Implementation details for the active set method.
- Everything containing reduced Hessians.
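For an equality-constrained QP, the KKT-conditions form a linear system that can be solved directly; a minimal Python sketch (sign conventions for the multipliers may differ from (16.5)):

```python
import numpy as np

def eq_qp(G, c, A, b):
    """Solve min 0.5 x^T G x + c^T x  s.t.  A x = b by assembling the KKT
    system, assuming the KKT matrix is nonsingular."""
    n, m = G.shape[0], A.shape[0]
    K = np.block([[G, A.T],
                  [A, np.zeros((m, m))]])
    rhs = np.concatenate([-c, b])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]        # primal solution x and multipliers

# Example: min x1^2 + x2^2  s.t.  x1 + x2 = 1  ->  x = (0.5, 0.5).
G = 2 * np.eye(2)
c = np.zeros(2)
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x, lam = eq_qp(G, c, A, b)
print(x, lam)
```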
- Penalty Methods: Chap. 17.1, 17.3.
- Most important notions/results:
- Quadratic penalty method and Framework 17.1 (see the sketch after this list).
- Convergence of the quadratic penalty method (Theorem 17.1).
- Accuracy of the quadratic penalty method and asymptotic ill-conditioning (Theorem 17.2 and p. 505).
- Augmented Lagrangian method (Framework 17.3).
- Exactness and convergence of the Augmented Lagrangian method (Theorems 17.5 and 17.6).
- Less important results:
- Well-conditioned reformulation of the quadratic penalty method (p. 506).
- Details of the proofs concerning the Augmented Lagrangian method.
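A minimal Python sketch of the quadratic penalty method in the spirit of Framework 17.1. SciPy's general-purpose minimiser serves as the inner solver purely for illustration; it is not part of the course material:

```python
import numpy as np
from scipy.optimize import minimize

def quadratic_penalty(f, c, x0, mu0=1.0, factor=10.0, n_outer=8):
    """Approximately minimise Q(x; mu) = f(x) + (mu/2)*||c(x)||^2 for an
    increasing sequence of penalty parameters mu, warm-starting each
    inner solve from the previous iterate."""
    x, mu = np.asarray(x0, dtype=float), mu0
    for _ in range(n_outer):
        Q = lambda z, mu=mu: f(z) + 0.5 * mu * np.sum(c(z) ** 2)
        x = minimize(Q, x).x
        mu *= factor    # ill-conditioning of Q grows with mu, cf. Theorem 17.2
    return x

# Example: min x1 + x2  s.t.  x1^2 + x2^2 = 2; the solution is (-1, -1).
f = lambda x: x[0] + x[1]
c = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 2.0])
print(quadratic_penalty(f, c, np.array([0.5, 0.5])))
```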
- Sequential Quadratic Programming: Chap. 18.1-18.3 (and 15.5).
- Most important notions/results:
- SQP for equality-constrained problems and Newton's method for the KKT-conditions (see the sketch after this list).
- Construction of the SQP-method for inequality-constrained problems.
- Merit functions for enforcing convergence.
- Properties of the l1-merit function (mainly Theorem 18.2).
- Problems caused by the Maratos effect.
- Please ignore:
- Quasi-Newton methods for SQP.
- Reduced-Hessian Quasi-Newton methods.
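A minimal Python sketch of local SQP for equality constraints, where each iteration is one Newton step on the KKT-conditions of the Lagrangian L(x, lam) = f(x) - lam^T c(x). There is no line search or merit function here, so only local convergence is illustrated, and the sign conventions for the multipliers may differ from the book:

```python
import numpy as np

def sqp_equality(grad_f, hess_L, c, jac_c, x0, lam0, n_iter=20):
    """Local SQP for min f(x) s.t. c(x) = 0: Newton's method on the KKT system."""
    x, lam = np.asarray(x0, dtype=float), np.asarray(lam0, dtype=float)
    n, m = len(x), len(lam)
    for _ in range(n_iter):
        A = jac_c(x)                               # m x n constraint Jacobian
        K = np.block([[hess_L(x, lam), -A.T],
                      [A, np.zeros((m, m))]])
        rhs = -np.concatenate([grad_f(x) - A.T @ lam, c(x)])
        step = np.linalg.solve(K, rhs)
        x, lam = x + step[:n], lam + step[n:]
    return x, lam

# Example: min x1 + x2  s.t.  x1^2 + x2^2 = 2  ->  x = (-1, -1), lam = -0.5.
grad_f = lambda x: np.array([1.0, 1.0])
c = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 2.0])
jac_c = lambda x: np.array([[2 * x[0], 2 * x[1]]])
hess_L = lambda x, lam: -2.0 * lam[0] * np.eye(2)  # Hessian of L in x
print(sqp_equality(grad_f, hess_L, c, jac_c, [-1.5, -0.5], [-1.0]))
```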
- Interior-Point Methods for Nonlinear Programming: Chap. 19.1, 19.6.
- Most important notions/results:
- Logarithmic barrier problem (equation (19.4)); a sketch follows this list.
- Please ignore:
- Everything else.
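For inequality constraints c_i(x) >= 0, the logarithmic barrier idea replaces them with a barrier term that blows up at the boundary of the feasible region; schematically (a sketch, equivalent up to the slack-variable formulation used in (19.4)):

```latex
\min_{x} \; f(x) - \mu \sum_{i \in \mathcal{I}} \log c_i(x),
\qquad \mu \downarrow 0,
```

with the minimisers tracing out a central path that approaches a solution of the original problem as mu decreases to 0.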
Note: Programming is not part of the exam; this part of the course has already been covered by the project.