====== Lecture plan ======

^ Date ^ Topics ^ Reading ^ Slides and code ^
^ **Introduction, notion and existence of solutions** ||||
^ Week 2 | Overview \\ Formulation of a minimisation problem \\ Definitions of minima \\ Lower semi-continuity and coercivity \\ Existence theorems for minima \\ First order necessary conditions \\ Second order necessary and sufficient conditions | N&W, Chapters 1, 2.1 \\ \\ {{ :tma4180:2017v:note1.pdf |Minimisers of Optimisation Problems}} | {{ :tma4180:2019v:introduction.pdf |Introduction}} \\ \\ {{ :tma4180:2019v:definitions.pdf |Definitions}} |
^ Week 3 | Convex functions \\ First and second order characterisations | {{ :tma4180:2017v:note2.pdf |Basic Properties of Convex Functions}} | {{ :tma4180:2019v:optimalityconditions.pdf |Optimality conditions}} |
^ **Line search methods for unconstrained optimisation** ||||
^ | Gradient descent method (see first sketch below) \\ Exact line search methods \\ Armijo condition \\ Wolfe conditions | N&W, Chapter 3.1 \\ {{ :tma4180:2018v:note03.pdf |Convergence of descent methods with backtracking}} | {{ :tma4180:2019v:convexity.pdf |Convex functions}} \\ {{ :tma4180:2019v:undamped_graddesc.py |}} \\ {{ :tma4180:2019v:graddesc_backtracking.py |}} |
^ Week 4 | Exact line search \\ Goldstein conditions \\ (strong and weak) Wolfe conditions \\ Algorithmic approach to Wolfe conditions \\ Convergence rate of the gradient descent method \\ Newton's method \\ Damped Newton method \\ Regularisation of the damped Newton method | N&W, Chapter 3.1 \\ {{ :tma4180:2018v:note03.pdf |Convergence of descent methods with backtracking}} \\ \\ N&W, Chapter 3.3 \\ \\ \\ \\ N&W, Chapter 3.5 | {{ :tma4180:2019v:gradientdescent.pdf |Backtracking gradient descent}} \\ \\ \\ {{ :tma4180:2019v:linesearches_v2.pdf |Line searches}} |
^ Week 5 | Linear Conjugate Gradient method (see sketch below) \\ Fletcher--Reeves method \\ Polak--Ribière method \\ Zoutendijk's result \\ Summary | N&W, Chapter 5 \\ \\ \\ N&W, Chapter 3.2 | {{ :tma4180:2019v:convergencerates.pdf |Gradient descent and Newton}} \\ \\ {{ :tma4180:2019v:cg.pdf |CG methods}} |
^ **Theory and basic methods for constrained optimisation** ||||
^ Week 6 | Constrained optimisation over convex sets \\ Feasible directions \\ First order optimality conditions for convex sets \\ Projections on convex sets \\ Gradient projection method (see sketch below) \\ Normal cones \\ Conditions for linear constraints \\ Tangent cones for nonconvex sets \\ Necessary conditions for non-convex sets \\ Linearised feasible directions | {{ :tma4180:2019v:convexopt.pdf |Optimisation with convex constraints}} \\ \\ \\ \\ \\ \\ \\ N&W, Chapters 12.1--12.4((12.2 is particularly relevant here.)) | {{ :tma4180:2019v:cg_nonlinear.pdf |Nonlinear CG methods}} \\ \\ \\ \\ \\ \\ {{ :tma4180:2019v:convex_constraints.pdf |Optimisation with convex constraints}} |
^ Week 7 | Constraint qualifications \\ LICQ \\ Farkas' Lemma \\ KKT conditions \\ Second order necessary and sufficient conditions | N&W, Chapters 12.2--12.5 | {{ :tma4180:2019v:linear_and_nonconvex_constraints.pdf |Linear and non-convex constraints}} \\ \\ {{ :tma4180:2019v:licq.pdf |Constraint qualifications}} |
^ Week 8 | Quadratic penalty method (see sketch below) \\ Nonsmooth penalty methods \\ Augmented Lagrangian method | N&W, Chapters 17.1--17.3 | |
^ Week 9 | Penalty methods for inequality constraints \\ Logarithmic barrier methods | N&W, Chapters 17.1--17.2, 17.4 \\ N&W, Chapter 19.1 | {{ :tma4180:2019v:kkt_qp_alm.pdf |KKT conditions and more}} \\ \\ {{ :tma4180:2019v:barrier_methods.pdf |Barrier methods}} |
^ **Advanced methods for unconstrained optimisation** ||||
^ | Quasi-Newton methods \\ SR1 method \\ DFP and BFGS methods | N&W, Chapters 6.1, 6.2, 6.4 | |
^ Week 10 | BFGS method (see sketch below) \\ Properties of Quasi-Newton methods \\ Limited memory BFGS \\ Trust region methods \\ Exact solution of the trust region problem | N&W, Chapters 6.1, 6.2, 6.4, 7.2 \\ \\ \\ N&W, Chapters 4.1--4.3 | {{ :tma4180:2019v:sr1_dfp.pdf |SR1 and DFP methods}} \\ \\ \\ {{ :tma4180:2019v:quasinewton.pdf |Quasi-Newton methods}} |
^ Week 11 | Cauchy point \\ Convergence of trust region methods \\ Dogleg method \\ Non-linear least squares problems \\ Gauß--Newton method \\ Levenberg--Marquardt method | N&W, Chapters 4.1, 4.2 \\ \\ \\ N&W, Chapters 10.1, 10.3 | {{ :tma4180:2019v:trustregion.pdf |Trust region methods}} \\ \\ \\ {{ :tma4180:2019v:trustregion2.pdf |Trust region methods, II}} |
^ **Advanced methods for constrained optimisation** ||||
^ | Lagrangian duality \\ Primal and dual linear programmes \\ Weak duality | N&W, Chapter 12.9 | |
^ Week 12 | Strong duality for convex problems \\ Saddle point properties \\ Dual projected gradient method \\ Primal-dual interior point method for linear programming \\ Central path \\ Path following methods | N&W, Chapter 12.9 \\ {{ :tma4180:2019v:lagrangian_duality.pdf |Lagrangian Duality}} \\ \\ \\ N&W, Chapters 13.1, 14.1 | {{ :tma4180:2019v:nonlinleastsquares_duality.pdf |Non-linear least squares and Lagrangian duality}} \\ \\ \\ {{ :tma4180:2019v:duality.pdf |Lagrangian duality}} |
^ Week 13 | Quadratic programming \\ Direct solution of equality constrained problems \\ Active set methods \\ SQP method (equality constraints) \\ Newton's method for the KKT conditions \\ Merit functions \\ Maratos effect | N&W, Chapters 16.1, 16.2, 16.5, 16.7 \\ \\ \\ N&W, Chapters 18.1, 18.3, 15.4, 15.5 | {{ :tma4180:2019v:interior_point_methods.pdf |Interior point methods}} \\ \\ \\ {{ :tma4180:2019v:quadratic_programming.pdf |Quadratic programming}} |
^ Week 14 | SQP method (inequality constraints) | N&W, Chapters 18.1, 18.3 | {{ :tma4180:2019v:sqp.pdf |SQP method}} |
^ **Summary** ||||
^ | Thursday: Summary of theory \\ Friday: Summary of numerical methods \\ Questions | | {{ :tma4180:2019v:summary.pdf |Summary}} |
^ Week 15 | Primal-dual methods for convex problems \\ Questions | {{ :tma4180:2019v:primal_dual_ell1.pdf |Some poor, handwritten notes}} | |
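
===== Illustrative code sketches =====

The following minimal Python sketches illustrate a few of the methods in the plan above. They complement, but do not replace, the linked slides and code files; all test problems, parameter values and stopping rules are illustrative choices rather than the ones used in the lectures.

First, a sketch of gradient descent with Armijo backtracking (Weeks 3--4), along the lines of the linked file {{ :tma4180:2019v:graddesc_backtracking.py |}}. The quadratic test problem and the parameters ''rho'' and ''c1'' are assumptions for illustration.

<code python>
import numpy as np

def backtracking(f, grad_fk, xk, pk, alpha0=1.0, rho=0.5, c1=1e-4):
    """Shrink the step length until the Armijo (sufficient decrease)
    condition f(x + a*p) <= f(x) + c1*a*grad_f(x)^T p holds."""
    alpha = alpha0
    fk = f(xk)
    slope = grad_fk @ pk               # directional derivative, negative for descent
    while f(xk + alpha * pk) > fk + c1 * alpha * slope:
        alpha *= rho
    return alpha

def gradient_descent(f, grad, x0, tol=1e-8, maxit=1000):
    xk = np.asarray(x0, dtype=float)
    for _ in range(maxit):
        gk = grad(xk)
        if np.linalg.norm(gk) < tol:   # first order necessary condition
            break
        pk = -gk                       # steepest descent direction
        xk = xk + backtracking(f, gk, xk, pk) * pk
    return xk

# Illustrative strictly convex quadratic: f(x) = 1/2 x^T A x - b^T x,
# whose unique minimiser solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(gradient_descent(f, grad, np.zeros(2)))   # approx. [0.2, 0.4]
</code>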
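
A sketch of the linear conjugate gradient method (Week 5) for A x = b with A symmetric positive definite, following the standard formulation in N&W, Chapter 5; the small test matrix is an illustrative choice.

<code python>
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10):
    """Minimise 1/2 x^T A x - b^T x, i.e. solve A x = b, for SPD A."""
    x = np.zeros_like(b) if x0 is None else np.asarray(x0, dtype=float)
    r = A @ x - b                      # residual = gradient of the quadratic
    p = -r                             # initial search direction
    rs = r @ r
    for _ in range(b.size):            # at most n steps in exact arithmetic
        Ap = A @ p
        alpha = rs / (p @ Ap)          # exact line search along p
        x = x + alpha * p
        r = r + alpha * Ap
        rs_next = r @ r
        if np.sqrt(rs_next) < tol:
            break
        p = -r + (rs_next / rs) * p    # new A-conjugate direction
        rs = rs_next
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))        # compare with np.linalg.solve(A, b)
</code>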
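
A sketch of the gradient projection method (Week 6), here with the explicitly computable projection onto a closed ball; the fixed step length and the stopping rule based on the fixed point property are illustrative assumptions.

<code python>
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto the convex set {x : ||x|| <= radius}."""
    nx = np.linalg.norm(x)
    return x if nx <= radius else (radius / nx) * x

def projected_gradient(grad, x0, alpha=0.1, tol=1e-8, maxit=10000):
    xk = np.asarray(x0, dtype=float)
    for _ in range(maxit):
        xnext = project_ball(xk - alpha * grad(xk))
        if np.linalg.norm(xnext - xk) < tol:   # fixed point <=> stationarity
            return xnext
        xk = xnext
    return xk

# Minimise ||x - c||^2 over the unit ball with c outside the ball;
# the solution is the radial projection c / ||c||.
c = np.array([2.0, 1.0])
grad = lambda x: 2.0 * (x - c)
print(projected_gradient(grad, np.zeros(2)))   # approx. c / np.linalg.norm(c)
</code>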
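
A sketch of the quadratic penalty method (Week 8): the equality constrained problem min f(x) s.t. c(x) = 0 is replaced by a sequence of unconstrained problems min_x f(x) + (mu/2) ||c(x)||^2 with mu -> infinity. The test problem is the classic example from N&W, Chapter 17, with solution (-1, -1); using ''scipy.optimize.minimize'' as inner solver and the tenfold increase of ''mu'' are illustrative choices.

<code python>
import numpy as np
from scipy.optimize import minimize

def quadratic_penalty(f, c, x0, mu0=1.0, factor=10.0, outer_iters=8):
    xk, mu = np.asarray(x0, dtype=float), mu0
    for _ in range(outer_iters):
        Q = lambda x, mu=mu: f(x) + 0.5 * mu * np.sum(c(x) ** 2)
        xk = minimize(Q, xk).x         # warm start the next subproblem
        mu *= factor                   # subproblems grow ill-conditioned as mu grows
    return xk

# Example: min x1 + x2  s.t.  x1^2 + x2^2 - 2 = 0,  solution (-1, -1).
f = lambda x: x[0] + x[1]
c = lambda x: np.array([x[0]**2 + x[1]**2 - 2.0])
print(quadratic_penalty(f, c, np.array([0.5, 0.5])))
</code>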
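
Finally, a sketch of the BFGS method (Week 10) with the inverse Hessian update H+ = (I - rho*s*y^T) H (I - rho*y*s^T) + rho*s*s^T, rho = 1/(y^T s). For brevity it reuses simple Armijo backtracking and skips the update when the curvature condition y^T s > 0 fails; the lectures instead use a Wolfe line search, which guarantees that condition. The Rosenbrock test problem is a standard illustrative choice.

<code python>
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, maxit=200):
    xk = np.asarray(x0, dtype=float)
    I = np.eye(xk.size)
    Hk = I                             # initial inverse Hessian approximation
    gk = grad(xk)
    for _ in range(maxit):
        if np.linalg.norm(gk) < tol:
            break
        pk = -Hk @ gk                  # quasi-Newton direction
        alpha, fk = 1.0, f(xk)
        while f(xk + alpha * pk) > fk + 1e-4 * alpha * (gk @ pk):
            alpha *= 0.5               # simple Armijo backtracking
        xnext = xk + alpha * pk
        gnext = grad(xnext)
        s, y = xnext - xk, gnext - gk
        if y @ s > 1e-12:              # curvature condition; skip update otherwise
            rho = 1.0 / (y @ s)
            Hk = ((I - rho * np.outer(s, y)) @ Hk @ (I - rho * np.outer(y, s))
                  + rho * np.outer(s, s))
        xk, gk = xnext, gnext
    return xk

# Rosenbrock function, minimiser (1, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(bfgs(f, grad, np.array([-1.2, 1.0])))
</code>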