Modern Techniques of Large Scale Optimization for Data Science

Jacek Gondzio

Lecture 1: Interior Point Methods (IPMs) for LP

- Motivation, logarithmic barrier function, central path, neighbourhoods,

- path-following method, convergence proof, complexity of the algorithm,

- practical implementation issues.
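The central objects named above can be sketched as follows (a standard-form LP is assumed here; notation is illustrative, not taken from the slides). Replacing the nonnegativity constraints x >= 0 with a logarithmic barrier gives the barrier subproblem, whose optimality conditions define the central path as the barrier parameter mu is driven to zero:

```latex
\min_{x>0} \; c^T x - \mu \sum_{j=1}^{n} \ln x_j
\quad \text{subject to} \quad Ax = b,
```

with first-order conditions (dual variables $y$, slacks $s$):

```latex
Ax = b, \qquad A^T y + s = c, \qquad
x_j s_j = \mu \;\; (j = 1,\dots,n), \qquad x, s > 0.
```

A path-following method applies Newton's method to these conditions while gradually reducing $\mu$, staying inside a neighbourhood of the central path.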

Lecture 2: Interior Point Methods for QP, (convex) NLP, SOCP and SDP

- Quadratic Programming (QP) problems, primal-dual pair of QPs,

- Nonlinear (convex) inequality constraints,

- Second-Order Cone Programming,

- Semidefinite Programming,

- Newton method, logarithmic barrier function, self-concordant barriers.
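One common convention for the primal-dual pair of QPs mentioned above (a sketch under assumed notation, with symmetric positive semidefinite $Q$):

```latex
\text{(P)} \quad \min_{x} \; c^T x + \tfrac{1}{2} x^T Q x
\quad \text{s.t.} \quad Ax = b, \; x \ge 0,
```

```latex
\text{(D)} \quad \max_{x,y,s} \; b^T y - \tfrac{1}{2} x^T Q x
\quad \text{s.t.} \quad A^T y + s - Qx = c, \; s \ge 0.
```

Setting $Q = 0$ recovers the familiar primal-dual pair of LPs from Lecture 1.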

Lecture 3: Sparse Approximations with IPMs and ADMM

- Sparse Approximations with IPMs:

  modern applications of optimization that require the selection of a 'sparse' solution, originating in computational statistics, signal and image processing, compressed sensing, machine learning, and discrete optimal transport, to mention just a few.

- Alternating Direction Method of Multipliers (ADMM).
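To connect the two topics of this lecture, here is a minimal sketch of ADMM applied to a sparse-approximation (lasso) problem, minimizing 0.5*||Ax - b||^2 + lam*||x||_1 via the split x = z. The function names and parameter choices are illustrative assumptions, not taken from the course material.

```python
import numpy as np

def soft_threshold(v, k):
    """Proximal operator of k * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by ADMM on the split x = z."""
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                    # scaled dual variable
    M = A.T @ A + rho * np.eye(n)      # x-update system matrix (fixed)
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(M, Atb + rho * (z - u))  # x-update: ridge-like solve
        z = soft_threshold(x + u, lam / rho)         # z-update: shrinkage
        u = u + x - z                                # dual update
    return z
```

The x-update is a linear solve with a fixed matrix (so in practice it would be factorized once), while the z-update is a cheap closed-form shrinkage; this split is exactly what makes ADMM attractive for sparse approximation.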

Exercise/Lab classes:

Filippo Zanetti and Margherita Porcelli will help with the sessions.

Exercise on Monday:

- IPMs: From LP to QP

- Conjugate Gradients for positive definite linear systems
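As preparation for the Monday exercise, a minimal sketch of the conjugate gradient method for a symmetric positive definite system Ax = b (a textbook implementation; the function name and tolerances are assumptions, not from the lab sheet):

```python
import numpy as np

def conjugate_gradients(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for a symmetric positive definite matrix A."""
    n = b.size
    max_iter = n if max_iter is None else max_iter
    x = np.zeros(n)
    r = b - A @ x                       # initial residual
    p = r.copy()                        # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:       # residual small enough: stop
            break
        p = r + (rs_new / rs_old) * p   # new A-conjugate direction
        rs_old = rs_new
    return x
```

In exact arithmetic CG terminates in at most n iterations; in IPM applications it is typically used with a preconditioner on the (increasingly ill-conditioned) normal-equations systems.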

Exercise on Tuesday:

- Examples of IPMs in action:

  - Material-separating regularizer for multi-energy X-ray tomography

  - Semidefinite Programming: Matrix Completion