Convex Optimization -- Spring 2013

EE227A -- Syllabus / Outline

1. [22/01] Introduction to optimization

  • Course organization details
  • Homeworks, homework rules
  • Quizzes
  • Project: the aim is to strive for publication
  • Nonlinear programming models
  • Example of why nonconvex problems can be NP-hard (I like showing the subset-sum problem as an illustration)
  • Examples of common optimization problems (least squares, linear programming, SDPs)
  • History of convex analysis and optimization
  • Some remarks on applications
  • Course objectives are:
    • learn the basics of convex analysis
    • formulate problems as convex optimization problems
    • learn algorithms for convex optimization
  • Mini homework: Download and install CVX software
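
  As a quick illustration of the modeling style behind the CVX mini-homework, here is a small
  constrained least-squares problem. CVX itself is a MATLAB package; the Python package CVXPY is
  used below purely as a stand-in (an assumption for this sketch, not part of the course
  materials), and A, b are synthetic data.

    # Sketch only: CVXPY as a Python stand-in for CVX; A, b are made-up data.
    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 10))
    b = rng.standard_normal(30)

    x = cp.Variable(10)
    objective = cp.Minimize(cp.sum_squares(A @ x - b))   # least-squares objective
    constraints = [x >= 0]                               # a simple convex constraint
    prob = cp.Problem(objective, constraints)
    prob.solve()

    print(prob.status, prob.value)
    print(x.value)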

2. [24/01] Convex sets and functions

  • Some linear algebra review
  • Convex sets
  • Operations preserving convexity
  • Definition of convex functions
  • First-order and second-order characterizations (recalled below)
  • Pointwise supremum of f(x; alpha) over alpha in A
  • Pointwise infimum of f(x, y) over y (partial minimization); non-trivial
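
  For reference, the items above are the standard facts (stated for f with convex domain,
  differentiable resp. twice differentiable where needed):

    % first-order characterization
    f \text{ convex} \iff f(y) \ge f(x) + \nabla f(x)^{\top}(y - x) \quad \forall\, x, y \in \operatorname{dom} f
    % second-order characterization
    f \text{ convex} \iff \nabla^2 f(x) \succeq 0 \quad \forall\, x \in \operatorname{dom} f
    % pointwise supremum: if each f(\cdot\,; \alpha) is convex, then so is
    g(x) = \sup_{\alpha \in A} f(x; \alpha)
    % partial minimization: if f(x, y) is jointly convex, then h(x) = \inf_y f(x, y) is convex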

3. [29/01] Convex functions in more detail

  • Norms
  • Mixed norms, group norms, structured norms
  • Matrix norms, NP-hardness of computing some of them
  • Fenchel conjugates (definition recalled after this list)
  • Log convex, log concave
  • Gamma function, special functions
  • Exponential convexity, Bernstein theorem
  • Operator convex functions
  • Maybe something else if time permits
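
  For reference, the Fenchel conjugate listed above (and revisited in the next lecture) is

    f^*(y) \;=\; \sup_{x \in \operatorname{dom} f} \big( \langle y, x \rangle - f(x) \big)
    % standard example: for any norm f = \|\cdot\|, the conjugate is the indicator of the
    % dual-norm unit ball:
    f^*(y) = \begin{cases} 0 & \text{if } \|y\|_* \le 1 \\ +\infty & \text{otherwise} \end{cases}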

4. [31/01] Subdifferentials

  • Fenchel conjugates
  • Subgradients
  • Subdifferential calculus
  • Epsilon-subdifferentials
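
  For reference, the subgradient and epsilon-subgradient definitions behind these items are

    \partial f(x) \;=\; \{\, g : f(y) \ge f(x) + \langle g, y - x \rangle \ \ \forall y \,\}
    \partial_\varepsilon f(x) \;=\; \{\, g : f(y) \ge f(x) + \langle g, y - x \rangle - \varepsilon \ \ \forall y \,\}
    % standard example: \partial |\cdot|(0) = [-1, 1]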

5. [05/02] Optimization problems

  • Convex models
  • Basic optimality conditions
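
  The basic optimality conditions referred to here are the standard ones:

    % unconstrained, f differentiable:
    \nabla f(x^\star) = 0
    % unconstrained, f convex and possibly nonsmooth:
    0 \in \partial f(x^\star)
    % differentiable convex f minimized over a convex set X:
    \nabla f(x^\star)^{\top} (y - x^\star) \ge 0 \quad \forall\, y \in X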

6. [07/02] Optimization problems

  • Cone programs
  • SOCP representations, SDP representations
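
  These all fit the standard conic form; the choice of cone K distinguishes LP, SOCP, and SDP:

    \min_x \; c^{\top} x \quad \text{s.t.} \quad Ax = b, \;\; x \in K
    % K = nonnegative orthant: LP;  K = product of second-order cones: SOCP;  K = PSD cone: SDP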

7. [12/02] Cancelled

8. [14/02] Weak duality, SDP example

  • This lecture was actually given by Laurent

9. [19/02] Quiz; Revision

  • No new material; quiz and revision only

10. [21/02] Lagrangians, strong duality, minimax, S-lemma

11. [26/02] Duality, minimax, optimality

  • Dual for regularized problems
  • Dual via Fenchel conjugates
  • Variable splitting
  • Conic duality
  • Failure of strong duality in SDPs
  • Optimality conditions: KKT conditions (stated below)
  • Minimax, saddle points
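
  For reference, the KKT conditions for  min f(x)  s.t.  g_i(x) <= 0,  h_j(x) = 0  are

    g_i(x^\star) \le 0, \qquad h_j(x^\star) = 0                    % primal feasibility
    \lambda_i^\star \ge 0                                          % dual feasibility
    \lambda_i^\star\, g_i(x^\star) = 0                             % complementary slackness
    \nabla f(x^\star) + \sum_i \lambda_i^\star \nabla g_i(x^\star) + \sum_j \nu_j^\star \nabla h_j(x^\star) = 0
                                                                   % stationarity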

12. [28/02] Subgradient methods

  • Ordinary subgradient method (a minimal sketch follows after this list)
  • Rate-of-convergence analysis
  • Hints on lower bounds
  • Bundle, cutting-plane, and spectral bundle methods
  • Sparse subgradients, black-box model
  • Mirror descent version of the subgradient method?
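
  A minimal sketch of the ordinary subgradient method, applied to f(x) = ||Ax - b||_1 with a
  diminishing (normalized) step size; the data, step-size rule, and iteration count are
  illustrative choices, not the ones from lecture:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))   # synthetic data, for illustration only
    b = rng.standard_normal(50)

    def f(x):
        return np.abs(A @ x - b).sum()

    x = np.zeros(20)
    x_best, f_best = x.copy(), f(x)
    for k in range(1, 501):
        g = A.T @ np.sign(A @ x - b)                            # a subgradient of f at x
        step = 1.0 / (np.sqrt(k) * np.linalg.norm(g) + 1e-12)   # diminishing, normalized step
        x = x - step * g
        if f(x) < f_best:                 # not a descent method, so track the best iterate
            x_best, f_best = x.copy(), f(x)

    print(f_best)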

13. [05/03] Gradient methods

  • Recap, including KKT
  • Descent methods
  • Choice of descent directions
  • Choice of stepsizes
  • Nonmonotone Barzilai-Borwein (BB) steps (sketch below)
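
  A minimal sketch of gradient descent with Barzilai-Borwein (BB) step sizes on a strongly convex
  quadratic; the nonmonotone line-search safeguard discussed in class is omitted, and the problem
  data are synthetic:

    import numpy as np

    rng = np.random.default_rng(0)
    M = rng.standard_normal((20, 20))
    Q = M.T @ M + np.eye(20)              # symmetric positive definite
    c = rng.standard_normal(20)

    def grad(x):                          # gradient of f(x) = 0.5 x'Qx - c'x
        return Q @ x - c

    x = np.zeros(20)
    g = grad(x)
    alpha = 1e-3                          # initial step before BB information exists
    for _ in range(200):
        x_new = x - alpha * g
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < 1e-10: # stop once the gradient is numerically zero
            x, g = x_new, g_new
            break
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)         # BB step; s'y > 0 for strictly convex quadratics
        x, g = x_new, g_new

    print(np.linalg.norm(grad(x)))        # should be (near) zero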

14. [07/03] Gradient methods – II

  • Gradient-descent convergence
  • Convergence for differentiable functions
  • Descent lemma
  • Rate of convergence for convex functions
  • Results for strongly convex functions, etc.
  • Lower bounds (mention only)
  • Constrained problems
  • Feasible direction, descent methods
  • Conditional gradient (Frank-Wolfe) method (see the sketch after this list)
  • Projected gradient convergence
  • Optimal gradient methods
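
  A minimal sketch of the conditional gradient (Frank-Wolfe) method for minimizing
  0.5*||Ax - b||^2 over the probability simplex; the data and the classical 2/(k+2) step size are
  illustrative choices:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 15))
    b = rng.standard_normal(40)

    def grad(x):                          # gradient of 0.5*||Ax - b||^2
        return A.T @ (A @ x - b)

    x = np.ones(15) / 15                  # start at the barycenter of the simplex
    for k in range(200):
        g = grad(x)
        s = np.zeros(15)
        s[np.argmin(g)] = 1.0             # linear minimization over the simplex picks a vertex
        gamma = 2.0 / (k + 2)             # classical Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * s   # convex combination keeps x in the simplex

    print(0.5 * np.linalg.norm(A @ x - b) ** 2, x.sum())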

15. [12/03] Gradient methods – III

  • Optimal gradient methods
  • Polyak's momentum term
  • Nonsmooth optimization
  • Lower bound on nonsmooth optimization
  • Composite objective optimization
  • Proximity operators
  • Proximal gradient method (forward-backward splitting, FBS); a sketch follows below
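
  A minimal sketch of the proximal gradient method (FBS) for the lasso,
  min 0.5*||Ax - b||^2 + lam*||x||_1; the data, the value of lam, and the constant step 1/L
  (with L = ||A||_2^2) are illustrative choices:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((60, 30))
    b = rng.standard_normal(60)
    lam = 0.5

    L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the smooth part's gradient

    def soft(v, t):                       # prox of t*||.||_1 (soft-thresholding)
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    x = np.zeros(30)
    for _ in range(500):
        g = A.T @ (A @ x - b)             # forward (gradient) step on the smooth part
        x = soft(x - g / L, lam / L)      # backward (prox) step on the l1 term

    print(0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())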

16. [14/03] Proximal methods

  • Convergence theory for projected gradient
  • Convergence for proximal gradients
  • Optimal proximal gradients
  • Monotone operators intro
  • Set-valued maps, generalized equations
  • Resolvents, prox operator as resolvent
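
  The link between prox operators and resolvents recalled here is

    \operatorname{prox}_{\gamma f}(v) \;=\; \arg\min_x \Big( f(x) + \tfrac{1}{2\gamma}\|x - v\|^2 \Big)
                                     \;=\; (I + \gamma\, \partial f)^{-1}(v)
    % i.e. the prox operator is the resolvent J_{\gamma T} = (I + \gamma T)^{-1}
    % of the (maximal) monotone operator T = \partial f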

17. [19/03] Operator splitting

  • Monotone operators revision
  • Resolvents
  • Prox-operators and resolvents
  • Proximal-splitting
  • Douglas-Rachford derivation
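
  For reference, the Douglas-Rachford iteration for  min_x f(x) + g(x)  (with a parameter
  gamma > 0) can be written as

    x^{k}   \;=\; \operatorname{prox}_{\gamma f}(z^{k})
    y^{k}   \;=\; \operatorname{prox}_{\gamma g}(2x^{k} - z^{k})
    z^{k+1} \;=\; z^{k} + y^{k} - x^{k}
    % under standard assumptions the x^k converge to a minimizer of f + g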

18. [21/03] Proximal methods, Incremental - I

  • Douglas-Rachford details
  • Proximity operators for several functions
  • Product space trick
  • Incremental methods
  • Incremental gradient methods (a minimal sketch follows after this list)
  • Incremental proximal method
  • Incremental proximal gradients
  • Gradient descent with error
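
  A minimal sketch of an incremental gradient method for the least-squares objective
  (1/m) * sum_i 0.5*(a_i'x - b_i)^2, cycling through the components with a diminishing step size
  (data and step-size rule are illustrative); this also illustrates the "gradient descent with
  error" viewpoint, since each component gradient is the full gradient plus an error term:

    import numpy as np

    rng = np.random.default_rng(0)
    m, n = 200, 20
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)

    x = np.zeros(n)
    t = 0
    for epoch in range(30):
        for i in range(m):                        # one incremental pass over the components
            t += 1
            g_i = (A[i] @ x - b[i]) * A[i]        # gradient of the i-th component
            x = x - (1.0 / (10.0 + t)) * g_i      # diminishing step size
    print(0.5 * np.mean((A @ x - b) ** 2))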

19. [02/04] Incremental, stochastic methods

20. [04/04] Coordinate descent

21. [09/04] Dual decomposition, ADMM, etc.

22. [11/04] Parallel and distributed optimization

23. [16/04] Cancelled

24. [18/04] Quasi-Newton, Newton methods

25. [23/04] Interior point methods

26. [25/04] Robust Optimization

27. [30/04] Derivative-free optimization

28. [02/05] Sums of squares (SOS), convex algebraic geometry

Date: 2013-02-25 Mon

Author: suvrit
