Optimization
Content
Unconstrained problems
- Gradient descent
- Newton’s method
- Conjugate gradient descent
- Broyden-Fletcher-Goldfarb-Shanno (BFGS)
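As a warm-up for the unconstrained methods above, here is a minimal sketch of plain gradient descent on a convex quadratic (this example is mine, not from the slides; the matrix, step size, and iteration count are illustrative assumptions):

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Plain gradient descent: repeatedly step against the gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = gradient_descent(lambda x: A @ x - b, x0=np.zeros(2), lr=0.2, steps=500)

# For this quadratic the exact minimizer solves A x = b.
print(np.allclose(x_star, np.linalg.solve(A, b)))  # True
```

Newton's method would replace the fixed step `lr * grad(x)` with `np.linalg.solve(A, grad(x))`, converging in one step on a quadratic; BFGS builds up such a curvature correction from gradient differences instead of forming the Hessian.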
Convexity
- Properties
- Lagrange function
- Wolfe dual
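For reference, the Lagrange function and Wolfe dual mentioned above take the following standard form for a differentiable convex program (my notation, not necessarily that of the slides):

```latex
\text{Primal:}\quad
\min_{x} f(x) \quad \text{subject to } c_i(x) \le 0,\; i = 1,\dots,n,

\text{Lagrange function:}\quad
L(x, \alpha) = f(x) + \sum_{i=1}^{n} \alpha_i c_i(x), \qquad \alpha_i \ge 0,

\text{Wolfe dual:}\quad
\max_{x, \alpha} \; L(x, \alpha)
\quad \text{subject to } \nabla_x L(x, \alpha) = 0,\; \alpha_i \ge 0.
```

The Wolfe dual replaces the inner minimization over \(x\) in the Lagrangian dual with an explicit stationarity constraint, which is valid when \(f\) and the \(c_i\) are convex and differentiable.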
Batch methods
- Distributed subgradient
- Bundle methods
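A minimal sketch of the synchronous distributed-subgradient idea: each "machine" computes a subgradient on its own data shard, and the averaged subgradient drives one global update. This is my own toy illustration (the 1/sqrt(t) step size, the data, and the shard layout are assumptions), not code from the course:

```python
import numpy as np

def distributed_subgradient(shards, subgrad, w0, steps=300):
    """Synchronous distributed subgradient descent (sketch):
    average per-shard subgradients, then take one global step."""
    w = np.asarray(w0, dtype=float)
    for t in range(1, steps + 1):
        g = np.mean([subgrad(w, shard) for shard in shards], axis=0)
        w = w - g / np.sqrt(t)  # diminishing step size
    return w

# Toy problem: minimize mean |w - z_i| over the data; any w in [5, 6]
# is optimal (the median interval). Equal-sized shards make the
# average of shard subgradients equal the global subgradient.
data = np.arange(12.0)
shards = np.array_split(data, 4)
subgrad = lambda w, shard: np.mean(np.sign(w - shard))
w_final = distributed_subgradient(shards, subgrad, w0=0.0)
```

Bundle methods refine this by keeping the past subgradients as a piecewise-linear lower bound on the objective rather than discarding them after each step.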
Online methods
- Unconstrained subgradient
- Gradient projections
- Parallel optimization
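The online methods above can be sketched as projected online subgradient descent: after each step, project back onto the feasible set. A minimal example with a Euclidean-ball constraint (my own illustration; the losses, radius, and 1/sqrt(t) step size are assumptions):

```python
import numpy as np

def project_ball(w, radius=1.0):
    """Euclidean projection onto the ball ||w|| <= radius."""
    norm = np.linalg.norm(w)
    return w if norm <= radius else w * (radius / norm)

def online_subgradient(subgrads, w0, radius=1.0):
    """Projected online subgradient descent with step size 1/sqrt(t)."""
    w = np.asarray(w0, dtype=float)
    for t, g in enumerate(subgrads, start=1):
        w = project_ball(w - g(w) / np.sqrt(t), radius)
    return w

# Stream of losses f_t(w) = 0.5 ||w - z_t||^2, whose gradient is w - z_t.
rng = np.random.default_rng(0)
zs = rng.normal(size=(50, 3))
losses = [lambda w, z=z: w - z for z in zs]
w_final = online_subgradient(losses, w0=np.zeros(3), radius=1.0)
```

The projection is what keeps the iterate feasible at every round; regret bounds of the kind proved in the Zinkevich line of work hinge on exactly this structure.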
Supplementary material
Slides in PDF and Keynote. If you want to extract the equations from the slides, you can do so with LaTeXiT, simply by dragging the equation images into it. There is also an optimization chapter from the Learning with Kernels book.
- Boyd and Vandenberghe book (the default reference for convex optimization)
- Submodular optimization and applications site
- Nesterov and Vial paper on expected convergence
- Bartlett, Hazan, Rakhlin paper which uses strong convexity
- TAO (Toolkit for advanced optimization) site
- Ratliff, Bagnell, Zinkevich regret proof
- Shalev-Shwartz, Srebro, Singer Pegasos
- Langford, Smola, Zinkevich proof of multicore convergence
- Recht, Wright, Re proof of asynchronous updates in Hogwild
Videos
This is unedited video straight from a Lumix GF2 with a 20mm lens, which should explain the sound quality (the camera doesn't have a dedicated audio input). But it should help as a supplement to the slides (YouTube typically makes the 1080i version available within a week of upload).