Location: Gilman 132
When: October 26th at 1:30 p.m.
Title: On the affine invariance of the conditional gradient algorithm
Abstract: The conditional gradient algorithm is a popular projection-free first-order method to minimize a smooth convex function over a convex domain. The projection-free property makes it particularly appealing in practical settings where projections are unavailable or prohibitively costly. Furthermore, in sharp contrast to most other first-order methods, the conditional gradient algorithm is affine invariant: the algorithm behaves identically under any affine transformation of the domain. Although this property has been known for decades, many (if not most) convergence analyses of the conditional gradient algorithm rely on affine-dependent objects such as norms and Lipschitz constants, which is completely at odds with the affine invariance property.
This talk will introduce the conditional gradient method and its affine invariance property. Most importantly, we will discuss a novel affine-invariant approach to establishing convergence results for the conditional gradient method. We will also show how this approach subsumes, extends, and sharpens a number of previous affine-dependent results.
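As a rough illustration of the method the abstract describes, the following sketch implements the classical conditional gradient (Frank-Wolfe) iteration with the standard open-loop step size 2/(k+2). The quadratic objective and the probability-simplex domain are illustrative choices, not taken from the talk; note that each step solves only a linear subproblem over the domain, with no projection.

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, num_iters=200):
    """Conditional gradient (Frank-Wolfe) sketch.

    grad: gradient oracle of the smooth convex objective
    lmo:  linear minimization oracle over the domain D:
          lmo(g) returns argmin_{s in D} <g, s>
    """
    x = x0
    for k in range(num_iters):
        s = lmo(grad(x))           # linear subproblem -- no projection needed
        gamma = 2.0 / (k + 2)      # standard open-loop step size
        x = x + gamma * (s - x)    # convex combination stays inside D
    return x

# Illustrative example (assumed, not from the talk): minimize ||A x - b||^2
# over the probability simplex {x >= 0, sum(x) = 1}.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)

grad = lambda x: 2 * A.T @ (A @ x - b)

def simplex_lmo(g):
    # The minimizer of a linear function over the simplex is a vertex:
    # the standard basis vector at the smallest gradient coordinate.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

x0 = np.full(3, 1.0 / 3)   # start at the simplex barycenter
x_star = frank_wolfe(grad, simplex_lmo, x0)
```

Because each iterate is a convex combination of domain points, the iterates remain feasible throughout, and composing the problem with an affine change of variables produces the corresponding transformed iterates, which is the affine invariance property the talk examines.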
Bio: Javier Pena is the Bajaj Chair Professor of Operations Research at the Tepper School of Business, Carnegie Mellon University. Prior to joining Carnegie Mellon, he earned his PhD in Applied Mathematics from Cornell University and held a postdoctoral position at the Mathematical Sciences Research Institute in Berkeley, California. He does research on theory and algorithms for convex optimization, with an emphasis on first-order methods. He also works on applications of optimization in finance and in data science.
Zoom link: https://wse.zoom.us/j/94601022340