Title: Iteratively Reweighted Least Squares: New Formulations and Guarantees
Abstract: Iteratively Reweighted Least Squares (IRLS) is a simple algorithmic framework for non-smooth optimization that has been studied since the 1930s and has been widely used in approximation theory, statistics, computer vision, and beyond. Despite its popularity, a thorough understanding of the framework has remained elusive. In this talk, we present advances in both the theory and formulation of IRLS for several problems in data science. First, we provide the first global linear convergence results for IRLS applied to l1-norm minimization.
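For readers unfamiliar with the framework, the following is a minimal sketch of the classical IRLS iteration for l1-norm minimization in the basis-pursuit setting (min ||x||_1 subject to Ax = b). It is an illustrative textbook-style variant with a hand-picked annealing schedule for the smoothing parameter, not the specific algorithm analyzed in the talk:

```python
import numpy as np

def irls_l1(A, b, iters=100, eps=1.0):
    """Classical IRLS sketch for basis pursuit: min ||x||_1 s.t. Ax = b.

    Each iteration solves a weighted least-squares problem in closed
    form; the smoothing parameter eps is annealed toward zero.
    (Illustrative only -- not the talk's specific variant.)
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]   # least-squares start
    for _ in range(iters):
        d = np.abs(x) + eps                    # weights w_i = 1 / d_i
        # weighted LS solution: x = D A^T (A D A^T)^{-1} b, D = diag(d)
        x = d * (A.T @ np.linalg.solve((A * d) @ A.T, b))
        eps = max(0.5 * eps, 1e-8)             # anneal smoothing
    return x
```

Each step minimizes the quadratic surrogate sum_i x_i^2 / d_i over the affine constraint set, so the iterate always satisfies Ax = b exactly while the weights progressively emphasize small coordinates.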
Second, we present MatrixIRLS, an optimal IRLS formulation for the non-convex spectral functions that arise in low-rank matrix optimization. We show fast local convergence guarantees for MatrixIRLS, with a superlinear or quadratic rate, for low-rank matrix completion problems when only a minimal number of random entries is provided. Finally, we illustrate its scalability and empirical data efficiency compared to the state of the art on matrix completion problems.
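To make the matrix setting concrete, here is a hedged sketch of an earlier, classical IRLS scheme for low-rank matrix completion, which minimizes a smoothed nuclear-norm surrogate tr(X^T W X) with W = (X X^T + eps^2 I)^{-1/2}. This is background illustration only and is not the MatrixIRLS formulation presented in the talk:

```python
import numpy as np

def irls_completion(M_obs, mask, iters=80, eps=1e-2):
    """Classical IRLS sketch for low-rank matrix completion.

    Minimizes a smoothed nuclear-norm surrogate tr(X^T W X),
    W = (X X^T + eps^2 I)^{-1/2}, subject to matching the observed
    entries. (Background illustration; not the talk's MatrixIRLS.)
    """
    n1, n2 = M_obs.shape
    X = np.where(mask, M_obs, 0.0)
    for _ in range(iters):
        # W = U diag(1/sqrt(s^2 + eps^2)) U^T from the SVD of X
        U, sv, _ = np.linalg.svd(X, full_matrices=True)
        s = np.zeros(n1); s[:len(sv)] = sv
        W = (U * (1.0 / np.sqrt(s**2 + eps**2))) @ U.T
        # quadratic subproblem decouples column-wise:
        # min_x x^T W x subject to the observed entries of the column
        Xn = np.zeros_like(X)
        for j in range(n2):
            O = mask[:, j]; F = ~O
            Xn[O, j] = M_obs[O, j]
            if F.any():
                Xn[F, j] = -np.linalg.solve(W[np.ix_(F, F)],
                                            W[np.ix_(F, O)] @ M_obs[O, j])
        X = Xn
        eps = max(0.9 * eps, 1e-8)  # anneal smoothing
    return X
```

Since tr(X^T (X X^T)^{-1/2} X) equals the nuclear norm of X, driving eps to zero makes the weighted least-squares objective an increasingly tight surrogate for nuclear-norm minimization over the observed-entry constraints.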
Zoom link: https://wse.zoom.us/j/95448608570