
Séminaire Algo - Adrian Vladu
3-Dec-2019 14:00

Adrian Vladu

Improved Convergence for L∞ and L1 Regression via Iteratively Reweighted Least Squares

Seminar room (4B125) - Bâtiment Copernic

Abstract: The iteratively reweighted least squares method (IRLS) is a popular technique used in practice for solving regression problems. Various versions of this method have been proposed, but theoretical analyses usually fail to capture their good practical performance.
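For readers unfamiliar with IRLS, here is a minimal sketch of the textbook scheme for L1 regression (min_x ||Ax - b||_1): each iteration solves a weighted least-squares problem whose weights are derived from the current residuals. This is only the generic template, not the provably convergent variant analyzed in the talk; the function name irls_l1 and the damping parameter delta are illustrative.

```python
import numpy as np

def irls_l1(A, b, num_iters=100, delta=1e-8):
    """Textbook IRLS sketch for min_x ||Ax - b||_1 (illustrative only)."""
    # Start from the ordinary least-squares solution.
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(num_iters):
        r = A @ x - b                           # current residuals
        w = 1.0 / np.maximum(np.abs(r), delta)  # reweight; delta avoids division by zero
        # Solve the weighted least-squares problem min_x sum_i w_i (a_i^T x - b_i)^2
        sw = np.sqrt(w)
        x = np.linalg.lstsq(sw[:, None] * A, sw * b, rcond=None)[0]
    return x
```

The weight update shown here is the classical heuristic; the version discussed in the talk modifies it to obtain the stated iteration bound.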

I will present a simple and natural version of IRLS for solving L∞ and L1 regression, which provably converges to a (1+ε)-approximate solution in O(m^(1/3) log(1/ε)/ε^(2/3) + log(m/ε)/ε^2) iterations, where m is the number of rows of the input matrix. This running time is independent of the conditioning of the input and, up to poly(1/ε) factors, beats the O(m^(1/2)) iteration bound of standard interior-point methods.

This improves upon the highly specialized algorithms of Chin et al. (ITCS ’12) and Christiano et al. (STOC ’11), and yields a truly efficient natural algorithm for the slime mold dynamics (Straszak-Vishnoi, SODA ’16, ITCS ’16, ITCS ’17).

I will also highlight a few connections to packing/covering LP solvers and higher order optimization methods.

arXiv link: https://arxiv.org/pdf/1902.06391.pdf (appeared at ICML 2019).
