Dual Extrapolation for Faster Lasso Solvers.
Convex sparsity-inducing regularizations are ubiquitous in high-dimensional machine learning, but their non-differentiability requires the use of iterative solvers. To accelerate such solvers, state-of-the-art approaches consist in reducing the size of the optimization problem at hand. In the context of regression, this can be achieved either by discarding irrelevant features (screening techniques) or by prioritizing features likely to be included in the support of the solution (working set techniques). Duality comes into play at several steps in these techniques. Here, we propose an extrapolation technique starting from a sequence of iterates in the dual that leads to the construction of an improved dual point. This enables a tighter control of optimality as used in stopping criteria, as well as better screening performance of Gap Safe rules. Finally, we propose a working set strategy based on an aggressive use of Gap Safe rules and our new dual point construction, which improves state-of-the-art time performance on Lasso problems.
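To make the dual point construction concrete, below is a minimal NumPy sketch of one way to build an extrapolated dual point from a sequence of recent residuals, assuming the standard Lasso primal P(w) = 0.5*||y - Xw||^2 + lam*||w||_1 and the dual feasible set {theta : ||X^T theta||_inf <= 1}; the Anderson-style combination of residuals and the function names (dual_point, extrapolated_dual_point) are illustrative assumptions, not the paper's exact algorithm.

    import numpy as np

    def dual_point(r, X, lam):
        # Rescale a residual r = y - X @ w onto the dual feasible set
        # {theta : ||X.T @ theta||_inf <= 1}.
        return r / max(lam, np.max(np.abs(X.T @ r)))

    def extrapolated_dual_point(residuals, X, lam, y):
        # residuals: list of the K (>= 2) most recent residuals y - X @ w_t.
        R = np.column_stack(residuals)          # shape (n_samples, K)
        U = np.diff(R, axis=1)                  # successive differences
        try:
            # Anderson-style weights: solve (U^T U) z = 1, then normalize.
            z = np.linalg.solve(U.T @ U, np.ones(U.shape[1]))
        except np.linalg.LinAlgError:
            return dual_point(residuals[-1], X, lam)  # singular: fall back
        c = z / z.sum()
        r_acc = R[:, 1:] @ c                    # extrapolated residual
        # Dual objective D(theta) = 0.5*||y||^2 - 0.5*||y - lam*theta||^2.
        d_obj = lambda th: 0.5 * y @ y - 0.5 * np.sum((y - lam * th) ** 2)
        # Keep whichever candidate yields the larger dual objective, so the
        # extrapolated point can only tighten the duality gap.
        return max(dual_point(r_acc, X, lam),
                   dual_point(residuals[-1], X, lam), key=d_obj)

The resulting duality gap P(w) - D(theta) then serves the two purposes mentioned above: it is the quantity compared against a tolerance in the stopping criterion, and it controls the radius of the Gap Safe sphere test (in its usual form, feature j can be safely discarded when |X[:, j] @ theta| + ||X[:, j]|| * sqrt(2 * gap) / lam < 1).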