Select Lab Publications


Parallel Coordinate Descent for L1-Regularized Loss Minimization (2011)

By: Joseph K. Bradley, Aapo Kyrola, Danny Bickson, and Carlos Guestrin

Abstract: We propose Shotgun, a parallel coordinate descent algorithm for minimizing L1-regularized losses. Though coordinate descent seems inherently sequential, we prove convergence bounds for Shotgun which predict linear speedups, up to a problem-dependent limit. We present a comprehensive empirical study of Shotgun for Lasso and sparse logistic regression. Our theoretical predictions on the potential for parallelism closely match behavior on real data. Shotgun outperforms other published solvers on a range of large problems, proving to be one of the most scalable algorithms for L1-regularized loss minimization.
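
To make the idea concrete, here is a minimal sketch (not the authors' implementation) of Shotgun-style parallel coordinate descent for the Lasso objective 0.5*||Ax - b||^2 + lam*||x||_1. Each round picks P coordinates uniformly at random and updates them "in parallel": for illustration, the updates are all computed from the same stale iterate and then applied together, which is the source of the problem-dependent parallelism limit the paper analyzes. Function names and parameters below are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*|.|: shrink z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def shotgun_lasso(A, b, lam, P=4, iters=300, seed=0):
    """Sketch of Shotgun-style parallel coordinate descent for Lasso:
    minimize 0.5*||Ax - b||^2 + lam*||x||_1.

    Each round, P coordinates are chosen uniformly at random and updated
    from the SAME stale residual (simulating concurrent updates), then
    the updates are applied together."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    col_norms = (A ** 2).sum(axis=0)   # a_jj = ||A_j||^2 per column
    resid = A @ x - b                  # current residual Ax - b
    for _ in range(iters):
        js = rng.choice(d, size=min(P, d), replace=False)
        # Compute all P coordinate updates from the stale residual.
        new_vals = {}
        for j in js:
            if col_norms[j] == 0.0:
                continue
            g_j = A[:, j] @ resid                        # partial gradient
            z = x[j] - g_j / col_norms[j]
            new_vals[j] = soft_threshold(z, lam / col_norms[j])
        # Apply the updates together, maintaining the residual.
        for j, v in new_vals.items():
            resid += A[:, j] * (v - x[j])
            x[j] = v
    return x
```

With P = 1 this reduces to ordinary stochastic coordinate descent; larger P trades per-round work against the interference between concurrent updates, which the paper bounds in terms of the spectral radius of A^T A.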

Download Information
Joseph K. Bradley, Aapo Kyrola, Danny Bickson, and Carlos Guestrin (2011). "Parallel Coordinate Descent for L1-Regularized Loss Minimization." International Conference on Machine Learning (ICML 2011). pdf | talk
BibTeX citation

@inproceedings{Bradley+al:icml11parlasso,
  title     = {Parallel Coordinate Descent for L1-Regularized Loss Minimization},
  author    = {Joseph K. Bradley and Aapo Kyrola and Danny Bickson and Carlos Guestrin},
  booktitle = {International Conference on Machine Learning (ICML 2011)},
  month     = {June},
  year      = {2011},
  address   = {Bellevue, Washington},
  wwwfilebase = {icml2011-bradley-kyrola-bickson-guestrin},
  wwwtopic  = {Parallel Learning}
}
