PKDD 2010, Springer

Efficient and Numerically Stable Sparse Learning

We consider the problem of numerical stability and model density growth when training a sparse linear model from massive data. We focus on scalable algorithms that optimize a loss function using gradient descent, with either ℓ0 or ℓ1 regularization. We observed numerical stability problems in several existing methods, leading to divergence and low accuracy. In addition, these methods typically have weak control over sparsity, so that model density grows faster than necessary. We propose a framework to address the above problems. First, the update rule is numerically stable, comes with a convergence guarantee, and yields more reasonable models. Second, besides ℓ1 regularization, it exploits the sparsity of the data distribution and achieves a higher degree of sparsity with a PAC generalization error bound. Lastly, it is parallelizable and suitable for training large-margin classifiers on huge datasets. Experiments show that the proposed method converges consistently and outperforms other baselines.
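The abstract does not spell out the paper's update rule. As a point of reference only, here is a minimal sketch of the standard proximal (soft-thresholding) step for ℓ1-regularized gradient descent, the kind of sparse update the paper analyzes and improves on. All names below (soft_threshold, l1_sgd_step, the synthetic data) are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch: one stochastic proximal-gradient step for
# l1-regularized squared loss. NOT the paper's method, just the
# baseline style of sparse update it builds on.
import numpy as np

def soft_threshold(w, t):
    """Proximal operator of t * ||w||_1 (element-wise soft-thresholding)."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def l1_sgd_step(w, x, y, lr, lam):
    """One step on a single example (x, y) with learning rate lr
    and l1 strength lam."""
    grad = (w @ x - y) * x              # gradient of 0.5 * (w.x - y)^2
    w = w - lr * grad                   # gradient step on the loss
    return soft_threshold(w, lr * lam)  # shrink; exact zeros keep the model sparse

# Tiny usage example on synthetic data (hypothetical setup)
rng = np.random.default_rng(0)
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 0.5]
w = np.zeros(20)
for _ in range(2000):
    x = rng.normal(size=20)
    y = w_true @ x
    w = l1_sgd_step(w, x, y, lr=0.01, lam=0.1)
print("nonzero weights:", np.count_nonzero(w))
```

The soft-thresholding step is what produces exact zeros (and hence a controllable model density), in contrast to plain subgradient updates, which rarely zero out coordinates.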
Type: Conference
Year: 2010
Where: PKDD
Authors: Sihong Xie, Wei Fan, Olivier Verscheure, Jiangtao Ren