Exploiting Feature Covariance in High-Dimensional Online Learning

Some online algorithms for linear classification model the uncertainty in their weights over the course of learning. Modeling the full covariance structure of the weights can provide a significant advantage for classification. However, for high-dimensional, large-scale data, maintaining this full covariance structure is computationally infeasible, even though there may be many informative second-order feature interactions. To extend second-order methods to high-dimensional data, we develop low-rank approximations of the covariance structure. We evaluate our approach on both synthetic and real-world data sets using the confidence-weighted (Dredze et al., 2008; Crammer et al., 2009a) online learning framework. We show improvements over diagonal covariance matrices for both low- and high-dimensional data.
Justin Ma, Alex Kulesza, Mark Dredze, Koby Crammer, Lawrence K. Saul, Fernando Pereira
Type Journal
Year 2010
Where JMLR
Authors Justin Ma, Alex Kulesza, Mark Dredze, Koby Crammer, Lawrence K. Saul, Fernando Pereira
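The abstract contrasts full-covariance second-order updates with the diagonal-covariance baseline. As a hedged illustration of that baseline (not the paper's low-rank method), here is a minimal sketch of a diagonal-covariance AROW-style update in the spirit of Crammer et al. (2009); the function name, the regularizer `r`, and the toy data are illustrative assumptions. The full-covariance variant would store a d×d matrix and update it as Σ ← Σ − β(Σx)(Σx)ᵀ, which is exactly what becomes infeasible in high dimensions and motivates the low-rank approximation.

```python
import numpy as np

def arow_diag_update(w, sigma, x, y, r=1.0):
    """One diagonal-covariance AROW-style update (illustrative sketch).

    w     : (d,) weight mean
    sigma : (d,) per-feature variances (diagonal of the covariance)
    x     : (d,) feature vector; y in {-1, +1}; r : regularization constant.
    """
    margin = y * w.dot(x)
    if margin < 1.0:                              # update only on hinge loss
        v = (sigma * x * x).sum()                 # confidence x^T Sigma x
        beta = 1.0 / (v + r)
        alpha = (1.0 - margin) * beta
        w = w + alpha * y * sigma * x             # mean update: alpha * y * Sigma x
        sigma = sigma - beta * (sigma * x) ** 2   # shrink variance on seen features
    return w, sigma

# Toy usage: labels determined by the sign of the first feature.
rng = np.random.default_rng(0)
w, sigma = np.zeros(5), np.ones(5)
for _ in range(200):
    x = rng.standard_normal(5)
    y = 1.0 if x[0] > 0 else -1.0
    w, sigma = arow_diag_update(w, sigma, x, y)
```

Because the variance update subtracts at most σᵢ²xᵢ²/(v + r) < σᵢ, the per-feature variances stay positive while shrinking on frequently seen features, which is the "confidence" the method accumulates.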