Sciweavers
Information Technology » NIPS 2008
Sparse Online Learning via Truncated Gradient
Download: jmlr.csail.mit.edu
We propose a general method called truncated gradient to induce sparsity in the weights of online-learning algorithms with convex loss functions. This method has several essential properties: …
John Langford, Lihong Li, Tong Zhang
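The core idea in the abstract, shrinking small weights toward zero between gradient steps, can be sketched in a few lines. This is a minimal illustration only: the squared loss, the default parameters, and the every-K-steps truncation schedule are assumptions for the sketch, not the paper's exact pseudocode.

```python
import numpy as np

def truncate(w, alpha, theta):
    # Shrink each coordinate with |w_i| <= theta toward zero by alpha,
    # clipping at exactly zero; coordinates larger than theta are untouched.
    out = w.copy()
    small = np.abs(w) <= theta
    out[small] = np.sign(w[small]) * np.maximum(np.abs(w[small]) - alpha, 0.0)
    return out

def truncated_gradient_sgd(X, y, eta=0.05, g=0.01, theta=1.0, K=10):
    # Online SGD with squared loss (an assumption; any convex loss works),
    # applying the truncation operator every K steps with accumulated
    # "gravity" K * eta * g.
    n, d = X.shape
    w = np.zeros(d)
    for t in range(n):
        x_t, y_t = X[t], y[t]
        grad = (w @ x_t - y_t) * x_t  # gradient of 0.5 * (w.x - y)^2
        w -= eta * grad
        if (t + 1) % K == 0:
            w = truncate(w, K * eta * g, theta)
    return w
```

Because the truncation clips at zero rather than stepping past it, small weights become exactly zero, which is what produces genuine sparsity (unlike a plain subgradient step on an L1 penalty, which rarely lands on zero exactly).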
Related Content
» Online Learning via Congregational Gradient Descent
» Learning incoherent sparse and low-rank patterns from multiple tasks
» Denoising sparse noise via online dictionary learning
» Nonlinear Dynamical Attention Allocation via Information Geometry
» Online Detection of Unusual Events in Videos via Dynamic Sparse Coding
» Adaptive Subgradient Methods for Online Learning and Stochastic Optimization
» Sparse incremental learning for interactive robot control policy estimation
» Proximal Methods for Sparse Hierarchical Dictionary Learning
» Dual Averaging Methods for Regularized Stochastic Learning and Online Optimization
Post Info
More Details: (n/a)
Added: 29 Oct 2010
Updated: 29 Oct 2010
Type: Conference
Year: 2008
Where: NIPS
Authors: John Langford, Lihong Li, Tong Zhang
Comments: (0)