Sciweavers

247 search results - page 15 / 50
Results for: Superset Learning Based on Generalized Loss Minimization
COLT 2001, Springer
Geometric Bounds for Generalization in Boosting
We consider geometric conditions on a labeled data set which guarantee that boosting algorithms work well when linear classifiers are used as weak learners. We start by providing ...
Shie Mannor, Ron Meir
KDD 2004, ACM
A generalized maximum entropy approach to Bregman co-clustering and matrix approximation
Co-clustering is a powerful data mining technique with varied applications such as text clustering, microarray analysis and recommender systems. Recently, an information-theoretic ...
Arindam Banerjee, Inderjit S. Dhillon, Joydeep Gho...
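As a rough, generic illustration of the co-clustering idea behind this entry (a toy alternating scheme under squared error, the simplest Bregman divergence, and not the generalized maximum entropy formulation of the paper; all names and sizes below are made up), one might write:

    import numpy as np

    def cocluster(A, k_rows=3, k_cols=3, n_iter=20, seed=0):
        # Toy alternating co-clustering under squared error (the simplest Bregman divergence).
        rng = np.random.default_rng(seed)
        r = rng.integers(k_rows, size=A.shape[0])   # row-cluster assignments
        c = rng.integers(k_cols, size=A.shape[1])   # column-cluster assignments
        for _ in range(n_iter):
            # Summarize each co-cluster (row-block x column-block) by its mean.
            M = np.zeros((k_rows, k_cols))
            for i in range(k_rows):
                for j in range(k_cols):
                    block = A[np.ix_(r == i, c == j)]
                    M[i, j] = block.mean() if block.size else 0.0
            # Reassign each row to the row cluster whose block means reconstruct it best.
            r = np.array([np.argmin([((A[x] - M[i, c]) ** 2).sum() for i in range(k_rows)])
                          for x in range(A.shape[0])])
            # Reassign each column symmetrically.
            c = np.array([np.argmin([((A[:, y] - M[r, j]) ** 2).sum() for j in range(k_cols)])
                          for y in range(A.shape[1])])
        return r, c

    # Example on a purely synthetic matrix.
    rows, cols = cocluster(np.random.default_rng(1).random((60, 40)))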
ICDM 2009, IEEE
Sparse Least-Squares Methods in the Parallel Machine Learning (PML) Framework
We describe parallel methods for solving large-scale, high-dimensional, sparse least-squares problems that arise in machine learning applications such as document classificatio...
Ramesh Natarajan, Vikas Sindhwani, Shirish Tatikon...
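For context, a single-machine baseline for this kind of problem is an iterative sparse solver such as LSQR; the sketch below shows that baseline only and does not reproduce the paper's parallel PML framework (the matrix sizes and density are hypothetical):

    import numpy as np
    from scipy.sparse import random as sparse_random
    from scipy.sparse.linalg import lsqr

    rng = np.random.default_rng(0)
    # Hypothetical document-term design matrix: 10,000 rows, 50,000 features, ~0.1% nonzeros.
    X = sparse_random(10_000, 50_000, density=0.001, format="csr", random_state=0)
    w_true = rng.standard_normal(50_000)
    y = X @ w_true + 0.01 * rng.standard_normal(10_000)

    # LSQR solves min_w ||Xw - y||^2 + damp^2 ||w||^2 using only sparse
    # matrix-vector products, so X is never densified.
    w_hat, istop, itn = lsqr(X, y, damp=1e-3)[:3]
    print("stop reason", istop, "after", itn, "iterations")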
ICML 2007, IEEE
Gradient boosting for kernelized output spaces
A general framework is proposed for gradient boosting in supervised learning problems where the loss function is defined using a kernel over the output space. It extends boosting ...
Florence d'Alché-Buc, Louis Wehenkel, Pierr...
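For reference, the standard gradient boosting loop that, per the abstract, this framework extends to kernelized output spaces looks roughly as follows for a scalar output and squared loss (a generic sketch, not the authors' algorithm):

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def gradient_boost(X, y, n_rounds=100, lr=0.1, max_depth=2):
        # Plain gradient boosting with squared loss and shallow trees as weak learners.
        base = y.mean()
        pred = np.full(len(y), base)
        learners = []
        for _ in range(n_rounds):
            residual = y - pred                     # negative gradient of 1/2 (y - pred)^2
            tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
            learners.append(tree)
            pred += lr * tree.predict(X)            # small step along the fitted direction
        return base, learners

    # Toy usage on synthetic data.
    X = np.random.rand(200, 3)
    y = np.sin(6 * X[:, 0]) + 0.1 * np.random.randn(200)
    base, learners = gradient_boost(X, y)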
ICML 2004, IEEE
Learning with non-positive kernels
In this paper we show that many kernel methods can be adapted to deal with indefinite kernels, that is, kernels which are not positive semidefinite and hence do not satisfy Mercer...
Alexander J. Smola, Cheng Soon Ong, Stéphan...
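This paper adapts kernel methods to work with indefinite kernels directly; a much cruder workaround sometimes seen in practice is to project the kernel matrix back onto the positive semidefinite cone by clipping its negative eigenvalues. The sketch below shows that contrast only, to illustrate what indefiniteness means, and is not the authors' method:

    import numpy as np

    def clip_to_psd(K, tol=1e-10):
        # Project a symmetric (possibly indefinite) kernel matrix onto the PSD cone.
        K = (K + K.T) / 2                        # symmetrize before the eigendecomposition
        eigvals, eigvecs = np.linalg.eigh(K)
        eigvals = np.clip(eigvals, tol, None)    # discard the negative part of the spectrum
        return (eigvecs * eigvals) @ eigvecs.T

    # A tanh ("sigmoid") kernel is a standard example of an indefinite kernel.
    X = np.random.randn(50, 5)
    K = np.tanh(X @ X.T - 1.0)
    print(np.linalg.eigvalsh(K).min(), np.linalg.eigvalsh(clip_to_psd(K)).min())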