Sciweavers

30 search results - page 2 / 6
» The Most Robust Loss Function for Boosting
ICML 2004 (IEEE)
Leveraging the margin more carefully
Boosting is a popular approach for building accurate classifiers. Contrary to initial popular belief, boosting algorithms do exhibit overfitting and are sensitive to label noise. ...
Nir Krause, Yoram Singer
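The label-noise sensitivity mentioned in this snippet can be seen in a toy sketch of AdaBoost-style exponential weighting (this is an illustration of the general phenomenon, not Krause and Singer's algorithm): an example whose margin stays negative, e.g. because its label is flipped, accumulates exponentially growing weight.

```python
import numpy as np

# Toy illustration, not the paper's method: boosting's exponential loss
# weights example i in proportion to exp(-margin_i). A point with a
# persistently negative margin (e.g. a mislabeled point) comes to
# dominate the example weights as rounds accumulate.
margins = np.array([2.0, 1.5, -1.0])  # last point behaves like label noise
for t in (1, 2, 3):
    w = np.exp(-t * margins)  # weights after t rounds of constant margins
    w /= w.sum()              # normalize to a distribution over examples
print(round(w[-1], 4))        # weight of the noisy point after round 3
```

After only three rounds the single noisy point carries over 99% of the total example weight, which is one way to see why robust losses for boosting are of interest.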
ECCV 2010 (Springer)
Robust Multi-View Boosting with Priors
Many learning tasks for computer vision problems can be described by multiple views or multiple features. These views can be exploited in order to learn from unlabeled data, a.k.a....
NIPS 2004
Optimal Aggregation of Classifiers and Boosting Maps in Functional Magnetic Resonance Imaging
We study a method of optimal data-driven aggregation of classifiers in a convex combination and establish tight upper bounds on its excess risk with respect to a convex loss funct...
Vladimir Koltchinskii, Manel Martínez-Ram...
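As a rough sketch of the convex-aggregation idea in this snippet (not the paper's estimator or its risk bounds, and with made-up data): combine base classifier predictions using nonnegative weights that sum to one, chosen to minimize a convex surrogate loss, here the hinge loss, by projected subgradient descent onto the probability simplex.

```python
import numpy as np

def project_simplex(v):
    # Euclidean projection of v onto {w : w >= 0, sum(w) = 1}
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / np.arange(1, len(v) + 1) > 0)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1), 0.0)

rng = np.random.default_rng(1)
F = np.sign(rng.normal(size=(200, 5)))  # predictions of 5 base classifiers, in {-1, +1}
y = F[:, 0]                             # suppose classifier 0 happens to be perfect here

w = np.full(5, 0.2)                     # start from the uniform convex combination
for _ in range(200):
    margin = y * (F @ w)
    # hinge-loss subgradient: only examples with margin < 1 contribute
    grad = -(F * y[:, None])[margin < 1].sum(axis=0) / len(y)
    w = project_simplex(w - 0.1 * grad)
```

The learned weights remain a convex combination throughout, and mass concentrates on the base classifier most consistent with the labels.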
IJCNN 2007 (IEEE)
Optimizing 0/1 Loss for Perceptrons by Random Coordinate Descent
The 0/1 loss is an important cost function for perceptrons. Nevertheless, it cannot be easily minimized by most existing perceptron learning algorithms. In this paper, we propose...
Ling Li, Hsuan-Tien Lin
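The idea behind this entry can be sketched in a much-simplified form (this is not Li and Lin's exact procedure, which pairs randomized updates with a careful search along each direction; the data and step rule below are invented for illustration): perturb one randomly chosen weight coordinate at a time and accept the step only when the 0/1 loss strictly decreases, so no gradient of the non-differentiable loss is ever needed.

```python
import numpy as np

rng = np.random.default_rng(0)

def zero_one_loss(w, X, y):
    # fraction of examples misclassified by the perceptron sign(X @ w)
    return float(np.mean(np.sign(X @ w) != y))

X = rng.normal(size=(100, 3))
y = np.sign(X @ np.array([1.0, -2.0, 0.5]))  # labels from a hidden linear rule

w = rng.normal(size=3)
initial_loss = zero_one_loss(w, X, y)
for _ in range(1000):
    j = rng.integers(3)            # pick a random coordinate
    trial = w.copy()
    trial[j] += rng.normal() * 0.2 # random signed step on that coordinate
    if zero_one_loss(trial, X, y) < zero_one_loss(w, X, y):
        w = trial                  # accept only strict improvements
final_loss = zero_one_loss(w, X, y)
```

Because steps are accepted only on strict improvement, the training 0/1 loss is non-increasing by construction, which is the property that makes direct descent on this loss workable at all.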
ICML 2004 (IEEE)
Boosting margin based distance functions for clustering
The performance of graph-based clustering methods critically depends on the quality of the distance function used to compute similarities between pairs of neighboring nodes. In t...
Tomer Hertz, Aharon Bar-Hillel, Daphna Weinshall