
On the Equivalence of Weak Learnability and Linear Separability: New Relaxations and Efficient Boosting Algorithms

Boosting algorithms build highly accurate prediction mechanisms from a collection of low-accuracy predictors. To do so, they employ the notion of weak learnability. The starting point of this paper is a proof showing that weak learnability is equivalent to linear separability with ℓ1 margin. While this equivalence is a direct consequence of von Neumann's minimax theorem, we derive it directly using Fenchel duality. We then use our derivation to describe a family of relaxations of the weak-learnability assumption that readily translates into a family of relaxations of linear separability with margin. This alternative perspective sheds new light on known soft-margin boosting algorithms and also enables us to derive several new relaxations of the notion of linear separability. Last, we describe and analyze an efficient boosting framework that can be used to minimize the loss functions derived from our family of relaxations. In particular, we obtain efficient boosting...
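The equivalence the abstract refers to can be stated as a minimax identity. The sketch below uses assumed (standard) notation rather than quoting the paper: A is the m-by-n matrix whose entries are the base-hypothesis predictions signed by the labels, and S^m denotes the probability simplex in R^m. The left-hand side is then the best edge a weak learner can guarantee against any distribution over examples, and the right-hand side is the best ℓ1 margin of a linear combination of the base hypotheses.

```latex
% Sketch of the minimax identity (notation assumed, not verbatim from the paper)
\[
  \min_{d \in S^m} \; \max_{1 \le j \le n} \bigl| (A^{\top} d)_j \bigr|
  \;=\;
  \max_{\|w\|_1 \le 1} \; \min_{1 \le i \le m} (A w)_i ,
  \qquad A_{ij} = y_i \, h_j(x_i).
\]
```

In words: a weak learner that achieves edge γ against every distribution exists exactly when some ℓ1-bounded weighting of the hypotheses separates the data with margin γ; the paper's contribution is to reach this (and its relaxations) through Fenchel duality rather than the minimax theorem.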
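For concreteness, here is a minimal AdaBoost-style loop illustrating the generic boosting template the abstract describes (maintain a distribution over examples, pick the hypothesis with the largest edge, reweight, combine). This is an illustrative sketch, not the algorithm analyzed in the paper; all names (boost, weak_learners, etc.) are hypothetical.

```python
import numpy as np

def boost(X, y, weak_learners, T=50):
    """AdaBoost-style sketch. X: (m, d) examples; y: (m,) labels in {-1, +1};
    weak_learners: callables h(X) -> predictions in {-1, +1}."""
    m = X.shape[0]
    d = np.full(m, 1.0 / m)                 # distribution over examples
    ensemble = []
    for _ in range(T):
        # edge of h under d is sum_i d_i * y_i * h(x_i); pick the largest
        edges = [np.sum(d * y * h(X)) for h in weak_learners]
        j = int(np.argmax(edges))
        gamma = edges[j]
        if gamma <= 0:                      # no hypothesis beats random guessing
            break
        eps = max((1.0 - gamma) / 2.0, 1e-12)  # weighted error; clip to avoid log(inf)
        alpha = 0.5 * np.log((1.0 - eps) / eps)
        preds = weak_learners[j](X)
        d *= np.exp(-alpha * y * preds)     # upweight misclassified examples
        d /= d.sum()
        ensemble.append((alpha, weak_learners[j]))
    return ensemble

def predict(ensemble, X):
    return np.sign(sum(a * h(X) for a, h in ensemble))
```

The point of the sketch is the reweighting step: the distribution d is exactly the dual variable in the minimax identity above, which is what lets the paper trade relaxations of weak learnability for relaxations of the ℓ1 margin.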
Type: Conference
Year: 2008
Where: COLT
Publisher: Springer
Authors: Shai Shalev-Shwartz, Yoram Singer