A Risk Minimization Principle for a Class of Parzen Estimators

This paper explores the use of a Maximal Average Margin (MAM) optimality principle for the design of learning algorithms. It is shown that applying this risk minimization principle yields a class of computationally simple learning machines similar to the classical Parzen window classifier. A direct relation with Rademacher complexities is established, facilitating analysis and providing a notion of certainty of prediction. The analysis is related to Support Vector Machines by means of a margin transformation. The power of the MAM principle is further illustrated by its application to ordinal regression tasks, resulting in an O(n) algorithm able to process large datasets in reasonable time.
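The abstract notes that the MAM principle leads to a decision rule close to the classical Parzen window classifier, i.e. predicting the sign of an average of kernel-weighted labels. The following is a minimal sketch of such a kernel-averaging rule, assuming a Gaussian RBF kernel and labels in {-1, +1}; the function names and exact form are illustrative and not taken from the paper.

import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Gaussian RBF kernel matrix between the rows of X and the rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def parzen_mam_classify(X_train, y_train, X_test, sigma=1.0):
    # Parzen-window-style rule (illustrative): predict the sign of the
    # average kernel-weighted label, sign( (1/n) * sum_i y_i K(x, x_i) ).
    # The magnitude of the average can serve as a rough certainty score.
    K = rbf_kernel(X_test, X_train, sigma)
    scores = K @ y_train / len(y_train)
    return np.sign(scores), scores

Because the rule only averages kernel evaluations against the training points, a single pass over the data suffices, which is consistent with the O(n) behaviour claimed in the abstract.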
Type Conference
Year 2007
Where NIPS
Authors Kristiaan Pelckmans, Johan A. K. Suykens, Bart De Moor