Extended Input Space Support Vector Machine

In some applications, the probability of error of a given classifier is too high for its practical application, but we are allowed to gather more independent test samples from the same class to reduce the probability of error of the final decision. From the point of view of hypothesis testing, the solution is given by the Neyman–Pearson lemma. However, there is no equivalent result to the Neyman–Pearson lemma when the likelihoods are unknown, and we are given a training dataset. In this brief, we explore two alternatives. First, we combine the soft (probabilistic) outputs of a given classifier to produce a consensus labeling for K test samples. In the second approach, we build a new classifier that directly computes the label for K test samples. For this second approach, we need to define an extended input space training set and incorporate the known symmetries in the classifier. This latter approach gives more accurate results, as it only requires an accurate classificat...
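As a rough illustration of the two approaches described in the abstract (not the paper's exact formulation), the sketch below combines the soft outputs of a probabilistic classifier for K test samples by summing log-posteriors, and builds an extended input space training set by concatenating K same-class samples, encoding the ordering symmetry through permutation augmentation. The use of scikit-learn's SVC, the helper names, and the choice of K are illustrative assumptions.

```python
import numpy as np
from itertools import permutations
from sklearn.svm import SVC


def consensus_label(clf, X_group):
    """Approach 1 (sketch): consensus label for K samples assumed to share a class.

    Summing log-posteriors treats the K samples as independent, mirroring a
    naive likelihood-combination rule.
    """
    log_p = np.log(clf.predict_proba(X_group) + 1e-12)  # shape (K, n_classes)
    return clf.classes_[np.argmax(log_p.sum(axis=0))]


def extend_training_set(X, y, K=2):
    """Approach 2 (sketch): extended input space training set.

    Concatenates K same-class samples into one extended input; the known
    symmetry (the joint label does not depend on the order of the K samples)
    is incorporated here by adding every permutation of each group.
    """
    X_ext, y_ext = [], []
    for label in np.unique(y):
        Xc = X[y == label]
        n = len(Xc) - (len(Xc) % K)          # drop the leftover samples
        for i in range(0, n, K):
            group = Xc[i:i + K]
            for perm in permutations(range(K)):
                X_ext.append(group[list(perm)].ravel())
                y_ext.append(label)
    return np.asarray(X_ext), np.asarray(y_ext)


# Hypothetical usage, assuming X_train, y_train, X_test are NumPy arrays:
# clf = SVC(probability=True).fit(X_train, y_train)
# label = consensus_label(clf, X_test[:2])            # approach 1, K = 2
# X_ext, y_ext = extend_training_set(X_train, y_train, K=2)
# clf_ext = SVC().fit(X_ext, y_ext)                   # approach 2
```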
Added: 15 May 2011
Updated: 15 May 2011
Type: Journal
Year: 2011
Where: TNN
Authors: Ricardo Santiago-Mozos, Fernando Pérez-Cruz, Antonio Artés-Rodríguez