Theoretically well-founded, Support Vector Machines (SVMs) are well known to be suited for efficiently solving classification problems. Although improved generalization is the main goal of this type of learning machine, recent works have tried to use them differently. For instance, feature selection has recently been viewed as an indirect consequence of the SVM approach. In this paper, we also exploit SVMs differently from their original purpose. We investigate them as a data-reduction technique, useful for improving case-based learning algorithms, which are sensitive to noise and computationally expensive. Adopting the margin-maximization principle for reducing the structural risk, our strategy not only eliminates irrelevant instances but also improves the performance of the standard k-Nearest-Neighbor classifier. A wide comparative study on several benchmarks from the UCI repository shows the utility of our approach.
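The idea sketched in the abstract can be illustrated as follows. This is a minimal sketch, not the authors' actual method: it assumes scikit-learn's `SVC` and `KNeighborsClassifier`, uses a linear kernel and synthetic data for concreteness, and simply keeps the SVM's support vectors as the reduced instance set on which k-NN is trained.

```python
# Hedged sketch: an SVM as a data-reduction step before k-NN.
# Assumes scikit-learn; dataset and hyperparameters are illustrative only.
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic stand-in for a UCI benchmark.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# Fit a margin-maximizing SVM; its support vectors are the instances
# that define the decision boundary.
svm = SVC(kernel="linear", C=1.0).fit(X, y)
idx = svm.support_  # indices of the retained (support-vector) instances

# Train the k-Nearest-Neighbor classifier on the reduced set only.
knn = KNeighborsClassifier(n_neighbors=3).fit(X[idx], y[idx])
print(f"kept {len(idx)} of {len(X)} instances")
```

The intuition is that instances far from the margin contribute little to the decision boundary, so discarding them shrinks the case base while preserving (or even improving) k-NN accuracy on noisy data.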