Sciweavers

55 search results - page 3 / 11
» On Relevant Dimensions in Kernel Feature Spaces
COLT
1999
Springer
Covering Numbers for Support Vector Machines
Support vector (SV) machines are linear classifiers that use the maximum margin hyperplane in a feature space defined by a kernel function. Until recently, the only bounds on th...
Ying Guo, Peter L. Bartlett, John Shawe-Taylor, Ro...
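For readers unfamiliar with the setup this abstract sketches, here is a minimal illustration of a maximum-margin classifier in a kernel-defined feature space, using scikit-learn's SVC; the synthetic dataset, the RBF kernel, and the parameter values are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: a maximum-margin classifier in an RBF-kernel feature space.
# Assumes scikit-learn is installed; dataset and hyperparameters are illustrative.
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale")  # the kernel implicitly defines the feature space
clf.fit(X, y)
print("support vectors per class:", clf.n_support_)  # the margin is determined by these points
print("training accuracy:", clf.score(X, y))
```

The covering-number bounds the paper studies concern the capacity of exactly this kind of classifier, as a function of the kernel and the margin.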
BDA
2007
Hyperplane Queries in a Feature-Space M-tree for Speeding up Active Learning
In content-based retrieval, relevance feedback (RF) is a notable method for reducing the “semantic gap” between the low-level features describing the content and the usually...
Michel Crucianu, Daniel Estevez, Vincent Oria, Jea...
ICANN
2005
Springer
Smooth Bayesian Kernel Machines
In this paper, we consider the possibility of obtaining a kernel machine that is sparse in feature space and smooth in output space. Smooth in output space implies that t...
Rutger W. ter Borg, Léon J. M. Rothkrantz
MLDM
1999
Springer
Independent Feature Analysis for Image Retrieval
Content-based image retrieval methods based on the Euclidean metric expect the feature space to be isotropic. They suffer from unequal differential relevance of features in comput...
Jing Peng, Bir Bhanu
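To make the isotropy point concrete, the sketch below contrasts a plain Euclidean distance with a per-feature weighted one; the weights and vectors are hypothetical, and this is only a generic illustration of unequal feature relevance, not the independent feature analysis method the paper proposes.

```python
import numpy as np

def weighted_euclidean(x, y, w):
    """Euclidean distance with per-feature weights w; reduces to the plain
    (isotropic) metric when all weights are equal."""
    return float(np.sqrt(np.sum(w * (x - y) ** 2)))

x = np.array([0.2, 0.8, 0.5])
y = np.array([0.3, 0.1, 0.5])
uniform = np.ones(3)                   # isotropic assumption: every feature counts equally
relevance = np.array([0.1, 2.0, 0.9])  # hypothetical per-feature relevance weights
print(weighted_euclidean(x, y, uniform), weighted_euclidean(x, y, relevance))
```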
NIPS
2000
Text Classification using String Kernels
We propose a novel approach for categorizing text documents based on the use of a special kernel. The kernel is an inner product in the feature space generated by all subsequences...
Huma Lodhi, John Shawe-Taylor, Nello Cristianini, ...
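As an illustration of the kernel this abstract describes, below is a small sketch of a gap-weighted subsequence kernel computed by the standard recursion; the decay parameter lam, the function names, and the example strings are my own choices, not code from the paper.

```python
def ssk(s, t, n, lam=0.5):
    """Gap-weighted subsequence kernel K_n(s, t): sums lam**(span length)
    over all common subsequences of length n (naive recursive form)."""
    assert n >= 1

    def k_prime(i, s, t):
        # Auxiliary kernel: subsequence matches weighted to the end of both strings.
        if i == 0:
            return 1.0
        if min(len(s), len(t)) < i:
            return 0.0
        x = s[-1]
        total = lam * k_prime(i, s[:-1], t)
        for j in range(len(t)):
            if t[j] == x:
                total += k_prime(i - 1, s[:-1], t[:j]) * lam ** (len(t) - j + 1)
        return total

    def k(n, s, t):
        if min(len(s), len(t)) < n:
            return 0.0
        x = s[-1]
        total = k(n, s[:-1], t)
        for j in range(len(t)):
            if t[j] == x:
                total += k_prime(n - 1, s[:-1], t[:j]) * lam ** 2
        return total

    return k(n, s, t)

print(ssk("cat", "car", n=2))  # 0.0625 == lam**4: the only shared length-2 subsequence is "ca"
```

In practice the kernel is usually normalized, K(s, t) / sqrt(K(s, s) * K(t, t)), so that string length does not dominate the similarity.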