Sciweavers

576 search results - page 37 / 116
» Structured metric learning for high dimensional problems
IJCAI
2007
A Subspace Kernel for Nonlinear Feature Extraction
Kernel-based nonlinear Feature Extraction (KFE), or dimensionality reduction, is a widely used pre-processing step in pattern classification and data mining tasks. Given a positive...
Mingrui Wu, Jason D. R. Farquhar
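As a point of reference for kernel-based nonlinear feature extraction, here is a minimal kernel-PCA sketch in NumPy. It is a generic illustration only, not the subspace kernel proposed in this paper, and the RBF bandwidth gamma is an arbitrary placeholder.

import numpy as np

def rbf_kernel(X, gamma):
    # Pairwise squared Euclidean distances, then the RBF (Gaussian) kernel matrix.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=0.1):
    # Generic kernel PCA: center the kernel matrix in feature space,
    # eigendecompose, and return the training-point projections.
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    vals, vecs = np.linalg.eigh(Kc)              # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]  # keep the top components
    vals, vecs = vals[idx], vecs[:, idx]
    # Projection of training point i onto component j is sqrt(lambda_j) * v_ij.
    return vecs * np.sqrt(np.maximum(vals, 1e-12))

Z = kernel_pca(np.random.default_rng(0).normal(size=(200, 10)))
print(Z.shape)  # (200, 2)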
JMLR
2002
The Subspace Information Criterion for Infinite Dimensional Hypothesis Spaces
A central problem in learning is the selection of an appropriate model. This is typically done by estimating the unknown generalization errors of a set of models to be selected from a...
Masashi Sugiyama, Klaus-Robert Müller
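To make the "estimate each model's generalization error, then pick the best" procedure concrete, here is a small k-fold cross-validation sketch. It uses a standard error estimator rather than the subspace information criterion itself, and ridge regression over a hypothetical grid of regularization strengths stands in for the candidate models.

import numpy as np

def kfold_mse(fit, predict, X, y, k=5, seed=0):
    # Estimate one model's generalization error by k-fold cross-validation.
    idx = np.random.default_rng(seed).permutation(len(X))
    folds = np.array_split(idx, k)
    errs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        w = fit(X[train], y[train])
        errs.append(np.mean((predict(w, X[test]) - y[test]) ** 2))
    return float(np.mean(errs))

def make_ridge(lam):
    # Candidate model: closed-form ridge regression with strength lam.
    fit = lambda X, y: np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    predict = lambda w, X: X @ w
    return fit, predict

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=100)
scores = {lam: kfold_mse(*make_ridge(lam), X, y) for lam in (1e-3, 1e-1, 1.0, 10.0)}
print(min(scores, key=scores.get))  # candidate with the lowest estimated error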
NN (Neural Networks)
2010
Springer
Dimensionality reduction for density ratio estimation in high-dimensional spaces
The ratio of two probability density functions is becoming a quantity of interest these days in the machine learning and data mining communities since it can be used for various d...
Masashi Sugiyama, Motoaki Kawanabe, Pui Ling Chui
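One common way to estimate a ratio of two densities from samples alone is the probabilistic-classification trick: train a classifier to separate the two sample sets and convert its posterior into a ratio. The sketch below (plain logistic regression in NumPy) illustrates that generic idea only; it is not the dimensionality-reduction method of this paper.

import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    # Gradient-descent logistic regression; y is 1 for numerator samples, 0 for denominator.
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def density_ratio(w, X, n_num, n_den):
    # r(x) = p_num(x) / p_den(x) = (n_den / n_num) * P(num | x) / P(den | x)
    Xb = np.hstack([X, np.ones((len(X), 1))])
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    return (n_den / n_num) * p / (1.0 - p)

rng = np.random.default_rng(0)
X_num = rng.normal(loc=0.5, size=(500, 2))      # samples from the numerator density
X_den = rng.normal(loc=0.0, size=(500, 2))      # samples from the denominator density
w = fit_logistic(np.vstack([X_num, X_den]),
                 np.concatenate([np.ones(500), np.zeros(500)]))
print(density_ratio(w, X_den[:5], 500, 500))    # estimated ratio at five test points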
EMMCVPR
2011
Springer
High Resolution Segmentation of Neuronal Tissues from Low Depth-Resolution EM Imagery
The challenge of recovering the topology of massive neuronal circuits can potentially be met by high-throughput Electron Microscopy (EM) imagery. Segmenting a 3-dimensional stack o...
Daniel Glasner, Tao Hu, Juan Nunez-Iglesias, Lou S...
ICPR
2006
IEEE
Dimensionality Reduction with Adaptive Kernels
A kernel determines the inductive bias of a learning algorithm on a specific data set, and it is beneficial to design a specific kernel for a given data set. In this work, we propo...
Shuicheng Yan, Xiaoou Tang
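A very small example of adapting a kernel to the data at hand is the median heuristic for the RBF bandwidth, sketched below; it is a common rule of thumb, not the adaptive-kernel scheme proposed in this paper.

import numpy as np

def median_heuristic_gamma(X):
    # Data-adaptive RBF bandwidth: set gamma from the median pairwise
    # squared distance (one common convention among several).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    med = np.median(d2[np.triu_indices_from(d2, k=1)])
    return 1.0 / (2.0 * med)

X = np.random.default_rng(0).normal(size=(300, 8))
gamma = median_heuristic_gamma(X)
sq = np.sum(X**2, axis=1)
K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))  # adapted RBF kernel matrix
print(gamma, K.shape)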