Sciweavers

10 search results for "Deep Bottleneck Classifiers in Supervised Dimension Reduction" (page 2 of 2)
SAC 2006 (ACM)
The impact of sample reduction on PCA-based feature extraction for supervised learning
“The curse of dimensionality” is pertinent to many learning algorithms, and it denotes the drastic rise of computational complexity and classification error in high dimension...
Mykola Pechenizkiy, Seppo Puuronen, Alexey Tsymbal
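As an illustration of the kind of pipeline this entry studies, the following is a minimal sketch (not the authors' experimental setup) of PCA-based feature extraction fitted on a reduced sample of the training data, followed by a supervised classifier; the dataset, subsample fraction, and component count are illustrative assumptions.

# Minimal sketch: fit PCA on a random subsample of the training set,
# then train a classifier on the extracted features (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=100, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Sample reduction: fit PCA on only a fraction of the training instances.
rng = np.random.default_rng(0)
subset = rng.choice(len(X_train), size=len(X_train) // 5, replace=False)
pca = PCA(n_components=10).fit(X_train[subset])

clf = KNeighborsClassifier().fit(pca.transform(X_train), y_train)
print("test accuracy:", clf.score(pca.transform(X_test), y_test))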
CVPR 2010 (IEEE)
Sufficient Dimensionality Reduction for Visual Sequence Classification
When classifying high-dimensional sequence data, traditional methods (e.g., HMMs, CRFs) may require large amounts of training data to avoid overfitting. In such cases dimensional...
Alex Shyr, Raquel Urtasun, Michael Jordan
IRI 2007 (IEEE)
Enhancing Text Analysis via Dimensionality Reduction
Many applications require analyzing vast amounts of textual data, but the size and inherent noise of such data can make processing very challenging. One approach to these issues i...
David G. Underhill, Luke McDowell, David J. Marche...
BMCBI 2010
An adaptive optimal ensemble classifier via bagging and rank aggregation with applications to high dimensional data
Background: Generally speaking, different classifiers tend to work well for certain types of data; conversely, it is usually not known a priori which algorithm will be optimal ...
Susmita Datta, Vasyl Pihur, Somnath Datta
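To make the idea of combining bagging with rank-based aggregation concrete, here is a minimal sketch, not the authors' algorithm: on each bootstrap round, candidate classifiers are ranked by out-of-bag accuracy, and predictions are combined by rank-weighted voting. The dataset, candidate models, and number of rounds are illustrative assumptions.

# Minimal sketch: bag several candidate classifiers, rank them by
# out-of-bag accuracy on each bootstrap round, and aggregate their
# predictions by rank-weighted voting (illustrative only).
import numpy as np
from sklearn.base import clone
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=50, random_state=0)
candidates = [LogisticRegression(max_iter=1000),
              DecisionTreeClassifier(random_state=0),
              GaussianNB()]

rng = np.random.default_rng(0)
ensemble = []  # (rank weight, fitted model) pairs accumulated over rounds
for _ in range(25):
    boot = rng.integers(0, len(X), len(X))
    oob = np.setdiff1d(np.arange(len(X)), boot)
    models = [clone(c).fit(X[boot], y[boot]) for c in candidates]
    scores = [m.score(X[oob], y[oob]) for m in models]
    ranks = np.argsort(np.argsort(scores)) + 1  # 1 = worst, len(candidates) = best
    ensemble.extend(zip(ranks, models))

def predict(X_new):
    # Rank-weighted vote over every fitted model from every round.
    votes = np.zeros((len(X_new), 2))
    for w, m in ensemble:
        votes[np.arange(len(X_new)), m.predict(X_new)] += w
    return votes.argmax(axis=1)

print("training accuracy:", (predict(X) == y).mean())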
JMLR 2008
Hit Miss Networks with Applications to Instance Selection
In supervised learning, a training set consisting of labeled instances is used by a learning algorithm for generating a model (classifier) that is subsequently employed for decidi...
Elena Marchiori