Sciweavers

396 search results (page 32 of 80) for "Lossy Reduction for Very High Dimensional Data"
ICIP 2003 (IEEE)
Reversible integer KLT for progressive-to-lossless compression of multiple component images
In this paper, we present a method for a reversible integer implementation of the KLT for multiple-component image compression. The progressive-to-lossless compression algorithm emplo...
P. Hao, Q. Shi
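Below is a minimal sketch of spectral decorrelation with a plain floating-point KLT (PCA on the inter-component covariance), assuming a generic (H, W, C) image array; the paper's reversible integer factorization of the transform is not reproduced here, and all names and shapes are illustrative.

# Minimal sketch of a (non-reversible, floating-point) KLT for spectral
# decorrelation of a multi-component image. The paper's contribution is an
# integer-reversible factorization of this transform; that factorization
# is NOT shown here. Array shapes and names are illustrative.
import numpy as np

def klt_decorrelate(image):
    """image: (H, W, C) array of C spectral components."""
    H, W, C = image.shape
    X = image.reshape(-1, C).astype(np.float64)   # one row per pixel
    mean = X.mean(axis=0)
    Xc = X - mean
    cov = Xc.T @ Xc / Xc.shape[0]                 # C x C inter-component covariance
    eigvals, eigvecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]             # strongest component first
    basis = eigvecs[:, order]                     # KLT basis (columns)
    Y = Xc @ basis                                # decorrelated components
    return Y.reshape(H, W, C), basis, mean

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((64, 64, 4))                 # toy 4-component image
    decorrelated, basis, mean = klt_decorrelate(img)
    print(decorrelated.shape, basis.shape)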
VLDB 2005 (ACM)
On k-Anonymity and the Curse of Dimensionality
In recent years, the wide availability of personal data has made the problem of privacy-preserving data mining an important one. A number of methods have recently been proposed fo...
Charu C. Aggarwal
SAC 2006 (ACM)
The impact of sample reduction on PCA-based feature extraction for supervised learning
“The curse of dimensionality” is pertinent to many learning algorithms, and it denotes the drastic rise of computational complexity and classification error in high dimension...
Mykola Pechenizkiy, Seppo Puuronen, Alexey Tsymbal
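Below is a minimal sketch of PCA-based feature extraction in which the principal components are estimated on a reduced (subsampled) training set before a classifier is trained on the projected features; the dataset, sample fraction, and classifier are illustrative choices, not those of the paper.

# Minimal sketch: fit PCA on a reduced training sample, then use the
# extracted features for supervised learning. All parameter choices below
# are illustrative.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Sample reduction: estimate the PCA basis on only a fraction of the training set.
rng = np.random.default_rng(0)
frac = 0.2
idx = rng.choice(len(X_train), size=int(frac * len(X_train)), replace=False)

pca = PCA(n_components=20).fit(X_train[idx])      # basis from the reduced sample
Z_train, Z_test = pca.transform(X_train), pca.transform(X_test)

clf = KNeighborsClassifier().fit(Z_train, y_train)
print("accuracy with PCA fitted on %.0f%% of the training data: %.3f"
      % (100 * frac, clf.score(Z_test, y_test)))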
IJCV 2008
Fast Transformation-Invariant Component Analysis
Software and additional illustrations: http://www.psi.utoronto.ca/anitha/fastTCA.htm. Dimensionality reduction techniques such as principal component analysis and factor analysis are...
Anitha Kannan, Nebojsa Jojic, Brendan J. Frey
NIPS 2004
Two-Dimensional Linear Discriminant Analysis
Linear Discriminant Analysis (LDA) is a well-known scheme for feature extraction and dimension reduction. It has been used widely in many applications involving high-dimensional d...
Jieping Ye, Ravi Janardan, Qi Li
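Below is a minimal sketch of a simplified, one-sided 2D-LDA that learns a single right projection for image matrices from matrix-level between- and within-class scatters; the paper's 2DLDA alternates between two projection matrices, which is not reproduced here, and the toy data are illustrative.

# Minimal sketch of one-sided 2D-LDA: image matrices are projected on the
# right by W, chosen from matrix-level between-/within-class scatters.
# Simplified, non-iterative variant for illustration only.
import numpy as np

def two_d_lda(images, labels, n_components=2):
    """images: (N, r, c) array of image matrices; labels: (N,) class ids."""
    images = np.asarray(images, dtype=np.float64)
    labels = np.asarray(labels)
    global_mean = images.mean(axis=0)

    c = images.shape[2]
    S_b = np.zeros((c, c))                         # between-class scatter (c x c)
    S_w = np.zeros((c, c))                         # within-class scatter (c x c)
    for k in np.unique(labels):
        Ak = images[labels == k]
        Mk = Ak.mean(axis=0)
        D = Mk - global_mean
        S_b += len(Ak) * D.T @ D
        for A in Ak:
            E = A - Mk
            S_w += E.T @ E

    # Leading eigenvectors of S_w^{-1} S_b give the right projection W (c x d).
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_w) @ S_b)
    order = np.argsort(eigvals.real)[::-1][:n_components]
    W = eigvecs[:, order].real
    return np.stack([A @ W for A in images]), W    # projected images: (N, r, d)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    labels = np.repeat(np.arange(3), 20)           # 3 toy classes, 20 images each
    imgs = rng.normal(size=(60, 16, 16)) + labels[:, None, None]
    projected, W = two_d_lda(imgs, labels)
    print(projected.shape, W.shape)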