Sciweavers

219 search results - page 24 / 44
» Learning the Dimensionality of Hidden Variables
AAAI
2012
Sparse Probabilistic Relational Projection
Probabilistic relational PCA (PRPCA) can learn a projection matrix to perform dimensionality reduction for relational data. However, the results learned by PRPCA lack interpretabi...
Wu-Jun Li, Dit-Yan Yeung
ICML
2010
IEEE
Local Minima Embedding
Dimensionality reduction is a commonly used step in many algorithms for visualization, classification, clustering and modeling. Most dimensionality reduction algorithms find a low...
Minyoung Kim, Fernando De la Torre
MLDM
2009
Springer
A Two-fold PCA-Approach for Inter-Individual Recognition of Emotions in Natural Walking
This paper describes the recognition of emotions of an unknown person during natural walking. As gait data is redundant, high-dimensional, and variable, effective feature extraction is ...
Michelle Karg, Robert Jenke, Kolja Kühnlenz, ...
NIPS
2001
Unsupervised Learning of Human Motion Models
This paper presents an unsupervised learning algorithm that can derive the probabilistic dependence structure of parts of an object (a moving human body in our examples) automatic...
Yang Song, Luis Goncalves, Pietro Perona
JMLR
2008
Maximal Causes for Non-linear Component Extraction
We study a generative model in which hidden causes combine competitively to produce observations. Multiple active causes combine to determine the value of an observed variable thr...
Jörg Lücke, Maneesh Sahani