Regression on fixed-rank positive semidefinite matrices: a Riemannian approach

The paper addresses the problem of learning a regression model parameterized by a fixed-rank positive semidefinite matrix. The focus is on the nonlinear nature of the search space and on scalability to high-dimensional problems. The mathematical developments rely on the theory of gradient descent algorithms adapted to the Riemannian geometry that underlies the set of fixed-rank positive semidefinite matrices. In contrast with previous contributions in the literature, no restrictions are imposed on the range space of the learned matrix. The resulting algorithms maintain linear complexity in the problem size and enjoy important invariance properties. We apply the proposed algorithms to the problem of learning a distance function parameterized by a positive semidefinite matrix. Good performance is observed on classical benchmarks.
Gilles Meyer, Silvere Bonnabel, Rodolphe Sepulchre
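
As a rough illustration of the factorization idea described in the abstract, the sketch below fits a Mahalanobis distance d_W(x, y) = (x - y)^T W (x - y), with W = G G^T of fixed rank, by plain gradient descent on the factor G. This is a minimal sketch, not the authors' algorithm: the paper's methods use Riemannian metrics and retractions adapted to the geometry of fixed-rank positive semidefinite matrices, whereas here a simple normalized Euclidean step on G is taken, and all data, dimensions, and step sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, n = 20, 3, 500                      # ambient dimension, rank, number of training pairs

# Hypothetical synthetic data: a ground-truth rank-r metric and pairs with target distances.
G_true = rng.standard_normal((d, r))
X, Y = rng.standard_normal((n, d)), rng.standard_normal((n, d))
diff = X - Y
targets = np.einsum('ij,jk,ik->i', diff, G_true @ G_true.T, diff)

# Gradient descent on the factor G of W = G G^T (a plain normalized Euclidean step;
# the paper's Riemannian algorithms replace this with geometry-aware updates).
G = rng.standard_normal((d, r))
step = 0.02
for it in range(300):
    preds = np.einsum('ij,jk,ik->i', diff, G @ G.T, diff)   # predicted squared distances
    resid = preds - targets
    # Gradient of (0.5/n) * sum_i resid_i^2 w.r.t. G:  (2/n) * sum_i resid_i * diff_i diff_i^T G
    grad = 2.0 * (diff.T * resid) @ (diff @ G) / n
    G -= step * grad / (np.linalg.norm(grad) + 1e-12)       # bounded step to keep the sketch stable
    if it % 100 == 0:
        print(f"iter {it:3d}  mean squared residual {np.mean(resid**2):.3f}")

W = G @ G.T   # learned fixed-rank positive semidefinite matrix
```

Working directly with the factor G keeps the per-iteration cost linear in the dimension d and guarantees that W stays positive semidefinite of rank at most r throughout the iterations.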
Added: 22 Mar 2011
Updated: 22 Mar 2011
Type: Journal
Year: 2010
Where: CoRR
Authors: Gilles Meyer, Silvere Bonnabel, Rodolphe Sepulchre