
Harmonic mean for subspace selection

Under the homoscedastic Gaussian assumption, it has been shown that Fisher's linear discriminant analysis (FLDA) suffers from the class separation problem when the dimensionality of the subspace selected by FLDA is strictly less than the class number minus 1, i.e., the projection to a subspace tends to merge close class pairs. A recent result shows that maximizing the geometric mean of the Kullback-Leibler (KL) divergences of class pairs can significantly reduce this problem. In this paper, to further reduce the class separation problem, the harmonic mean replaces the geometric mean for subspace selection. The new method is termed maximization of the harmonic mean of all pairs of symmetric KL divergences (MHMD). Because MHMD is invariant to rotational transformations, an efficient optimization procedure can be conducted on the Grassmann manifold. Thorough empirical studies demonstrate the effectiveness of the harmonic mean in dealing with the class separation problem.
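The objective the abstract describes can be sketched in a few lines. The snippet below is an illustrative reconstruction, not the authors' code: under a shared (homoscedastic) covariance, the symmetric KL divergence between two projected class Gaussians reduces to the squared Mahalanobis distance between the projected means, and the MHMD criterion is the harmonic mean of these pairwise divergences. The function names are hypothetical; the paper optimizes this criterion over the Grassmann manifold, while here we only evaluate it for a given projection `W`.

```python
import numpy as np

def symmetric_kl(mu_i, mu_j, sigma, W):
    """Symmetric KL divergence between two homoscedastic Gaussians after
    projection onto the subspace spanned by the columns of W.
    With a shared covariance, the trace and log-det terms cancel, leaving
    the squared Mahalanobis distance between the projected means."""
    d = W.T @ (mu_i - mu_j)      # projected mean difference
    S = W.T @ sigma @ W          # projected shared covariance
    return float(d @ np.linalg.solve(S, d))

def harmonic_mean_objective(means, sigma, W):
    """Harmonic mean of the symmetric KL divergences over all class pairs.
    The harmonic mean is dominated by the SMALLEST pairwise divergence,
    so maximizing it pushes the closest class pairs apart."""
    c = len(means)
    inv_sum = sum(1.0 / symmetric_kl(means[i], means[j], sigma, W)
                  for i in range(c) for j in range(i + 1, c))
    n_pairs = c * (c - 1) / 2
    return n_pairs / inv_sum
```

Note that the objective is unchanged if `W` is replaced by `W @ R` for any rotation `R` of the subspace, which is the rotational invariance that permits optimization on the Grassmann manifold.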
Wei Bian, Dacheng Tao
Added 30 May 2010
Updated 30 May 2010
Type Conference
Year 2008
Where ICPR
Authors Wei Bian, Dacheng Tao