
ECML 2005, Springer

Fitting the Smallest Enclosing Bregman Ball

Finding a point which minimizes the maximal distortion with respect to a dataset is an important estimation problem that has recently received growing attention in machine learning, with the advent of one-class classification. In this paper, we study the problem from a general standpoint, and suppose that the distortion is a Bregman divergence, without restriction. Applications of this formulation can be found in machine learning, statistics, signal processing and computational geometry. We propose two theoretically founded generalizations of the popular smallest enclosing ball approximation algorithm for Euclidean spaces introduced by Bădoiu and Clarkson in 2002. Experiments clearly display the advantages of being able to tune the divergence depending on the data's domain. As an additional result, we unveil a useful bijection between Bregman divergences and a family of popular averages that includes the arithmetic, geometric, harmonic and power means.
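For context only, here is a minimal sketch (not from the paper): a Bregman divergence is built from a strictly convex generator F as D_F(x, y) = F(x) − F(y) − ⟨∇F(y), x − y⟩, and the Euclidean Bădoiu–Clarkson procedure that the paper generalizes repeatedly pulls a candidate centre toward the currently farthest point with a shrinking step. The NumPy snippet below illustrates both ideas under these assumptions; the function names and the choice of generator are illustrative, and the paper's actual Bregman-ball updates differ.

```python
import numpy as np

def bregman_divergence(F, grad_F, x, y):
    """Generic Bregman divergence D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>."""
    return F(x) - F(y) - np.dot(grad_F(y), x - y)

def smallest_enclosing_ball_bc(points, eps=0.01):
    """Badoiu-Clarkson (2002) style approximation of the smallest enclosing
    Euclidean ball: pull the centre toward the farthest point with step
    1/(i+1). Returns an approximate centre and covering radius."""
    c = points[0].astype(float)                  # arbitrary initial centre
    for i in range(1, int(np.ceil(1.0 / eps ** 2)) + 1):
        dists = np.linalg.norm(points - c, axis=1)
        farthest = points[np.argmax(dists)]      # current worst-covered point
        c += (farthest - c) / (i + 1)            # shrinking step toward it
    return c, np.linalg.norm(points - c, axis=1).max()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.normal(size=(500, 2))
    centre, radius = smallest_enclosing_ball_bc(pts)
    # Example generator: F(z) = 0.5 * ||z||^2, whose Bregman divergence is
    # half the squared Euclidean distance.
    F = lambda z: 0.5 * np.dot(z, z)
    grad_F = lambda z: z
    print(centre, radius, bregman_divergence(F, grad_F, pts[0], centre))
```

Choosing a different generator F (e.g., the negative entropy, which yields the Kullback-Leibler divergence) is what lets the divergence be tuned to the data's domain, as the abstract emphasizes.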
Richard Nock, Frank Nielsen
Added 27 Jun 2010
Updated 27 Jun 2010
Type Conference
Year 2005
Where ECML
Authors Richard Nock, Frank Nielsen