Agnostic active learning

We state and analyze the first active learning algorithm that works in the presence of arbitrary forms of noise. The algorithm, A2 (for Agnostic Active), relies only upon the assumption that the samples are drawn i.i.d. from a fixed distribution. A2 achieves an exponential improvement (i.e., it requires only O(ln(1/ε)) samples to find an ε-optimal classifier) over the usual sample complexity of supervised learning, for several settings considered before in the realizable case. These include learning threshold classifiers and learning homogeneous linear separators with respect to an input distribution that is uniform over the unit sphere.
Key words: Active Learning, Agnostic Setting, Sample Complexity, Linear Separators, Uniform Distribution, Exponential Improvement.
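To make the "exponential improvement" concrete, here is a minimal sketch of the threshold-learning setting the abstract mentions, in its noise-free (realizable) special case rather than the agnostic setting A2 actually handles: with labels queried actively, binary search locates an unknown threshold on [0, 1] to accuracy ε with only O(ln(1/ε)) label queries, versus the roughly 1/ε labeled samples passive supervised learning needs. The function names below are illustrative, not from the paper.

```python
import math

def learn_threshold(query_label, epsilon):
    """Actively learn an unknown threshold t on [0, 1].

    `query_label(x)` returns the label of point x (0 if x < t, else 1).
    With noiseless labels, binary search maintains an interval known to
    contain t and halves it per query, so O(log(1/epsilon)) queries
    suffice to pin t down within epsilon.
    """
    lo, hi = 0.0, 1.0
    queries = 0
    while hi - lo > epsilon:
        mid = (lo + hi) / 2.0
        if query_label(mid) == 1:   # mid is at or above the threshold
            hi = mid
        else:                       # mid is strictly below the threshold
            lo = mid
        queries += 1
    return (lo + hi) / 2.0, queries

# Example: true threshold 0.3, target accuracy 1e-3.
t_true = 0.3
estimate, n_queries = learn_threshold(lambda x: int(x >= t_true), 1e-3)
# n_queries is ceil(log2(1 / 1e-3)) = 10, far fewer than the ~1000
# labeled samples a passive learner would need for the same accuracy.
```

The contribution of A2 is showing that improvements of this flavor survive when the label oracle is arbitrarily noisy, where naive binary search breaks down.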
Maria-Florina Balcan, Alina Beygelzimer, John Langford
Added 17 Nov 2009
Updated 17 Nov 2009
Type Conference
Year 2006
Where ICML
Authors Maria-Florina Balcan, Alina Beygelzimer, John Langford