Sciweavers

Search results for "On the Complexity of Approximating the VC Dimension" (134 results, page 4 of 27)
JALC 2007
Learning Unary Automata
We determine the complexity of learning problems for unary regular languages. We begin by investigating the minimum consistent DFA (resp. NFA) problem, which is known not to be approximable...
Gregor Gramlich, Ralf Herrmann
CORR 2008, Springer
Statistical Learning of Arbitrary Computable Classifiers
Statistical learning theory chiefly studies restricted hypothesis classes, particularly those with finite Vapnik-Chervonenkis (VC) dimension. The fundamental quantity of interest is...
David Soloveichik
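
For reference, the Vapnik-Chervonenkis (VC) dimension mentioned in this abstract and in the search query has the standard textbook definition, stated here for convenience rather than quoted from any of the listed papers: a hypothesis class H over a domain X shatters a finite set S if every binary labeling of S is realized by some hypothesis in H, and

\mathrm{VCdim}(\mathcal{H}) = \max\{\, d \in \mathbb{N} : \exists\, S \subseteq X,\ |S| = d,\ \mathcal{H} \text{ shatters } S \,\}

taken to be infinite if arbitrarily large sets are shattered.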
NIPS 2007
A general agnostic active learning algorithm
We present a simple, agnostic active learning algorithm that works for any hypothesis class of bounded VC dimension, and any data distribution. Our algorithm extends a scheme of C...
Sanjoy Dasgupta, Daniel Hsu, Claire Monteleoni
COMPGEOM 2007, ACM
On approximate halfspace range counting and relative epsilon-approximations
The paper consists of two major parts. In the first part, we re-examine relative ε-approximations, previously studied in [12, 13, 18, 25], and their relation to certain geometric p...
Boris Aronov, Sariel Har-Peled, Micha Sharir
COLT 2000, Springer
Model Selection and Error Estimation
We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization...
Peter L. Bartlett, Stéphane Boucheron, Gábor Lugosi
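
The penalized empirical loss minimization mentioned above has a generic form worth recalling; the sketch below shows only the standard framework, with the paper's specific data-based penalties left to the paper itself. Given a sample (X_1, Y_1), ..., (X_n, Y_n), a loss \ell, and a collection of model classes F_1, F_2, ..., one selects

\hat{f} = \arg\min_{k \ge 1}\ \min_{f \in \mathcal{F}_k} \left( \frac{1}{n} \sum_{i=1}^{n} \ell\big(f(X_i), Y_i\big) + \mathrm{pen}_n(k) \right)

where \mathrm{pen}_n(k) is a complexity penalty that grows with the richness of \mathcal{F}_k, typically of order \sqrt{\mathrm{VCdim}(\mathcal{F}_k)/n} for VC classes.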