Sciweavers

692 search results - page 7 / 139
MCS
2000
Springer
15 years 1 month ago
Ensemble Methods in Machine Learning
Ensemble methods are learning algorithms that construct a set of classifiers and then classify new data points by taking a (weighted) vote of their predictions. The original ensembl...
Thomas G. Dietterich
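A minimal sketch of the (weighted) voting rule the abstract describes, assuming a list of already-fitted base classifiers that expose a predict method and a weight per classifier; the names and interfaces here are illustrative, not taken from the paper:

```python
from collections import defaultdict

def weighted_vote(classifiers, weights, x):
    """Combine base classifiers by summing their weights per predicted label.

    classifiers: objects with a .predict(x) -> label method (assumed interface)
    weights: non-negative floats, one per classifier
    x: a single input example
    """
    scores = defaultdict(float)
    for clf, w in zip(classifiers, weights):
        scores[clf.predict(x)] += w
    # The ensemble outputs the label with the largest total vote weight.
    return max(scores, key=scores.get)
```

With equal weights this reduces to plain majority voting; boosting-style ensembles would supply non-uniform weights learned during training.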
SYNTHESE
2008
14 years 9 months ago
Rebutting formally valid counterexamples to the Humean "is-ought" dictum
Various formally valid counterexamples have been adduced against the Humean dictum that one cannot derive an "ought" from an "is." There are formal rebuttals--s...
Daniel Guevara
AAAI
1998
14 years 11 months ago
Boosting in the Limit: Maximizing the Margin of Learned Ensembles
The "minimum margin" of an ensemble classifier on a given training set is, roughly speaking, the smallest vote it gives to any correct training label. Recent work has sh...
Adam J. Grove, Dale Schuurmans
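One common way to compute the minimum margin sketched in the abstract is shown below: per example, the weight voted for the correct label minus the largest weight voted for any incorrect label, scaled by the total weight. The exact normalization in the paper may differ, so treat this as an illustrative reading rather than the authors' definition:

```python
def minimum_margin(classifiers, weights, X, y):
    """Smallest normalized voting margin of an ensemble over a training set."""
    total = sum(weights)
    margins = []
    for x, correct in zip(X, y):
        # Tally the weighted votes each label receives for this example.
        votes = {}
        for clf, w in zip(classifiers, weights):
            label = clf.predict(x)
            votes[label] = votes.get(label, 0.0) + w
        correct_vote = votes.get(correct, 0.0)
        wrong_vote = max((v for lbl, v in votes.items() if lbl != correct), default=0.0)
        margins.append((correct_vote - wrong_vote) / total)
    # The minimum margin is the smallest margin over all training examples.
    return min(margins)
```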
GECCO
2006
Springer
15 years 1 month ago
The dispersion metric and the CMA evolution strategy
An algorithm-independent metric is introduced that measures the dispersion of a uniform random sample drawn from the top-ranked percentiles of the search space. A low dispersion f...
Monte Lunacek, Darrell Whitley
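A rough sketch of how a dispersion measure of this kind could be computed: draw a uniform sample over the search space, keep the top-ranked fraction by objective value, and average the pairwise distances among those points. The sample size, fraction, and Euclidean distance are assumptions for illustration, not the paper's settings:

```python
import numpy as np

def dispersion(objective, bounds, n_samples=1000, top_fraction=0.05, seed=0):
    """Average pairwise distance among the best-ranked points of a uniform sample.

    objective: maps an (n_dims,) array to a scalar, lower is better (assumed)
    bounds: sequence of (low, high) limits, one pair per dimension
    """
    bounds = np.asarray(bounds, dtype=float)
    rng = np.random.default_rng(seed)
    samples = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_samples, len(bounds)))
    values = np.apply_along_axis(objective, 1, samples)
    k = max(2, int(top_fraction * n_samples))
    best = samples[np.argsort(values)[:k]]       # top-ranked portion of the sample
    diffs = best[:, None, :] - best[None, :, :]  # all pairwise differences
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    # Mean over distinct pairs (the zero diagonal contributes nothing to the sum).
    return dists.sum() / (k * (k - 1))
```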
SIGIR
2012
ACM
13 years ago
Unsupervised linear score normalization revisited
We give a fresh look into score normalization for merging result-lists, isolating the problem from other components. We focus on three of the simplest, practical, and widely-used l...
Ilya Markov, Avi Arampatzis, Fabio Crestani
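Two of the simple linear normalizations typically considered in this setting are min-max and z-score scaling; which three schemes the paper actually compares is cut off above, so the sketch below is illustrative rather than a restatement of its method:

```python
import statistics

def min_max(scores):
    """Linearly rescale one result-list's scores to the [0, 1] range."""
    lo, hi = min(scores), max(scores)
    if hi == lo:
        return [0.0 for _ in scores]
    return [(s - lo) / (hi - lo) for s in scores]

def z_score(scores):
    """Shift and scale scores to zero mean and unit standard deviation."""
    mu = statistics.mean(scores)
    sigma = statistics.pstdev(scores) or 1.0  # guard against all-equal scores
    return [(s - mu) / sigma for s in scores]
```

After per-list normalization, scores from different result-lists lie on a comparable scale and can be merged directly.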