Sciweavers

255 search results - page 1 / 51
» Generalized Boosting Algorithms for Convex Optimization

CORR 2011, Springer
Generalized Boosting Algorithms for Convex Optimization
Boosting is a popular way to derive powerful learners from simpler hypothesis classes. Following previous work (Mason et al., 1999; Friedman, 2000) on general boosting frameworks,...
Alexander Grubb, J. Andrew Bagnell
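(Not part of the listing itself, but for orientation: the "general boosting frameworks" of Mason et al. (1999) and Friedman (2000) that this abstract refers to view boosting as greedy functional gradient descent on a loss. A minimal sketch of that idea for squared loss, using a decision stump as an illustrative weak learner; the function names are hypothetical, not from the paper:)

```python
import numpy as np

def fit_stump(x, r):
    """Least-squares decision stump: the simple weak learner."""
    best = None
    for t in x:
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(x <= t, left.mean(), right.mean())
        err = np.sum((r - pred) ** 2)
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    _, t, lo, hi = best
    return lambda z: np.where(z <= t, lo, hi)

def boost(x, y, n_rounds=50, lr=0.1):
    """Functional gradient descent for squared loss: each round fits the
    weak learner to the residuals (the negative functional gradient)."""
    F = np.zeros_like(y)
    learners = []
    for _ in range(n_rounds):
        h = fit_stump(x, y - F)      # residual = -grad of 0.5*(y - F)^2
        F = F + lr * h(x)
        learners.append(h)
    return lambda z: lr * sum(h(z) for h in learners)

x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x)
model = boost(x, y)
```

Each round provably lowers the training loss, since the fitted stump is a projection of the residual; swapping the loss and its gradient generalizes this to other convex objectives, which is the setting the paper studies.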

ECAI 2006, Springer
A Real Generalization of Discrete AdaBoost
Scaling discrete AdaBoost to handle real-valued weak hypotheses has often been done under the auspices of convex optimization, but little is generally known from the original boost...
Richard Nock, Frank Nielsen
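(For reference, a sketch of the discrete AdaBoost scheme the abstract contrasts with real-valued variants: each weak hypothesis outputs a hard label in {-1, +1}, and its vote weight comes from its weighted error. This is the textbook Freund-Schapire procedure, not the paper's generalization:)

```python
import numpy as np

def adaboost(X, y, n_rounds=20):
    """Discrete AdaBoost with exhaustive stump search; y has labels in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # example weights
    ensemble = []                        # (alpha, feature, threshold, sign)
    for _ in range(n_rounds):
        best = None
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] <= t, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, s)
        err, j, t, s = best
        err = max(err, 1e-12)            # guard against a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)
        pred = s * np.where(X[:, j] <= t, 1, -1)
        w *= np.exp(-alpha * y * pred)   # upweight the mistakes
        w /= w.sum()
        ensemble.append((alpha, j, t, s))

    def predict(Z):
        score = sum(a * s * np.where(Z[:, j] <= t, 1, -1)
                    for a, j, t, s in ensemble)
        return np.sign(score)
    return predict

# Toy 1-D problem: positives form an interval, so no single stump suffices.
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = np.where((X[:, 0] >= 0.3) & (X[:, 0] <= 0.7), 1, -1)
predict = adaboost(X, y)
```

The real-valued extensions the abstract mentions replace the hard `s * np.where(...)` vote with a confidence-rated score, which is where the convex-optimization view usually enters.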

ICML 2004, IEEE
Surrogate maximization/minimization algorithms for AdaBoost and the logistic regression model
Surrogate maximization (or minimization) (SM) algorithms are a family of algorithms that can be regarded as a generalization of expectation-maximization (EM) algorithms. There are...
Zhihua Zhang, James T. Kwok, Dit-Yan Yeung
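(A concrete instance of the surrogate maximization idea for logistic regression, sketched under the standard Boehning bound; this is a generic MM illustration, not necessarily the update derived in the paper. Since the logistic Hessian X'WX is dominated by the fixed matrix B = X'X/4, maximizing the resulting quadratic surrogate at each step gives a monotone, Newton-like update with a constant curvature matrix:)

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mm_logistic(X, y, n_iter=500):
    """Surrogate (minorize-maximize) ascent for the logistic log-likelihood:
    beta <- beta + 4 (X'X)^{-1} X'(y - p), using the fixed bound B = X'X/4
    in place of the varying Hessian X'WX."""
    beta = np.zeros(X.shape[1])
    B_inv = np.linalg.inv(X.T @ X / 4.0)   # fixed curvature, factored once
    for _ in range(n_iter):
        p = sigmoid(X @ beta)
        beta = beta + B_inv @ (X.T @ (y - p))
    return beta

# Synthetic, non-separable data (hypothetical example, fixed seed).
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([0.5, -1.0])
y = (rng.random(n) < sigmoid(X @ beta_true)).astype(float)
beta = mm_logistic(X, y)
```

As with EM, each iteration is guaranteed not to decrease the objective, at the cost of a looser curvature model than Newton's method.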

JMLR 2006
Some Theory for Generalized Boosting Algorithms
We give a review of various aspects of boosting, clarifying the issues through a few simple results, and relate our work and that of others to the minimax paradigm of statistics. ...
Peter J. Bickel, Yaacov Ritov, Alon Zakai

SIAMIS 2010
A General Framework for a Class of First Order Primal-Dual Algorithms for Convex Optimization in Imaging Science
We generalize the primal-dual hybrid gradient (PDHG) algorithm proposed by Zhu and Chan in [M. Zhu, and T. F. Chan, An Efficient Primal-Dual Hybrid Gradient Algorithm for Total Var...
Ernie Esser, Xiaoqun Zhang, Tony F. Chan
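(To make the PDHG scheme concrete: below is a minimal sketch of the alternating primal-dual iteration on a toy saddle-point problem, min_x max_y <Kx - b, y> - 0.5*||y||^2, whose solution is the least-squares minimizer of 0.5*||Kx - b||^2. This is a generic first-order primal-dual instance under the usual step-size condition tau*sigma*||K||^2 <= 1, not the TV-denoising setup of Zhu and Chan:)

```python
import numpy as np

def pdhg_least_squares(K, b, n_iter=5000):
    """Primal-dual hybrid gradient: proximal dual ascent in y, gradient
    descent in x, plus the extrapolation step x_bar = 2*x_new - x."""
    L = np.linalg.norm(K, 2)          # operator norm of K
    tau = sigma = 0.99 / L            # tau * sigma * L^2 < 1
    x = np.zeros(K.shape[1])
    x_bar = x.copy()
    y = np.zeros(K.shape[0])
    for _ in range(n_iter):
        y = (y + sigma * (K @ x_bar - b)) / (1 + sigma)  # dual prox step
        x_new = x - tau * (K.T @ y)                      # primal step
        x_bar = 2 * x_new - x                            # extrapolation
        x = x_new
    return x

K = np.array([[3.0, 1.0], [1.0, 2.0], [0.0, 1.0]])
b = np.array([1.0, 0.0, 2.0])
x = pdhg_least_squares(K, b)
```

At a fixed point, y* = Kx* - b and K'y* = 0, which are exactly the normal equations; in imaging applications K is a discrete gradient operator and the dual prox becomes a projection, but the alternating structure is the same.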