Sciweavers

56 search results - page 10 / 12
» Stochastic methods for l1 regularized loss minimization
SDM
2004
SIAM
Clustering with Bregman Divergences
A wide variety of distortion functions, such as squared Euclidean distance, Mahalanobis distance, Itakura-Saito distance and relative entropy, have been used for clustering. In th...
Arindam Banerjee, Srujana Merugu, Inderjit S. Dhil...
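A central result of this paper (stated in its full abstract) is that k-means-style alternating minimization works for any Bregman divergence, with the arithmetic mean remaining the optimal centroid update. A minimal sketch of that alternation, using generalized KL divergence (relative entropy) as the Bregman divergence; the function names and the deterministic initialization are illustrative, not from the paper:

```python
import numpy as np

def kl_divergence(p, q):
    # Generalized KL (relative entropy) between positive vectors:
    # a Bregman divergence generated by negative entropy.
    return np.sum(p * np.log(p / q) - p + q, axis=-1)

def bregman_kmeans(X, k, n_iter=50):
    """Hard Bregman clustering: assign each point to the centroid with
    the smallest divergence, then update each centroid as the arithmetic
    mean -- which is optimal for *any* Bregman divergence."""
    centroids = X[:k].copy()  # deterministic init, for illustration only
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # divergence of every point to every centroid, shape (n, k)
        d = np.stack([kl_divergence(X, c) for c in centroids], axis=1)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids
```

On two well-separated groups of positive vectors the alternation converges in a few iterations; the notable point is that only the assignment step depends on the chosen divergence, while the mean update is shared by all of them.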
JMLR
2010
Classification Using Geometric Level Sets
A variational level set method is developed for the supervised classification problem. Nonlinear classifier decision boundaries are obtained by minimizing an energy functional tha...
Kush R. Varshney, Alan S. Willsky
NIPS
2001
Boosting and Maximum Likelihood for Exponential Models
We derive an equivalence between AdaBoost and the dual of a convex optimization problem, showing that the only difference between minimizing the exponential loss used by AdaBoost ...
Guy Lebanon, John D. Lafferty
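The equivalence described above relates AdaBoost's exponential loss to maximum likelihood for exponential models (logistic regression being the familiar instance). A toy illustration of how close the two criteria are in practice: gradient descent on the exponential loss exp(-m) and on the logistic log-loss log(1 + exp(-m)), over the same linear model, yields classifiers that agree in sign on separable data. Everything below (data, step size, iteration count) is illustrative, not from the paper:

```python
import numpy as np

# Toy linearly separable data: label = sign of the first coordinate.
X = np.array([[2.0, 1.0], [1.5, -1.0], [-2.0, 0.5], [-1.0, -1.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])

def fit_linear(dloss, lr=0.1, steps=500):
    """Gradient descent on sum_i L(y_i * w @ x_i) for a linear model;
    dloss is L', the derivative of the margin loss."""
    w = np.zeros(2)
    for _ in range(steps):
        m = y * (X @ w)  # margins
        w -= lr * (dloss(m)[:, None] * y[:, None] * X).sum(axis=0)
    return w

w_exp = fit_linear(lambda m: -np.exp(-m))               # exponential loss
w_log = fit_linear(lambda m: -1.0 / (1.0 + np.exp(m)))  # logistic loss
```

On this data both losses produce weight vectors that classify every point identically; the paper makes the underlying correspondence precise.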
FOCM
2011
Accuracy and Stability of Computing High-order Derivatives of Analytic Functions by Cauchy Integrals
High-order derivatives of analytic functions are expressible as Cauchy integrals over circular contours, which can very effectively be approximated, e.g., by trapezoidal s...
Folkmar Bornemann
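The approach this abstract describes can be sketched in a few lines: the n-th derivative at 0 is a Cauchy integral over a circle, and the trapezoidal rule on equispaced contour points is exponentially accurate for such periodic analytic integrands. The radius `r` and point count `m` below are arbitrary illustrative choices; how accuracy and stability depend on such choices is precisely the paper's subject.

```python
import math
import numpy as np

def cauchy_derivative(f, n, r=1.0, m=64):
    """Approximate the n-th derivative of an analytic f at 0 via the
    Cauchy integral  f^(n)(0) = n!/(2*pi*i) * integral of f(z)/z^(n+1) dz
    over |z| = r, discretized by the trapezoidal rule."""
    k = np.arange(m)
    z = r * np.exp(2j * np.pi * k / m)  # equispaced points on the circle
    # Trapezoidal approximation of the n-th Taylor coefficient of f
    a_n = np.mean(f(z) * np.exp(-2j * np.pi * n * k / m)) / r**n
    return math.factorial(n) * a_n.real
```

For entire functions like `np.exp` this already recovers high-order derivatives to near machine precision with modest `m`.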
ICDM
2007
IEEE
Adapting SVM Classifiers to Data with Shifted Distributions
Many data mining applications can benefit from adapting existing classifiers to new data with shifted distributions. In this paper, we present Adaptive Support Vector Machine (Ada...
Jun Yang 0003, Rong Yan, Alexander G. Hauptmann
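Adaptive SVM, as described above, adapts an existing classifier to shifted data rather than retraining from scratch. The sketch below captures the regularize-toward-the-old-weights idea with plain hinge-loss subgradient descent instead of a full SVM solver; the function name, data, and hyperparameters are illustrative, not from the paper.

```python
import numpy as np

def adapt_classifier(w_src, X, y, lam=0.1, lr=0.05, steps=500):
    """Subgradient descent on
        lam * ||w - w_src||^2 + sum_i max(0, 1 - y_i * w @ x_i),
    i.e. hinge loss on the new (shifted) data, regularized toward the
    existing classifier w_src instead of toward zero."""
    w = w_src.astype(float).copy()
    for _ in range(steps):
        viol = y * (X @ w) < 1  # margin violations on the new data
        g = 2 * lam * (w - w_src) - (y[viol, None] * X[viol]).sum(axis=0)
        w -= lr * g
    return w
```

With a small regularizer weight the adapted classifier fixes the source model's mistakes on the new data while staying close to it where the data is silent.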