We present novel semi-supervised boosting algorithms that incrementally build linear combinations of weak classifiers through generic functional gradient descent using both labele...
Boosting is a general method for improving the accuracy of a learning algorithm. AdaBoost, short for Adaptive Boosting, consists of repeated use of a weak or base le...
T. Ravindra Babu, M. Narasimha Murty, Vijay K. Agr...
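The entries above describe the standard boosting recipe: repeatedly fit a weak learner on reweighted data and combine the rounds into a weighted vote. As a concrete point of reference (a minimal sketch of classic binary AdaBoost, not the semi-supervised variant in the first entry), the loop might look like the following; the function names and the decision-stump weak learner from scikit-learn are illustrative choices, not taken from these papers.

```python
# Minimal AdaBoost sketch: repeated use of a weak (base) learner on
# reweighted training data, combined into a weighted majority vote.
# Assumes binary labels in {-1, +1}; stumps come from scikit-learn.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    n = len(y)
    w = np.full(n, 1.0 / n)          # start from uniform example weights
    ensemble = []                    # list of (alpha, weak_classifier)
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        err = np.clip(err, 1e-10, 1 - 1e-10)   # guard against 0 or 1 error
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this weak classifier
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(a * clf.predict(X) for a, clf in ensemble)
    return np.sign(score)
```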
L1 regularized logistic regression is now a workhorse of machine learning: it is widely used for many classification problems, particularly ones with many features. L1 regularized...
Su-In Lee, Honglak Lee, Pieter Abbeel, Andrew Y. N...
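To illustrate the model class this entry discusses (not the specific training algorithm the paper proposes), L1-regularized logistic regression can be fit directly with scikit-learn; the L1 penalty drives many coefficients exactly to zero, which is why it suits problems with many features. The synthetic data and parameter values below are placeholders.

```python
# L1-regularized logistic regression: the L1 penalty induces sparse
# coefficient vectors, giving implicit feature selection.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=100,
                           n_informative=10, random_state=0)

# C is the inverse regularization strength; smaller C gives a sparser model.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
clf.fit(X, y)

n_nonzero = np.count_nonzero(clf.coef_)
print(f"non-zero coefficients: {n_nonzero} / {clf.coef_.size}")
```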
Regression testing has been a popular quality assurance technique. Most regression testing techniques are based on code or software design. This paper proposes a scenario-based fu...
Raymond A. Paul, Lian Yu, Wei-Tek Tsai, Xiaoying B...
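The idea of driving regression testing from functional scenarios rather than from code can be sketched roughly as follows: each scenario records the components it exercises, and after a change only the scenarios that touch modified components are re-run. The scenario names, component names, and selection rule below are illustrative assumptions, not the method actually proposed in the paper above.

```python
# Rough sketch of scenario-based regression test selection: each
# functional scenario lists the components it exercises, and a change
# set picks out the scenarios that must be re-run.
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    steps: list[str]        # ordered user-visible steps
    components: set[str]    # components the scenario exercises

SCENARIOS = [
    Scenario("checkout_with_coupon",
             ["add item", "apply coupon", "pay"],
             {"cart", "pricing", "payment"}),
    Scenario("browse_catalog",
             ["search", "open product page"],
             {"search", "catalog"}),
]

def select_regression_tests(scenarios, changed_components):
    """Return the scenarios affected by the changed components."""
    return [s for s in scenarios if s.components & changed_components]

if __name__ == "__main__":
    for s in select_regression_tests(SCENARIOS, {"pricing"}):
        print("re-run:", s.name, "->", " / ".join(s.steps))
```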