Two-dimensional solution path for support vector regression

Recently, a very appealing approach was proposed to compute the entire solution path for support vector classification (SVC) at very low extra computational cost. This approach was later extended to a support vector regression (SVR) model called ε-SVR. However, the method requires that the error parameter ε be set a priori, which is only possible if the desired accuracy of the approximation can be specified in advance. In this paper, we show that the solution path for ε-SVR is also piecewise linear with respect to ε. We further propose an efficient algorithm for exploring the two-dimensional solution space defined by the regularization and error parameters. Unlike the solution-path algorithm for SVC, our proposed algorithm for ε-SVR initializes the number of support vectors to zero and then increases it gradually as the algorithm proceeds. As a result, a good regression function possessing the sparseness property can be obtained after only a few iterations.
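The abstract's point that the support-vector count starts at zero and grows can be illustrated with a minimal sketch (this is not the paper's path-following algorithm): for a fixed regression function, a point contributes to the ε-insensitive loss, and hence can be a support vector, only if its residual lies on or outside the ε-tube. With a large enough ε every point sits inside the tube, so there are zero support vectors; shrinking ε admits points one by one. The residual values below are made-up illustrative numbers.

```python
# Hypothetical absolute residuals |y_i - f(x_i)| for six training points,
# chosen only for illustration.
residuals = [0.02, 0.05, 0.08, 0.12, 0.20, 0.35]

# As epsilon shrinks, points leave the epsilon-tube and become candidate
# support vectors; the count grows monotonically from zero.
for eps in (0.5, 0.25, 0.1, 0.04, 0.0):
    n_sv = sum(r >= eps for r in residuals)
    print(f"epsilon={eps:4.2f}  candidate support vectors: {n_sv}")
# epsilon=0.50 -> 0, epsilon=0.25 -> 1, epsilon=0.10 -> 3,
# epsilon=0.04 -> 5, epsilon=0.00 -> 6
```

This is why sweeping ε from large to small, as the proposed algorithm does, yields sparse intermediate solutions early in the sweep.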
Gang Wang, Dit-Yan Yeung, Frederick H. Lochovsky
Type Conference
Year 2006
Where ICML