
IJCNN 2006, IEEE

We present a new optimization procedure particularly suited to second-order kernel methods such as Kernel-PCA. Common to these methods is a cost function that is optimized under a positive definite quadratic constraint, which bounds the solution. In Kernel-PCA, for example, the constraint yields principal components of unit length that are orthogonal in feature space. The cost function is often quadratic, which allows the problem to be solved as a generalized eigenvalue problem. However, in contrast to Support Vector Machines, which employ box constraints, quadratic constraints usually do not lead to sparse solutions. Here we give up the structure of the generalized eigenvalue problem in favor of a non-quadratic regularization term added to the cost function, which enforces sparse solutions. To optimize this more 'complicated' cost function, we introduce a modified conjugate gradient descent method. Starting from an admissible point, all iterati...
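The setup the abstract describes can be sketched numerically. The paper's own method is a modified conjugate gradient descent; the sketch below substitutes a plain projected gradient ascent with soft-thresholding, which is simpler but illustrates the same ingredients: a quadratic cost, an L1-style sparsity term, and renormalization onto the positive definite quadratic constraint a'Ka = 1. All function names, the RBF kernel choice, and the step sizes are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix of an RBF kernel (illustrative kernel choice).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def sparse_kpca_component(K, lam=0.1, steps=500, lr=0.01, rng=None):
    """Maximize a' K K a (a quadratic cost, as in Kernel-PCA) with an
    L1 sparsity penalty, projecting back onto the quadratic constraint
    a' K a = 1 after every step. Projected gradient ascent stands in
    here for the paper's modified conjugate gradient method."""
    rng = np.random.default_rng(rng)
    n = K.shape[0]
    a = rng.standard_normal(n)
    a /= np.sqrt(a @ K @ a)              # start from an admissible point
    for _ in range(steps):
        grad = 2 * K @ (K @ a)           # gradient of the quadratic cost
        a = a + lr * grad
        # soft-threshold: this non-quadratic term is what enforces sparsity
        a = np.sign(a) * np.maximum(np.abs(a) - lr * lam, 0.0)
        a /= np.sqrt(max(a @ K @ a, 1e-12))  # re-project onto a' K a = 1
    return a
```

Note that the projection step keeps every iterate admissible, mirroring the abstract's remark that the iteration starts from an admissible point; without the soft-threshold the loop would simply converge to the leading generalized eigenvector.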

Added: 11 Jun 2010
Updated: 11 Jun 2010
Type: Conference
Year: 2006
Where: IJCNN
Authors: Roland Vollgraf, Klaus Obermayer
