
Efficient Kernel Machines Using the Improved Fast Gauss Transform

The computation and memory required for kernel machines with N training samples are at least O(N²). Such complexity is significant even for moderate-size problems and is prohibitive for large datasets. We present an approximation technique based on the improved fast Gauss transform to reduce the computation to O(N). We also give an error bound for the approximation, and provide experimental results on the UCI datasets.
Changjiang Yang, Ramani Duraiswami, Larry S. Davis
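The abstract describes replacing the O(N²) Gaussian kernel summation with an improved fast Gauss transform (IFGT) approximation. The sketch below is not the paper's algorithm; it is a minimal one-dimensional, single-expansion-centre illustration of the underlying factorisation idea (the actual IFGT clusters the sources and uses truncated multivariate expansions with an error bound). All function names and parameters here (gauss_transform_direct, gauss_transform_taylor_1d, h, p) are hypothetical.

```python
import math
import numpy as np

def gauss_transform_direct(sources, targets, weights, h):
    """Direct O(N*M) evaluation of G(y_j) = sum_i q_i exp(-(y_j - x_i)^2 / h^2)."""
    diff = targets[:, None] - sources[None, :]
    return np.exp(-(diff / h) ** 2) @ weights

def gauss_transform_taylor_1d(sources, targets, weights, h, p=15):
    """Single-centre, 1-D Taylor-factorised approximation of the Gauss
    transform in O((N + M) * p) time.  The exponential is split as
      exp(-(y-x)^2/h^2) = exp(-(y-c)^2/h^2) * exp(-(x-c)^2/h^2) * exp(2(y-c)(x-c)/h^2),
    and the last factor is truncated after p Taylor terms, so source-side
    moments can be accumulated once and reused for every target."""
    c = sources.mean()                      # single expansion centre
    ks = np.arange(p)
    facts = np.array([math.factorial(int(k)) for k in ks], dtype=float)

    # Source-side moments, O(N*p): C_k = (2^k / k!) * sum_i q_i e^{-u_i^2} u_i^k
    u = (sources - c) / h
    moments = (2.0 ** ks / facts) * ((weights * np.exp(-u ** 2)) @ np.power.outer(u, ks))

    # Target-side evaluation, O(M*p): G(y) ~= e^{-v^2} * sum_k C_k v^k
    v = (targets - c) / h
    return np.exp(-v ** 2) * (np.power.outer(v, ks) @ moments)

# Quick check on synthetic data (bandwidth chosen so the truncated series converges fast).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 2000)   # sources
y = rng.uniform(0.0, 1.0, 500)    # targets
q = rng.standard_normal(2000)     # weights
exact = gauss_transform_direct(x, y, q, h=0.5)
approx = gauss_transform_taylor_1d(x, y, q, h=0.5, p=15)
print(np.max(np.abs(exact - approx)))   # small truncation error
```

For a fixed truncation order p, the moment accumulation costs O(Np) and each target evaluation costs O(p), which is the sense in which this kind of factorisation replaces the quadratic cost of the direct sum with a linear one.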
Type: Conference
Year: 2004
Where: NIPS
Authors: Changjiang Yang, Ramani Duraiswami, Larry S. Davis