
Efficient approximate Regularized Least Squares by Toeplitz matrix

Machine learning based on the Regularized Least Squares (RLS) model requires solving a system of linear equations. Direct-solution methods exhibit predictable complexity and storage requirements, but often prove impractical for large-scale problems; iterative methods attain approximate solutions at lower complexity, but depend heavily on learning parameters. The paper shows that exploiting the properties of Toeplitz matrices in RLS yields two benefits: first, both the computational cost and the memory space required to train an RLS-based machine are reduced dramatically; second, timing and storage requirements can be characterized analytically. The paper proves this result formally for the one-dimensional case, and gives an analytical criterion for an effective approximation in multidimensional domains. The validity of the approach is demonstrated on several real-world problems involving large data sets with high-dimensional data.
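To give a feel for why Toeplitz structure helps here, the sketch below is a minimal illustration, not the authors' approximation scheme: it assumes the one-dimensional case with equispaced inputs and a stationary (Gaussian) kernel, so the kernel matrix K is symmetric Toeplitz and the RLS system (K + λI)α = y can be solved from its first column alone via SciPy's Levinson-based `solve_toeplitz`. The parameter values (`lam`, `sigma`) and the toy data are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

# Hypothetical 1-D setup: equispaced inputs + stationary kernel => Toeplitz K,
# so K + lam*I is also Toeplitz and can be stored as a single column (O(n) memory)
# and solved by Levinson recursion (O(n^2) time) instead of a dense O(n^3) solve.
rng = np.random.default_rng(0)
n = 512
x = np.linspace(0.0, 1.0, n)                                # equispaced training inputs
y = np.sin(8 * np.pi * x) + 0.1 * rng.standard_normal(n)    # noisy targets
lam = 1e-2                                                  # regularization (assumed value)
sigma = 0.05                                                # kernel width (assumed value)

# First column of the Toeplitz kernel matrix: k(|x_i - x_0|)
col = np.exp(-(x - x[0]) ** 2 / (2 * sigma ** 2))
col[0] += lam                                               # add lambda on the diagonal

# Solve (K + lam*I) alpha = y without ever forming the full n x n matrix
alpha = solve_toeplitz(col, y)                              # symmetric Toeplitz: column suffices

# Sanity check only: build K densely once to verify the fit
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma ** 2))
print("training RMSE:", np.sqrt(np.mean((K @ alpha - y) ** 2)))
```

In this toy setting the training cost and memory are fixed by n alone, which mirrors the paper's point that timing and storage requirements become analytically predictable once the Toeplitz structure is exploited.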
Type: Journal
Year: 2011
Where: PRL
Authors: Sergio Decherchi, Paolo Gastaldo, Rodolfo Zunino