Gradient-Based Methods for Sparse Recovery

The convergence rate is analyzed for the sparse reconstruction by separable approximation (SpaRSA) algorithm for minimizing a sum f(x) + ψ(x), where f is smooth and ψ is convex, but possibly nonsmooth. It is shown that if f is convex, then the error in the objective function at iteration k is bounded by a/k for some a independent of k. Moreover, if the objective function is strongly convex, then the convergence is R-linear. An improved version of the algorithm based on a cyclic version of the BB iteration and an adaptive line search is given. The performance of the algorithm is investigated using applications in the areas of signal processing and image reconstruction.

Key words: sparse reconstruction by separable approximation, iterative shrinkage thresholding algorithm, sparse recovery, sublinear convergence, linear convergence, image reconstruction, denoising, compressed sensing, nonsmooth optimization, nonmonotone convergence, BB method

AMS subject classifications: 90C06, 90C25, ...
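The shrinkage iteration underlying SpaRSA can be sketched for the common ℓ1-regularized least-squares instance, f(x) = ½‖Ax − b‖², ψ(x) = λ‖x‖₁. The sketch below is a minimal illustration, not the paper's algorithm: it uses a plain Barzilai-Borwein (BB) step with a simple monotone backtracking safeguard, whereas the paper's improved version uses a cyclic BB rule and a nonmonotone (adaptive) line search. All function and variable names here are illustrative.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (componentwise shrinkage)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def sparsa_l1(A, b, lam, max_iter=500, tol=1e-10):
    """SpaRSA-style iteration for min 0.5||Ax - b||^2 + lam*||x||_1.

    Illustrative sketch only: plain BB step size plus a monotone
    backtracking safeguard (the paper uses cyclic BB and a
    nonmonotone line search).
    """
    def phi(z):                      # full objective f + psi
        r = A @ z - b
        return 0.5 * (r @ r) + lam * np.abs(z).sum()

    x = np.zeros(A.shape[1])
    g = A.T @ (A @ x - b)            # gradient of the smooth part f
    alpha = 1.0                      # inverse step size
    for _ in range(max_iter):
        # Backtracking: grow alpha (shrink the step) until phi decreases.
        while True:
            x_new = soft_threshold(x - g / alpha, lam / alpha)
            if phi(x_new) <= phi(x) or alpha > 1e12:
                break
            alpha *= 2.0
        s = x_new - x
        if np.sqrt(s @ s) < tol:     # negligible progress: stop
            x = x_new
            break
        g_new = A.T @ (A @ x_new - b)
        y = g_new - g
        x, g = x_new, g_new
        # BB step: alpha = <s, y> / <s, s>, safeguarded away from 0/inf.
        alpha = min(max((s @ y) / (s @ s), 1e-10), 1e10)
    return x
```

For the quadratic f above, the BB value ⟨s, y⟩/⟨s, s⟩ is a Rayleigh quotient of AᵀA, so the safeguarded step stays within the spectrum of the Hessian; the monotone acceptance test is a simplification of the nonmonotone criterion analyzed in the paper.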
William W. Hager, Dzung T. Phan, Hongchao Zhang
Type: Journal
Year: 2011
Where: SIAMIS (SIAM Journal on Imaging Sciences)