
JSCIC
2016

On the Global and Linear Convergence of the Generalized Alternating Direction Method of Multipliers

The formulation min_{x,y} f(x) + g(y), subject to Ax + By = b, where f and g are extended-value convex functions, arises in many application areas such as signal processing, imaging and image processing, statistics, and machine learning, either naturally or after variable splitting. In many common problems, one of the two objective functions is strictly convex and has a Lipschitz continuous gradient. On this kind of problem, a very effective approach is the alternating direction method of multipliers (ADM or ADMM), which solves a sequence of f/g-decoupled subproblems. However, its effectiveness has not been matched by a provably fast rate of convergence; only sublinear rates such as O(1/k) and O(1/k^2) were recently established in the literature, though the O(1/k) rates do not require strict convexity. This paper shows that global linear convergence can be guaranteed under the assumptions of strict convexity and Lipschitz gradient on one of the two functions, along with certain rank assumptions on A and B.
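To make the f/g-decoupled subproblems concrete, the following is a minimal NumPy sketch of a generalized (over-relaxed) ADMM iteration on an assumed instance: f(x) = 0.5*||Px - q||^2 (strictly convex with Lipschitz gradient, matching the paper's assumption on one function), g(y) = lam*||y||_1, and constraint Ax - y = 0 (i.e. B = -I, b = 0). The names P, q, A, lam, rho, and alpha are illustrative choices, not quantities from the paper, and the sketch is not the paper's analysis; with alpha = 1 it reduces to standard ADMM.

```python
# Minimal sketch of generalized (over-relaxed) ADMM for
#   min_x  0.5*||P x - q||^2 + lam*||y||_1   s.t.  A x - y = 0.
# The instance and parameter names are illustrative assumptions.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def generalized_admm(P, q, A, lam, rho=1.0, alpha=1.5, iters=500):
    m, n = A.shape
    x = np.zeros(n)
    y = np.zeros(m)
    u = np.zeros(m)                      # scaled dual variable
    M = P.T @ P + rho * (A.T @ A)        # x-subproblem matrix (fixed)
    for _ in range(iters):
        # x-update: argmin_x f(x) + (rho/2)||A x - y + u||^2
        x = np.linalg.solve(M, P.T @ q + rho * A.T @ (y - u))
        # over-relaxation step; alpha = 1 recovers standard ADMM
        Ax_hat = alpha * (A @ x) + (1.0 - alpha) * y
        # y-update: proximal step on g, here (lam/rho)*||.||_1
        y = soft_threshold(Ax_hat + u, lam / rho)
        # dual (multiplier) update
        u = u + Ax_hat - y
    return x, y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    P = rng.standard_normal((30, 10))
    q = rng.standard_normal(30)
    A = np.eye(10)
    x, y = generalized_admm(P, q, A, lam=0.1)
    print("constraint residual:", np.linalg.norm(A @ x - y))
```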
Wei Deng, Wotao Yin
Added: 07 Apr 2016
Updated: 07 Apr 2016
Type: Journal
Year: 2016
Where: JSCIC
Authors: Wei Deng, Wotao Yin