Randomized block proximal gradient methods for a class of structured nonlinear programming
Topic: Randomized block proximal gradient methods for a class of structured nonlinear programming
Speaker: Prof. Zhaosong Lu (Simon Fraser University)
Time: 2014-07-03 15:00-16:00
Venue: Room 29 at Quan Zhai, BICMR (Host: Zaiwen Wen)
Randomized block proximal gradient (RBPG) methods have become a prevalent tool for solving large-scale optimization problems arising in machine learning, compressed sensing, and image and signal processing. In the first part of this talk, we study a randomized monotone block gradient method for minimizing the sum of a smooth convex function and a block-separable convex function. We present new results on the rate of convergence and the high-probability iteration complexity of this method. We also propose an accelerated RBPG method, establish its rate of convergence, and report computational results. In the second part, we propose a randomized nonmonotone block proximal gradient (RNBPG) method for minimizing the sum of a smooth (possibly nonconvex) function and a block-separable (possibly nonconvex, nonsmooth) function. Under some assumptions, we establish its global convergence and rate of convergence. We also present computational results demonstrating that our method substantially outperforms the RBPG method proposed by Richtarik and Takac (2012).
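To make the setting concrete, the sketch below illustrates the generic RBPG scheme discussed in the talk on a standard composite instance: the Lasso problem min 0.5*||Ax - b||^2 + lam*||x||_1, where the l1 term is block-separable and its proximal operator is soft-thresholding. This is a minimal illustration of the general framework, not the speaker's specific method; the block partition, step sizes 1/L_i, and all function and variable names here are assumptions for the example.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def rbpg_lasso(A, b, lam, n_blocks=4, n_iters=3000, seed=0):
    """Randomized block proximal gradient for 0.5*||Ax-b||^2 + lam*||x||_1.

    Each iteration picks one block of coordinates uniformly at random,
    takes a gradient step on the smooth part restricted to that block
    with step size 1/L_i, then applies the block's prox (soft-threshold).
    """
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), n_blocks)
    # Per-block Lipschitz constants of the partial gradients:
    # L_i = ||A_i||_2^2 (squared spectral norm of the block's columns).
    L = [np.linalg.norm(A[:, blk], 2) ** 2 for blk in blocks]
    r = A @ x - b  # residual A x - b, maintained incrementally
    for _ in range(n_iters):
        i = rng.integers(n_blocks)
        blk = blocks[i]
        g = A[:, blk].T @ r  # block gradient of the smooth part
        x_new = soft_threshold(x[blk] - g / L[i], lam / L[i])
        r += A[:, blk] @ (x_new - x[blk])  # update residual cheaply
        x[blk] = x_new
    return x
```

Updating only one block per iteration keeps the per-iteration cost proportional to the block size rather than the full dimension, which is the main appeal of these methods at large scale.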