Convergence Rates of Inexact Proximal-Gradient Methods for Convex Optimization

Course URL: http://videolectures.net/nips2011_schmidt_convex/
Lecturer: Mark Schmidt
Institution: University of British Columbia
Date: 2012-01-25
Language: English
Abstract: We consider the problem of optimizing the sum of a smooth convex function and a non-smooth convex function using proximal-gradient methods, where an error is present in the calculation of the gradient of the smooth term or in the proximity operator with respect to the second term. We show that the basic proximal-gradient method, the basic proximal-gradient method with a strong convexity assumption, and the accelerated proximal-gradient method achieve the same convergence rates as in the error-free case, provided the errors decrease at an appropriate rate. Our experimental results on a structured sparsity problem indicate that sequences of errors with these appealing theoretical properties can lead to practical performance improvements.
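The lecture page gives no code; as a minimal sketch of the basic method the abstract describes, the following assumes a lasso-style objective (smooth term 0.5*||Ax - b||^2, non-smooth term lam*||x||_1, whose proximity operator is soft-thresholding) and injects a gradient error whose magnitude decays like 1/k^2, a summable sequence consistent with the abstract's "errors decrease at an appropriate rate" condition. The objective, error model, and all parameters are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximity operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def inexact_proximal_gradient(A, b, lam, iters=200, seed=0):
    # Basic proximal-gradient method for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    # with a decaying perturbation added to the gradient of the smooth term.
    rng = np.random.default_rng(seed)
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of A^T(Ax - b)
    x = np.zeros(A.shape[1])
    for k in range(1, iters + 1):
        grad = A.T @ (A @ x - b)           # exact gradient of the smooth term
        e = rng.standard_normal(x.size)    # random direction for the error
        e *= 1.0 / (k ** 2 * np.linalg.norm(e))  # summable magnitudes: O(1/k^2)
        x = soft_threshold(x - (grad + e) / L, lam / L)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = 1.0                       # sparse ground truth
    b = A @ x_true
    x_hat = inexact_proximal_gradient(A, b, lam=0.5)
    print("support recovered:", np.flatnonzero(np.abs(x_hat) > 1e-3))
```

With the error e set to zero, the loop reduces to the standard exact proximal-gradient (ISTA) iteration, which is the error-free baseline the talk's rates are compared against.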
Keywords: proximal gradient; smooth convex function; convexity assumption
Source: VideoLectures.NET
Last edited: 2020-01-17 by chenxin
Views: 99