


Fast first-order methods for convex optimization with line search
Course URL: http://videolectures.net/nipsworkshops2011_scheinberg_convexoptim...
Lecturer: Katya Scheinberg
Institution: Lehigh University
Date: 2012-01-25
Language: English
Course abstract: We propose accelerated first-order methods with a non-monotone choice of the prox parameter, which essentially controls the step size. This is in contrast with most accelerated schemes, where the prox parameter is assumed to be either constant or non-increasing. In particular, we show that a backtracking strategy can be used within the FISTA [2] and FALM [5] algorithms, starting from an arbitrary parameter value, while preserving their worst-case O(1/k²) iteration complexity. We also derive complexity estimates that depend on the “average” step size rather than on the global Lipschitz constant of the function gradient, which provides better theoretical justification for these methods; the main contribution of this paper is thus theoretical.
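To make the idea concrete, here is a minimal sketch of a FISTA-style iteration with a backtracking line search on the prox parameter L (the inverse step size), applied to a lasso objective 0.5·||Ax−b||² + λ·||x||₁. The shrink factor `eta_down < 1`, which lets L decrease (and the step grow) between iterations, is an illustrative stand-in for the non-monotone choice discussed in the talk; all names and parameter values are assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_backtracking(A, b, lam, L0=1.0, eta_up=2.0, eta_down=0.9,
                       max_iter=300):
    """FISTA for 0.5*||Ax-b||^2 + lam*||x||_1 with backtracking on the
    prox parameter L.  eta_down < 1 allows L to shrink between
    iterations (non-monotone step sizes); eta_up > 1 is the usual
    backtracking increase factor.  Illustrative sketch only."""
    n = A.shape[1]
    x = np.zeros(n)
    y = x.copy()
    t = 1.0
    L = L0
    f = lambda z: 0.5 * np.dot(A @ z - b, A @ z - b)   # smooth part
    grad = lambda z: A.T @ (A @ z - b)
    for _ in range(max_iter):
        L *= eta_down                  # optimistically enlarge the step
        g = grad(y)
        fy = f(y)
        while True:
            x_new = soft_threshold(y - g / L, lam / L)
            d = x_new - y
            # Accept once the quadratic upper model with parameter L holds.
            if f(x_new) <= fy + g @ d + 0.5 * L * (d @ d) + 1e-12:
                break
            L *= eta_up                # backtrack: shrink the step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum step
        x, t = x_new, t_new
    return x
```

For A = I the minimizer is exactly the soft-thresholded data, which gives a quick sanity check of the iteration.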
Keywords: accelerated first-order methods; acceleration schemes; function gradient
Source: VideoLectures.NET
Last edited: 2019-09-07 (lxf)
Views: 79