


A Quasi-Newton Approach to Nonsmooth Convex Optimization
Course URL: http://videolectures.net/icml08_yu_aqna/
Lecturer: Jin Yu
Institution: NICTA (Australia's ICT Research Centre of Excellence)
Date: 2008-08-29
Language: English
Course description: We extend the well-known BFGS quasi-Newton method and its limited-memory variant (LBFGS) to the optimization of nonsmooth convex objectives. This is done in a rigorous fashion by generalizing three components of BFGS to subdifferentials: the local quadratic model, the identification of a descent direction, and the Wolfe line search conditions. We apply the resulting sub(L)BFGS algorithm to L2-regularized risk minimization with binary hinge loss, and its direction-finding component to L1-regularized risk minimization with logistic loss. In both settings our generic algorithms perform comparably to or better than their counterparts in specialized state-of-the-art solvers.
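
The description above only sketches the ingredients the talk builds on. As a rough illustration of those pieces (a subgradient of the L2-regularized hinge risk, a quasi-Newton descent direction, and a curvature-safeguarded BFGS update), the following is a minimal NumPy sketch. It is not the sub(L)BFGS algorithm of the lecture: the actual method adds a direction-finding procedure over the subdifferential and subgradient Wolfe line search conditions, whereas this sketch picks one arbitrary subgradient and uses a fixed step size. All function names, parameters, and the toy data are illustrative assumptions.

import numpy as np

def hinge_risk_subgradient(w, X, y, lam):
    """Value and one subgradient of the L2-regularized binary hinge risk
    J(w) = lam/2 ||w||^2 + mean_i max(0, 1 - y_i <x_i, w>).
    At kinks (margin exactly 1) we simply take the zero-loss branch,
    which is one valid element of the subdifferential."""
    margins = y * (X @ w)
    active = margins < 1.0                      # examples with nonzero hinge loss
    value = 0.5 * lam * (w @ w) + np.maximum(0.0, 1.0 - margins).mean()
    g = lam * w - (X[active].T @ y[active]) / len(y)
    return value, g

def quasi_newton_step(w, g, H, X, y, lam, step=0.5):
    """One illustrative quasi-Newton step: descent direction p = -H g,
    a fixed step size (the paper uses a subgradient Wolfe line search
    instead), and the standard BFGS update of the inverse-Hessian
    estimate H, skipped when the curvature condition fails."""
    p = -H @ g
    w_new = w + step * p
    _, g_new = hinge_risk_subgradient(w_new, X, y, lam)
    s, t = w_new - w, g_new - g                 # displacement and (sub)gradient change
    if s @ t > 1e-10:                           # keeps H positive definite
        rho = 1.0 / (s @ t)
        I = np.eye(len(w))
        H = (I - rho * np.outer(s, t)) @ H @ (I - rho * np.outer(t, s)) + rho * np.outer(s, s)
    return w_new, g_new, H

# Toy usage on synthetic data, purely for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X @ rng.normal(size=5) + 0.1 * rng.normal(size=200))
w, H, lam = np.zeros(5), np.eye(5), 0.1
_, g = hinge_risk_subgradient(w, X, y, lam)
for _ in range(50):
    w, g, H = quasi_newton_step(w, g, H, X, y, lam)
print("final objective:", hinge_risk_subgradient(w, X, y, lam)[0])
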
Keywords: quasi-Newton method; binary hinge loss; logistic loss
Course source: VideoLectures.NET
Data collected: 2022-11-02:chenjy
Last reviewed: 2022-11-02:chenjy
Views: 46