


Super-Linear Convergence of Dual Augmented Lagrangian Algorithm for Sparse Learning
Course URL: http://videolectures.net/nipsworkshops09_tomioka_slc/
Lecturer: Ryota Tomioka
Institution: Toyota Technological Institute at Chicago
Date: 2010-01-19
Language: English
Course description: We analyze the convergence behaviour of a recently proposed algorithm for sparse learning called Dual Augmented Lagrangian (DAL). We theoretically show that, under some conditions, DAL converges super-linearly in a non-asymptotic and global sense. We experimentally confirm our analysis on a large-scale ℓ1-regularized logistic regression problem and compare the efficiency of the DAL algorithm to existing algorithms.
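
For context, the problem class named in the abstract, ℓ1-regularized logistic regression, is min_w (1/n) Σ_i log(1 + exp(-y_i x_i^T w)) + λ‖w‖_1. The sketch below is not the DAL algorithm analyzed in the talk; it is a minimal proximal-gradient (ISTA) baseline for the same objective, using the ℓ1 soft-thresholding operator that also serves as the inner proximity step in DAL-type methods. The function names and parameter choices here are illustrative assumptions.

import numpy as np

def soft_threshold(v, lam):
    # Proximity operator of lam * ||w||_1 (soft-thresholding); this same
    # sparsity-inducing operation appears inside DAL's inner minimization.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def logistic_loss_grad(w, X, y):
    # Average logistic loss over labels y in {-1, +1} and its gradient.
    z = y * (X @ w)
    loss = np.mean(np.logaddexp(0.0, -z))
    grad = -(X.T @ (y / (1.0 + np.exp(z)))) / X.shape[0]
    return loss, grad

def ista_l1_logreg(X, y, lam, iters=500):
    # Proximal-gradient (ISTA) baseline for
    #   min_w  (1/n) * sum_i log(1 + exp(-y_i * x_i^T w)) + lam * ||w||_1
    n, d = X.shape
    step = 4.0 * n / (np.linalg.norm(X, 2) ** 2)  # 1 / Lipschitz bound of the loss gradient
    w = np.zeros(d)
    for _ in range(iters):
        _, g = logistic_loss_grad(w, X, y)
        w = soft_threshold(w - step * g, step * lam)
    return w

Such a first-order baseline converges only linearly at best, which is the contrast the talk draws: DAL applies an augmented Lagrangian method to the dual problem and, under the stated conditions, achieves super-linear convergence.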
Keywords: computer science; optimization methods; sparse learning
Source: VideoLectures.NET
Last reviewed: 2020-06-03 by 毛岱琦 (volunteer course editor)
Views: 45