

Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
Course URL: http://videolectures.net/nipsworkshops2012_shalev_shwartz_minimiz...
Lecturer: Shai Shalev-Shwartz
Institution: The Hebrew University of Jerusalem
Date: 2013-01-16
Language: English
Abstract: Stochastic Gradient Descent (SGD) has become popular for solving large-scale supervised machine learning optimization problems such as SVM, due to its strong theoretical guarantees. While the closely related Dual Coordinate Ascent (DCA) method has been implemented in various software packages, it has so far lacked a good convergence analysis. We present a new analysis of Stochastic Dual Coordinate Ascent (SDCA) showing that this class of methods enjoys strong theoretical guarantees that are comparable to or better than those of SGD. This analysis justifies the effectiveness of SDCA for practical applications.
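To make the abstract concrete, below is a minimal sketch of SDCA applied to an L2-regularized hinge-loss SVM, using the standard closed-form per-coordinate dual update. This is an illustrative assumption-laden sketch, not code from the lecture; all names and parameter choices are hypothetical.

```python
import numpy as np

def sdca_svm_hinge(X, y, lam=0.1, epochs=50, seed=0):
    """Sketch of SDCA for min_w (1/n) sum_i hinge(y_i, w.x_i) + (lam/2)||w||^2.

    Maintains signed dual variables alpha_i (with alpha_i * y_i in [0, 1])
    and the primal iterate w = (1 / (lam * n)) * sum_i alpha_i * x_i.
    Each step maximizes the dual in one randomly chosen coordinate,
    which has a closed form for the hinge loss.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    sq_norms = np.einsum('ij,ij->i', X, X)  # ||x_i||^2, precomputed
    for _ in range(epochs):
        for i in rng.permutation(n):
            if sq_norms[i] == 0.0:
                continue
            margin = y[i] * (X[i] @ w)
            # Closed-form coordinate-wise dual maximization for hinge loss,
            # clipped so that alpha_i * y_i stays in [0, 1].
            delta = y[i] * max(0.0, min(1.0,
                lam * n * (1.0 - margin) / sq_norms[i]
                + alpha[i] * y[i])) - alpha[i]
            alpha[i] += delta
            w += (delta / (lam * n)) * X[i]  # keep w consistent with alpha
    return w, alpha
```

On linearly separable toy data this recovers a separating hyperplane in a few epochs; the dual variables also certify which points are support vectors (alpha_i * y_i > 0).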
Keywords: stochastic gradient descent; supervised machine learning; dual coordinate ascent
Source: VideoLectures.NET
Last revised: 2019-09-08:lxf
Views: 162