Exploiting the Computation Graph for Large Scale Distributed Machine Learning

Course URL: https://videolectures.net/videos/kdd2016_vishwanathan_machine_lea...
Lecturer: S.V.N. Vishwanathan
Venue: KDD 2016
Date: 2016-10-12
Language: English
Course description: Many machine learning algorithms minimize a regularized risk. It is well known that stochastic optimization algorithms are both theoretically and practically well motivated for regularized risk minimization. Unfortunately, stochastic optimization is not easy to parallelize. In this talk, we take a radically new approach and show that the saddle-point problem that arises out of the Lagrangian has a very specific computational graph structure, which can be exploited to allow for a natural partitioning of the parameters across multiple processors. This allows us to derive a new parallel stochastic optimization algorithm for regularized risk minimization. Joint work with: Inderjit Dhillon, Cho-Jui Hsieh, Shihao Ji, Shin Matsushima, Parameshwaran Raman, Hsiang-Fu Yu, and Hyokun Yun.
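The idea sketched in the abstract can be illustrated on a toy case. Below is a minimal serial sketch, not the speakers' parallel algorithm: squared loss is chosen because its convex conjugate makes the dual update closed-form, and all names, step sizes, and the `t + 100` offset are illustrative assumptions. The saddle-point objective is min_w max_a (lam/2)||w||^2 + (1/n) sum_i [a_i <w, x_i> - a_i y_i - a_i^2/2]; the coupling term a_i w_j x_ij is nonzero only where x_ij != 0, so the computation graph is bipartite between dual variables (one per sample) and primal coordinates (one per feature), which is the structure that allows partitioning both across processors.

```python
import random

def saddle_point_sgd(X, y, lam=0.1, steps=50000, seed=0):
    """Serial stochastic primal-dual updates on the saddle-point form of
    ridge regression (squared loss + L2 regularizer). Illustrative only."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    w = [0.0] * d   # primal parameters, one per feature
    a = [0.0] * n   # dual variables, one per sample
    for t in range(steps):
        i = rng.randrange(n)          # sample one data point
        xi, yi = X[i], y[i]
        # Exact ascent in the dual coordinate a_i (closed form for squared
        # loss): argmax_a  a*<w,x_i> - a*y_i - a^2/2  =  <w,x_i> - y_i
        pred = sum(w[j] * xi[j] for j in range(d))
        a[i] = pred - yi
        # Stochastic descent step in w. The coupling term a_i * x_ij touches
        # only coordinates j with x_ij != 0 -- the bipartite-graph structure
        # that makes the parameter partitioning natural.
        eta = 1.0 / (lam * (t + 100))  # decaying step size (assumed schedule)
        for j in range(d):
            w[j] -= eta * (lam * w[j] + a[i] * xi[j])
    return w
```

On a small dense problem the iterates approach the closed-form ridge solution; in a distributed setting, each processor would own a block of samples and a block of features and perform these same updates on its local edges of the bipartite graph.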
Keywords: computation graph; distributed; machine learning
Source: VideoLectures.NET
Data collected: 2025-01-07:liyq
Last reviewed: 2025-01-07:liyq
Views: 8