Optimal Distributed Online Prediction Using Mini-Batches
Course URL: http://videolectures.net/nipsworkshops2010_xiao_odo/
Lecturer: Lin Xiao
Institution: Microsoft
Date: 2011-01-13
Language: English
Abstract: Online prediction methods are typically presented as serial algorithms running on a single processor. However, in the age of web-scale prediction problems, it is increasingly common to encounter situations where a single processor cannot keep up with the high rate at which inputs arrive. In this work we present the distributed mini-batch algorithm, a method of converting any serial gradient-based online prediction algorithm into a distributed algorithm that scales nicely to multiple cores, clusters, and grids. We prove a regret bound for this method that is asymptotically optimal for smooth convex loss functions and stochastic inputs. Moreover, our regret analysis explicitly takes into account communication latencies that occur on the network. Our method can also be adapted to optimally solve the closely related distributed stochastic optimization problem.
Keywords: online prediction methods; processors; networks
Source: VideoLectures.NET
Last reviewed: 2020-07-17 (yumf)
Views: 43