

Gradient Boosted Decision Trees on Hadoop
Course URL: http://videolectures.net/nipsworkshops2010_ye_gbd/
Lecturer: Jerry Ye
Institution: Yahoo! Inc.
Date: 2013-06-19
Language: English

Course description: Stochastic Gradient Boosted Decision Trees (GBDT) is one of the most widely used learning algorithms in machine learning today. It is adaptable, easy to interpret, and produces highly accurate models. However, most implementations today are computationally expensive and require all training data to reside in main memory. As training data grows ever larger, there is strong motivation to parallelize the GBDT algorithm. Parallelizing decision tree training is intuitive, and various approaches have been explored in the existing literature. Stochastic boosting, on the other hand, is inherently a sequential process and has not previously been applied to distributed decision trees. In this paper, we describe a distributed implementation of GBDT, presented by us at CIKM 2009, that utilizes MPI on the Hadoop grid environment.
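
To make the algorithm concrete, here is a minimal, single-machine sketch of stochastic gradient boosting for squared-error regression. It is an illustration only, not the distributed MPI-on-Hadoop implementation the lecture describes: the function names fit_gbdt and predict_gbdt, the parameter choices, and the use of scikit-learn's DecisionTreeRegressor are all assumptions made for this sketch.

```python
# A minimal single-machine sketch (illustrative, not the authors' system) of
# stochastic gradient boosting for squared-error regression: each round fits a
# shallow regression tree to the current residuals on a random subsample and
# adds it to the ensemble with a shrinkage factor.
import numpy as np
from sklearn.tree import DecisionTreeRegressor


def fit_gbdt(X, y, n_rounds=100, learning_rate=0.1, max_depth=4,
             subsample=0.5, seed=0):
    rng = np.random.default_rng(seed)
    base = float(y.mean())               # F_0: constant initial prediction
    pred = np.full(len(y), base)
    trees = []
    for _ in range(n_rounds):
        residual = y - pred              # negative gradient of squared loss
        idx = rng.choice(len(y), size=max(1, int(subsample * len(y))),
                         replace=False)  # the "stochastic" part: row subsampling
        tree = DecisionTreeRegressor(max_depth=max_depth, random_state=seed)
        tree.fit(X[idx], residual[idx])  # the per-round tree build is the step a
                                         # distributed implementation spreads out
        pred += learning_rate * tree.predict(X)
        trees.append(tree)
    return base, trees


def predict_gbdt(base, trees, X, learning_rate=0.1):
    # Sum the shrunken contributions of every boosted tree.
    return base + learning_rate * sum(tree.predict(X) for tree in trees)
```

The loop makes the sequential nature of boosting visible: each tree depends on the residuals left by all previous trees, so the rounds themselves cannot run concurrently. A distributed version such as the one described in the lecture therefore has to parallelize the work inside each round, for example the split search over partitioned training data, which is what motivates running it on a Hadoop grid with MPI when the data no longer fits in one machine's memory.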
Keywords: decision trees; machine learning; learning algorithms
Course source: 视频讲座网 (VideoLectures.NET)
Data collected: 2020-10-28:zyk
Last reviewed: 2020-10-28:zyk
Views: 67