

Training Big Random Forests with Little Resources
Course URL: http://videolectures.net/kdd2018_gieseke_training_forests/
Lecturer: Fabian Gieseke
Institution: Department of Computer Science, TU Dortmund University
Date: 2018-11-23
Language: English
Course description: Without access to large compute clusters, building random forests on large datasets is still a challenging problem. This is, in particular, the case if fully-grown trees are desired. We propose a simple yet effective framework that makes it possible to efficiently construct ensembles of huge trees for hundreds of millions or even billions of training instances using a cheap desktop computer with commodity hardware. The basic idea is to consider a multi-level construction scheme, which builds top trees for small random subsets of the available data and which subsequently distributes all training instances to the top trees' leaves for further processing. While being conceptually simple, the overall efficiency crucially depends on the particular implementation of the different phases. The practical merits of our approach are demonstrated using dense datasets with hundreds of millions of training instances.
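The multi-level scheme described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes scikit-learn is available and uses a small in-memory dataset, whereas the framework in the lecture targets data far larger than main memory. The depth limit, subset size, and fallback behavior for pure leaves are illustrative choices.

```python
# Minimal sketch of the multi-level construction scheme from the abstract:
# (1) fit a shallow "top tree" on a small random subset,
# (2) distribute ALL instances to the top tree's leaves,
# (3) grow a fully-grown "bottom tree" per leaf.
# Hypothetical parameters; NOT the authors' out-of-core implementation.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Phase 1: top tree on a small random subset of the data.
subset = rng.choice(len(X), size=1_000, replace=False)
top = DecisionTreeClassifier(max_depth=3, random_state=0)
top.fit(X[subset], y[subset])

# Phase 2: route every training instance to a top-tree leaf.
leaf_ids = top.apply(X)

# Phase 3: one fully-grown bottom tree per leaf (skipping pure leaves).
bottom = {}
for leaf in np.unique(leaf_ids):
    mask = leaf_ids == leaf
    if len(np.unique(y[mask])) > 1:
        t = DecisionTreeClassifier(random_state=0)  # no depth limit
        t.fit(X[mask], y[mask])
        bottom[leaf] = t

def predict(Xq):
    """Route each query through the top tree, then its bottom tree."""
    leaves = top.apply(Xq)
    out = np.empty(len(Xq), dtype=int)
    for i, (x, leaf) in enumerate(zip(Xq, leaves)):
        if leaf in bottom:
            out[i] = bottom[leaf].predict(x.reshape(1, -1))[0]
        else:
            # Pure leaf: the top tree's own prediction suffices.
            out[i] = top.predict(x.reshape(1, -1))[0]
    return out

acc = (predict(X) == y).mean()
```

Repeating this for several random subsets yields an ensemble of such two-level trees; the lecture's point is that phases 2 and 3 can be implemented efficiently enough on commodity hardware to scale to billions of instances.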
Keywords: large compute clusters; building random forests; cheap desktop computer; top trees built from small random subsets
Source: VideoLectures.NET
Data collected: 2023-01-30:cyh
Last reviewed: 2023-01-30:cyh
Views: 21