

Scalable Look-Ahead Linear Regression Trees
Course URL: http://videolectures.net/kdd07_vogel_slalrt/
Lecturer: David Vogel
Institution: University of Central Florida
Date: 2007-08-15
Language: English
Abstract: The motivation behind Look-ahead Linear Regression Trees (LLRT) is that out of all the methods proposed to date, there has been no scalable approach to exhaustively evaluate all possible models in the leaf nodes in order to obtain an optimal split. Using several optimizations, LLRT is able to generate and evaluate thousands of linear regression models per second. This allows for a near-exhaustive evaluation of all possible splits in a node, based on the quality of fit of linear regression models in the resulting branches. We decompose the calculation of the Residual Sum of Squares in such a way that a large part of it is pre-computed. The resulting method is highly scalable. We observe it to obtain high predictive accuracy for problems with strong mutual dependencies between attributes. We report on experiments with two simulated and seven real data sets.
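To make the abstract's RSS decomposition concrete, below is a minimal Python sketch, not the paper's implementation: it caches the cross-product statistics X'X, X'y and y'y, updates them incrementally while scanning the candidate thresholds of one numeric attribute, and scores each split by the summed RSS of ordinary least-squares fits in the two branches. The function names (`best_split_on_attribute`, `rss_from_stats`) and the `min_leaf` parameter are illustrative assumptions, not part of the lecture.

```python
# A minimal sketch (not the paper's implementation) of evaluating every
# candidate split of one numeric attribute by the residual sum of squares
# (RSS) of linear models fit in the two branches. The cross-product
# statistics X'X, X'y and y'y are cached and updated incrementally, so each
# candidate fit only requires solving a small linear system.
import numpy as np

def rss_from_stats(xtx, xty, yty):
    """RSS = y'y - (X'y)' beta, where beta solves (X'X) beta = X'y."""
    try:
        beta = np.linalg.solve(xtx, xty)
    except np.linalg.LinAlgError:
        return np.inf  # singular system: branch cannot support a linear model
    return float(yty - xty @ beta)

def best_split_on_attribute(X, y, j, min_leaf=10):
    """Scan all thresholds on attribute j; return (best RSS, threshold)."""
    order = np.argsort(X[:, j])
    Xs, ys = X[order], y[order]
    Xd = np.hstack([np.ones((len(ys), 1)), Xs])      # intercept column

    # Totals are pre-computed once; right-branch statistics are obtained by
    # subtracting the left-branch statistics from them.
    xtx_tot, xty_tot, yty_tot = Xd.T @ Xd, Xd.T @ ys, float(ys @ ys)
    xtx_l = np.zeros_like(xtx_tot)
    xty_l = np.zeros_like(xty_tot)
    yty_l = 0.0

    best_rss, best_thr = np.inf, None
    for i in range(len(ys) - 1):
        row = Xd[i]
        xtx_l += np.outer(row, row)                  # O(d^2) rank-one update
        xty_l += row * ys[i]
        yty_l += ys[i] ** 2
        n_left, n_right = i + 1, len(ys) - (i + 1)
        if n_left < min_leaf or n_right < min_leaf:
            continue
        if Xs[i, j] == Xs[i + 1, j]:
            continue                                 # not a usable threshold
        rss = (rss_from_stats(xtx_l, xty_l, yty_l) +
               rss_from_stats(xtx_tot - xtx_l, xty_tot - xty_l,
                              yty_tot - yty_l))
        if rss < best_rss:
            best_rss, best_thr = rss, (Xs[i, j] + Xs[i + 1, j]) / 2.0
    return best_rss, best_thr
```

In this sketch each candidate split costs one rank-one update of the cached statistics plus two small linear solves, rather than a regression refit from raw data; the abstract's figure of thousands of regression models per second rests on this kind of reuse, though the actual LLRT optimizations go beyond what is shown here.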
Keywords: Look-ahead Linear Regression Trees; linear regression models; strong attribute dependencies
Source: VideoLectures.NET
Last reviewed: 2019-05-09 (lxf)
Views: 51