

Boosting Products of Base Classifiers
Course URL: http://videolectures.net/icml09_kegl_bpbc/
Lecturer: Balazs Kegl
Institution: University of Paris
Date: 2009-08-26
Language: English
Description: In this paper we show how to boost products of simple base learners. Similarly to trees, we invoke the base learner as a subroutine, but in an iterative rather than recursive fashion. The main advantage of the proposed method is its simplicity and computational efficiency. On benchmark datasets, our boosted products of decision stumps clearly outperform boosted trees, and on the MNIST dataset the algorithm achieves the second-best result among no-domain-knowledge algorithms, after deep belief nets. As a second contribution, we present an improved base learner for nominal features and show that boosting the product of two of these new subset indicator base learners solves the maximum margin matrix factorization problem used to formalize the collaborative filtering task. On a small benchmark dataset, we get experimental results comparable to the semi-definite-programming-based solution, but at a much lower computational cost.
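The core idea in the abstract can be illustrated compactly. Below is a minimal, hypothetical Python sketch: AdaBoost whose weak hypotheses are products of decision stumps, with each factor of the product refit iteratively (not recursively, unlike tree induction) while the other factors are held fixed. All function and parameter names (`fit_stump`, `fit_product`, `boost_products`, `T`, `m`) are our own; this is an illustration of the concept, not the authors' implementation.

```python
# Hypothetical sketch: AdaBoost over *products* of decision stumps.
# Labels are in {-1, +1}; a product h(x) = s_1(x) * ... * s_m(x) of
# stumps can represent concepts (e.g. XOR-like ones) that a single
# stump cannot. All names here are ours, not the paper's.
import numpy as np

def fit_stump(X, y, w):
    """Weighted-error-minimizing stump: (feature, threshold, polarity)."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = pol * np.where(X[:, j] >= thr, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best, best_err = (j, thr, pol), err
    return best

def stump_predict(stump, X):
    j, thr, pol = stump
    return pol * np.where(X[:, j] >= thr, 1, -1)

def fit_product(X, y, w, m=2, n_pass=2):
    """Fit a product of m stumps by coordinate-wise refitting:
    stump j is refit to the target y * (product of the other stumps)."""
    stumps, preds = [], []
    for _ in range(m):                      # greedy initialization
        others = np.prod(preds, axis=0) if preds else np.ones(len(y), int)
        s = fit_stump(X, y * others, w)
        stumps.append(s)
        preds.append(stump_predict(s, X))
    for _ in range(n_pass):                 # refinement passes
        for j in range(m):
            others = np.ones(len(y), int)
            for k in range(m):
                if k != j:
                    others *= preds[k]
            stumps[j] = fit_stump(X, y * others, w)
            preds[j] = stump_predict(stumps[j], X)
    return stumps

def boost_products(X, y, T=10, m=2):
    """AdaBoost with product-of-stumps weak hypotheses."""
    w = np.ones(len(y)) / len(y)
    ensemble = []
    for _ in range(T):
        stumps = fit_product(X, y, w, m=m)
        h = np.prod([stump_predict(s, X) for s in stumps], axis=0)
        err = np.clip(np.sum(w[h != y]), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, stumps))
        w *= np.exp(-alpha * y * h)     # standard AdaBoost reweighting
        w /= w.sum()
    return ensemble

def predict(ensemble, X):
    score = sum(a * np.prod([stump_predict(s, X) for s in ss], axis=0)
                for a, ss in ensemble)
    return np.sign(score)
```

Note that the coordinate-wise refitting is a local search: it is cheap (each step is one stump fit), which reflects the simplicity and efficiency the abstract emphasizes, but it can stall on hard targets when started from poor initial stumps.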
Keywords: benchmark datasets; nominal features; subset indicators; semi-definite programming
Source: VideoLectures.NET
Last reviewed: 2019-12-07 (lxf)
Views: 49