


Random Classification Noise Defeats All Convex Potential Boosters
Course URL: http://videolectures.net/icml08_long_rcn/
Lecturer: Phil Long
Institution: Perception Technology Co., Ltd., USA
Date: 2008-08-06
Language: English
Abstract: A broad class of boosting algorithms can be interpreted as performing coordinate-wise gradient descent to minimize some potential function of the margins of a data set. This class includes AdaBoost, LogitBoost, and other widely used and well-studied boosters. In this paper we show that for a broad class of convex potential functions, any such boosting algorithm is highly susceptible to random classification noise. We do this by showing that for any such booster and any nonzero random classification noise rate R, there is a simple data set of examples which is efficiently learnable by such a booster if there is no noise, but which cannot be learned to accuracy better than 1/2 if there is random classification noise at rate R. This negative result is in contrast with known branching-program-based boosters, which do not fall into the convex potential function framework and which can provably learn to high accuracy in the presence of random classification noise.
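To make the setup in the abstract concrete, the following is a minimal sketch (not the paper's construction) of the kind of booster the result applies to: coordinate-wise gradient descent on the convex exponential potential of the margins, the potential minimized by AdaBoost. The data set, step size, and round count are illustrative assumptions, not taken from the talk.

```python
import math

def potential(w, X, y):
    """Phi(w) = sum_i exp(-y_i <w, x_i>): a convex potential of the margins
    (the exponential potential minimized by AdaBoost)."""
    return sum(math.exp(-yi * sum(wj * xj for wj, xj in zip(w, xi)))
               for xi, yi in zip(X, y))

def coord_descent_boost(X, y, rounds=50, step=0.1):
    """Coordinate-wise gradient descent on Phi: each round, update only the
    single coordinate (base-classifier weight) with the steepest gradient."""
    d = len(X[0])
    w = [0.0] * d
    for _ in range(rounds):
        margins = [yi * sum(wj * xj for wj, xj in zip(w, xi))
                   for xi, yi in zip(X, y)]
        grads = [sum(-yi * xi[j] * math.exp(-m)
                     for xi, yi, m in zip(X, y, margins))
                 for j in range(d)]
        j = max(range(d), key=lambda k: abs(grads[k]))  # steepest coordinate
        w[j] -= step * grads[j]
    return w

# Tiny noiseless data set (an illustrative example): label = sign of the
# first coordinate. The noise model in the paper would flip each label
# independently with probability R before training.
X = [(1, 1), (1, -1), (-1, 1), (-1, -1)]
y = [1, 1, -1, -1]
w = coord_descent_boost(X, y)
preds = [1 if sum(wj * xj for wj, xj in zip(w, xi)) >= 0 else -1 for xi in X]
print(sum(p == yi for p, yi in zip(preds, y)) / len(y))  # → 1.0
```

On clean data the booster drives the margins up and classifies perfectly; the paper's result says that for any booster of this form, a suitably chosen (still simple) data set plus label noise at any rate R > 0 forces accuracy no better than 1/2.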
Keywords: data set; random classification noise; booster; convex potential function
Source: VideoLectures.NET
Last reviewed: 2020-06-23 (liqy)
Views: 38