Robustness properties of support vector machines and related methods |
|
Course URL: | http://videolectures.net/mcslw04_christmann_rpsvm/
Lecturer: | Andreas Christmann
Institution: | University of Bayreuth
Date: | 2007-02-25
Language: | English
Course description: | The talk brings together methods from two disciplines: machine learning theory and robust statistics. We argue that robustness is an important aspect, and we show that many existing machine learning methods based on convex risk minimization have, besides other good properties, the advantage of being robust if the kernel and the loss function are chosen appropriately. Our results cover classification and regression problems. Assumptions are given for the existence of the influence function and for bounds on the influence function. Kernel logistic regression, support vector machines, least squares, and the AdaBoost loss function are treated as special cases. We also consider Robust Learning from Bites, a simple method that makes some convex risk minimization methods applicable to huge data sets for which currently available algorithms are much too slow. As an example, we use a data set from 15 German insurance companies.
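The "Robust Learning from Bites" idea mentioned above, fitting a convex risk minimizer on each of several disjoint subsets ("bites") of a huge data set and then aggregating the per-bite estimates, can be sketched as follows. This is a minimal illustration only, not the exact procedure from the talk: it assumes a plain least-squares line fit as the base learner and a coordinate-wise median as the (robust) aggregation step.

```python
import random
import statistics

def fit_line(xs, ys):
    """Closed-form ordinary least-squares fit y ~ a + b*x for one bite."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    a = my - b * mx
    return a, b

def learn_from_bites(xs, ys, n_bites=5, seed=0):
    """Split the data into disjoint random bites, fit each bite
    separately, and aggregate estimates by the coordinate-wise median."""
    rng = random.Random(seed)
    idx = list(range(len(xs)))
    rng.shuffle(idx)
    bites = [idx[i::n_bites] for i in range(n_bites)]  # disjoint subsets
    fits = [fit_line([xs[i] for i in b], [ys[i] for i in b]) for b in bites]
    a = statistics.median(f[0] for f in fits)
    b = statistics.median(f[1] for f in fits)
    return a, b

# Synthetic data: y = 1 + 2x plus Gaussian noise.
rng = random.Random(42)
xs = [rng.uniform(0, 10) for _ in range(1000)]
ys = [1.0 + 2.0 * x + rng.gauss(0, 0.5) for x in xs]
a, b = learn_from_bites(xs, ys)
```

Each bite is small enough to fit with a standard solver, and the median aggregation keeps a single badly contaminated bite from dominating the combined estimate, which is the scalability-plus-robustness motivation described in the abstract.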
Keywords: | machine learning theory; robust statistics; classification; regression problems
Source: | VideoLectures.NET
Last reviewed: | 2020-06-29: yumf
Views: | 63