


Learning Kernels via Margin-and-Radius Ratios
Course URL: http://videolectures.net/nipsworkshops2010_gai_lkm/
Lecturer: Kun Gai
Institution: Tsinghua University
Date: 2011-01-12
Language: English
Description: Most existing MKL approaches employ the large margin principle to learn kernels. However, we point out that the margin itself cannot well describe the goodness of a kernel, because it neglects scaling. We use the ratio between the margin and the radius of the minimal enclosing ball of the data in the feature space endowed with a kernel to measure how good the kernel is, and propose a new scaling-invariant formulation for kernel learning. The presented formulation can handle both linear and nonlinear combination kernels. In the linear combination case, it is invariant not only to the type of norm constraint on the combination coefficients but also to the initial scalings of the basis kernels. By establishing the differentiability of a general class of multilevel optimal value functions, we present a simple and efficient gradient-based kernel learning algorithm. Experiments show that our approach significantly outperforms other state-of-the-art kernel learning methods.
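The margin-to-radius ratio described above can be estimated numerically. Below is a minimal sketch, not the lecture's actual method: it assumes scikit-learn is available, obtains the margin from a near hard-margin SVM trained on a precomputed kernel, and replaces the true minimal-enclosing-ball radius (which requires a small QP) with a cheap centroid-based bound. The toy data, `gamma` values, and helper names are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

# Toy two-class data (hypothetical, for illustration only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.3, (20, 2)),
               rng.normal(+1.0, 0.3, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)

def rbf_kernel(X, gamma):
    """Gram matrix of the Gaussian RBF kernel."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def margin_radius_ratio(K, y, C=1e6):
    """Estimate margin / radius for a precomputed kernel matrix K."""
    # Margin: 1/||w|| from a (near) hard-margin SVM; ||w||^2 is recovered
    # from the dual coefficients alpha_i * y_i of the support vectors.
    svm = SVC(kernel='precomputed', C=C).fit(K, y)
    a = svm.dual_coef_.ravel()
    sv = svm.support_
    w2 = a @ K[np.ix_(sv, sv)] @ a
    margin = 1.0 / np.sqrt(w2)
    # Radius: surrogate using the ball centred at the feature-space mean;
    # ||phi(x_i) - mu||^2 = K_ii - (2/n) sum_j K_ij + (1/n^2) sum_jk K_jk.
    dist2 = K.diagonal() - 2.0 * K.mean(axis=1) + K.mean()
    radius = np.sqrt(dist2.max())
    return margin / radius

for gamma in (0.1, 1.0, 10.0):
    K = rbf_kernel(X, gamma)
    print(gamma, margin_radius_ratio(K, y))
```

Note that scaling the kernel by a constant t scales both margin and radius by sqrt(t), so the ratio is (up to solver tolerance) invariant to kernel scaling, while the margin alone is not; this is the scaling issue the talk addresses.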
Keywords: large margin principle; kernels; linear combination
Source: VideoLectures.NET
Last reviewed: 2019-09-07:lxf
Views: 37