

Metric Embedding for Kernel Classification Rules
Course URL: http://videolectures.net/icml08_sriperumbudur_me/  
Lecturer: Bharath K. Sriperumbudur
Institution: University of California, San Diego
Date: 2008-08-29
Language: English
Course description: In this paper, we consider a smoothing kernel-based classification rule and propose an algorithm for optimizing the performance of the rule by learning the bandwidth of the smoothing kernel along with a data-dependent distance metric. The data-dependent distance metric is obtained by learning a function that embeds an arbitrary metric space into a Euclidean space while minimizing an upper bound on the resubstitution estimate of the error probability of the kernel classification rule. By restricting this embedding function to a reproducing kernel Hilbert space, we reduce the problem to solving a semidefinite program and show the resulting kernel classification rule to be a variation of the k-nearest neighbor rule. We compare the performance of the kernel rule (using the learned data-dependent distance metric) to state-of-the-art distance metric learning algorithms (designed for k-nearest neighbor classification) on some benchmark datasets. The results show that the proposed rule achieves classification accuracy as good as or better than the other metric learning algorithms.
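To make the flavor of such a rule concrete, here is a minimal sketch of a smoothing-kernel classifier that measures distances through a linear map `L` (a Mahalanobis-type metric). This is an illustration only: the function name `kernel_rule_predict`, the Gaussian kernel choice, and the fixed matrix `L` are assumptions for the example; the paper itself learns the embedding (and the bandwidth) by solving a semidefinite program over a reproducing kernel Hilbert space rather than fixing `L` in advance.

```python
import numpy as np

def kernel_rule_predict(X_train, y_train, X_test, L, h):
    """Smoothing-kernel classification rule with a linear metric.

    Distances are measured as ||L (x - x_i)||, i.e. a learned
    Mahalanobis-type metric; a Gaussian smoothing kernel with
    bandwidth h weights each training point's vote. Labels are
    assumed to be in {-1, +1}.
    """
    preds = []
    for x in X_test:
        diffs = (X_train - x) @ L.T           # map differences through L
        d2 = np.sum(diffs ** 2, axis=1)       # squared learned distances
        w = np.exp(-d2 / (2.0 * h ** 2))      # Gaussian kernel weights
        # kernel-weighted majority vote over the training labels
        preds.append(1 if np.dot(w, y_train) >= 0 else -1)
    return np.array(preds)
```

Replacing the Gaussian kernel with a box kernel that is 1 inside a radius and 0 outside recovers a nearest-neighbor-style vote, which is the sense in which the learned rule is a variation of the k-nearest neighbor rule.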
Keywords: Euclidean space; distance metric learning; kernel methods; reproducing kernel Hilbert space
Source: VideoLectures.NET
Last reviewed: 2020-06-03 by Zhang Ying (volunteer course editor)
Views: 87