Best Paper - Information-Theoretic Metric Learning
Course URL: http://videolectures.net/icml07_kulis_itml/
Lecturer: Brian Kulis
Institution: University of Texas
Date: 2007-06-22
Language: English
Course description: In this paper, we present an information-theoretic approach to learning a Mahalanobis distance function. We formulate the problem as that of minimizing the differential relative entropy between two multivariate Gaussians under constraints on the distance function. We express this problem as a particular Bregman optimization problem: that of minimizing the LogDet divergence subject to linear constraints. Our resulting algorithm has several advantages over existing methods. First, our method can handle a wide variety of constraints and can optionally incorporate a prior on the distance function. Second, it is fast and scalable. Unlike most existing methods, no eigenvalue computations or semi-definite programming are required. We also present an online version and derive regret bounds for the resulting algorithm. Finally, we evaluate our method on a recent error reporting system for software called Clarify, in the context of metric learning for nearest neighbor classification, as well as on standard data sets.
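
To make the formulation in the description concrete: the learned Mahalanobis matrix A is chosen to stay close (in LogDet divergence) to a prior A_0 while satisfying pairwise distance constraints. A sketch of the problem in standard notation (the thresholds u and ℓ and the prior A_0 are conventional choices, not spelled out above):

\min_{A \succeq 0} \; D_{\ell d}(A, A_0)
\quad \text{s.t.} \quad
d_A(x_i, x_j) \le u \;\; \text{for similar pairs}, \qquad
d_A(x_i, x_j) \ge \ell \;\; \text{for dissimilar pairs},

\text{where } d_A(x, y) = (x - y)^\top A \,(x - y), \qquad
D_{\ell d}(A, A_0) = \mathrm{tr}(A A_0^{-1}) - \log\det(A A_0^{-1}) - d.

Below is a minimal Python sketch of the cyclic Bregman-projection solver the description alludes to, loosely following the update rules published with this paper. Each constraint yields a rank-one update of A, which is why no eigenvalue computation or semi-definite programming is needed. The function name itml_sketch, the constraint encoding, and the fixed sweep count are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def itml_sketch(X, constraints, u, l, gamma=1.0, n_sweeps=50):
    """Cyclic Bregman projections for the LogDet problem above.

    constraints: list of (i, j, delta) with delta = +1 for similar pairs
    (target distance <= u) and delta = -1 for dissimilar pairs (>= l).
    gamma trades off constraint satisfaction against staying near the prior.
    """
    d = X.shape[1]
    A = np.eye(d)                          # identity prior A_0
    lam = np.zeros(len(constraints))       # one dual variable per constraint
    xi = np.array([u if delta == 1 else l for (_, _, delta) in constraints],
                  dtype=float)             # per-constraint slack targets
    for _ in range(n_sweeps):
        for c, (i, j, delta) in enumerate(constraints):
            v = X[i] - X[j]
            p = float(v @ A @ v)           # current Mahalanobis distance
            if p == 0.0:
                continue
            alpha = min(lam[c], 0.5 * delta * (1.0 / p - gamma / xi[c]))
            beta = delta * alpha / (1.0 - delta * alpha * p)
            xi[c] = gamma * xi[c] / (gamma + delta * alpha * xi[c])
            lam[c] -= alpha
            # rank-one update: no eigendecomposition or SDP solve required
            A = A + beta * np.outer(A @ v, A @ v)
    return A

In the nearest-neighbor setting the description mentions, constraints would typically be drawn from class labels (same class → similar pair), and the returned A would then be plugged into a k-NN classifier in place of the Euclidean metric.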
Keywords: distance function; multivariate Gaussian; differential relative entropy
Source: VideoLectures.NET
Last reviewed: 2020-06-15:heyf
Views: 204