

Information-Theoretic Metric Learning
Course URL: http://videolectures.net/lce06_davis_itml/
Lecturer: Jason Davis
Institution: Stanford University
Date: 2007-02-25
Language: English
Course description: We formulate the metric learning problem as that of minimizing the differential relative entropy between two multivariate Gaussians under constraints on the Mahalanobis distance function. Via a surprising equivalence, we show that this problem can be solved as a low-rank kernel learning problem. Specifically, we minimize the Burg divergence of a low-rank kernel to an input kernel, subject to pairwise distance constraints. Our approach has several advantages over existing methods. First, we present a natural information-theoretic formulation for the problem. Second, the algorithm utilizes the methods developed by Kulis et al. [6], which do not involve any eigenvector computation; in particular, the running time of our method is faster than that of most existing techniques. Third, the formulation offers insights into the connections between metric learning and kernel learning.
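The abstract compresses two technical claims that are easier to see written out. In the paper's notation (the symbols A, A_0, u, l and the pair sets S, D below are assumed from the ITML paper, not stated on this page), the problem is to fit a Mahalanobis matrix A ≻ 0 by

```latex
\min_{A \succ 0} \; \mathrm{KL}\big(p(x; A_0) \,\|\, p(x; A)\big)
\quad \text{s.t.} \quad
d_A(x_i, x_j) \le u \;\; \text{for } (i,j) \in S,
\qquad
d_A(x_i, x_j) \ge l \;\; \text{for } (i,j) \in D,
```

where d_A(x_i, x_j) = (x_i - x_j)^T A (x_i - x_j) and p(x; A) is the multivariate Gaussian with precision matrix A. The "surprising equivalence" rests on the identity that, for equal-mean Gaussians, the differential relative entropy reduces to a matrix divergence,

```latex
\mathrm{KL}\big(p(x; A_0) \,\|\, p(x; A)\big)
= \tfrac{1}{2}\, D_{\mathrm{Burg}}(A, A_0),
\qquad
D_{\mathrm{Burg}}(A, A_0) = \mathrm{tr}(A A_0^{-1}) - \log\det(A A_0^{-1}) - n,
```

which is exactly the Burg (LogDet) divergence that the kernel-learning formulation minimizes. As a rough illustration of the "no eigenvector computation" claim, here is a minimal sketch of the cyclic Bregman-projection update this family of methods uses; the function name, the pairs encoding, and the stopping test are hypothetical conveniences, and the update formulas are written from the published ITML algorithm (Davis et al.), not from anything on this page:

```python
import numpy as np

def itml_sketch(X, pairs, u=1.0, l=10.0, gamma=1.0, max_sweeps=100, tol=1e-3):
    """Sketch of ITML-style Bregman projections (after Davis et al.).

    X     -- (n, d) data matrix.
    pairs -- list of (i, j, delta): delta = +1 for similar pairs
             (want d_A(x_i, x_j) <= u), delta = -1 for dissimilar
             pairs (want d_A(x_i, x_j) >= l).
    gamma -- slack trade-off parameter from the paper.
    Returns a positive-definite Mahalanobis matrix A (prior A_0 = I here).
    """
    d = X.shape[1]
    A = np.eye(d)                                   # prior A_0 = identity
    lam = np.zeros(len(pairs))                      # one dual variable per constraint
    xi = np.array([u if dlt > 0 else l for _, _, dlt in pairs], dtype=float)
    for _ in range(max_sweeps):
        biggest = 0.0
        for c, (i, j, dlt) in enumerate(pairs):
            v = X[i] - X[j]
            p = float(v @ A @ v)                    # current Mahalanobis distance
            if p <= 0.0:
                continue
            alpha = min(lam[c], dlt / 2.0 * (1.0 / p - gamma / xi[c]))
            beta = dlt * alpha / (1.0 - dlt * alpha * p)
            xi[c] = gamma * xi[c] / (gamma + dlt * alpha * xi[c])
            lam[c] -= alpha
            Av = A @ v
            A += beta * np.outer(Av, Av)            # rank-one Bregman projection
            biggest = max(biggest, abs(alpha))
        if biggest < tol:                           # crude convergence test (assumption)
            break
    return A
```

Each pass applies a rank-one update of A per constraint, so the cost per projection is O(d^2) with no eigendecomposition, which is the source of the running-time claim in the abstract.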
Keywords: metric learning; distance function; differential relative entropy
Course source: VideoLectures.NET
Last reviewed: 2020-06-12 by 王勇彬 (volunteer course editor)
Views: 318