

Unifying Subspace and Distance Metric Learning with Bhattacharyya Coefficient for Image Classification
Course URL: http://videolectures.net/etvc08_metaxas_usadm/
Lecturer: Dimitris N. Metaxas
Institution: Rutgers, The State University of New Jersey
Date: 2008-12-05
Language: English
Abstract: In this talk, we propose a unified scheme of subspace and distance metric learning under the Bayesian framework for image classification. According to the local distribution of the data, we divide the k-nearest neighbors of each sample into an intra-class set and an inter-class set, and we aim to learn a distance metric in the embedding subspace that makes the distances between the sample and its intra-class set smaller than the distances between it and its inter-class set. To reach this goal, we consider the intra-class distances and the inter-class distances to come from two different probability distributions, and we model the goal as minimizing the overlap between the two distributions. Inspired by Bayesian classification error estimation, we formulate the objective function as minimizing the Bhattacharyya coefficient between the two distributions. We further extend it with the kernel trick to learn a nonlinear distance metric. The power and generality of the proposed approach are demonstrated by a series of experiments on the CMU-PIE face database, the extended YALE face database, and the COREL-5000 natural image database.
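The objective sketched in the abstract can be illustrated with a small numerical example: project the data into a subspace, split each sample's k-nearest neighbors into intra-class and inter-class sets, model the two resulting distance populations as 1-D Gaussians, and measure their overlap with the Bhattacharyya coefficient. This is a minimal sketch under that Gaussian assumption; the function names, the fixed projection matrix `L`, and the parameter choices are illustrative, not taken from the talk itself.

```python
import numpy as np

def bhattacharyya_coeff_gauss(m1, v1, m2, v2):
    """Bhattacharyya coefficient BC = exp(-D_B) between two 1-D Gaussians
    N(m1, v1) and N(m2, v2), using the closed-form Bhattacharyya distance."""
    db = (0.25 * (m1 - m2) ** 2 / (v1 + v2)
          + 0.5 * np.log((v1 + v2) / (2.0 * np.sqrt(v1 * v2))))
    return np.exp(-db)

def overlap_objective(L, X, y, k=5):
    """Overlap between the intra-class and inter-class distance
    distributions of k-nearest neighbors, in the subspace given by L.
    Smaller values mean better separation (BC = 1 means total overlap)."""
    Z = X @ L.T                                           # embed the samples
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    intra, inter = [], []
    for i in range(len(X)):
        nn = np.argsort(d2[i])[1:k + 1]                   # k-NN, excluding self
        for j in nn:
            (intra if y[j] == y[i] else inter).append(d2[i, j])
    intra, inter = np.asarray(intra), np.asarray(inter)
    return bhattacharyya_coeff_gauss(intra.mean(), intra.var() + 1e-12,
                                     inter.mean(), inter.var() + 1e-12)
```

In the actual method this overlap would be minimized over the metric (and kernelized for the nonlinear case); here `L` is simply held fixed to show how the objective value is computed.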
Keywords: Bayesian framework; image classification; embedding subspace
Source: 视频讲座网
Last revised: 2019-04-14 (lxf)
Views: 94