Metric Learning for Large Scale Image Classification: Generalizing to New Classes at Near-Zero Cost
Course URL: http://videolectures.net/eccv2012_mensink_learning/
Lecturer: Thomas Mensink
Institution: University of Oxford
Date: 2012-11-12
Language: English

Course description: We are interested in large-scale image classification and especially in the setting where images corresponding to new or existing classes are continuously added to the training set. Our goal is to devise classifiers which can incorporate such images and classes on-the-fly at (near) zero cost. We cast this problem into one of learning a metric which is shared across all classes and explore k-nearest neighbor (k-NN) and nearest class mean (NCM) classifiers. We learn metrics on the ImageNet 2010 challenge data set, which contains more than 1.2M training images of 1K classes. Surprisingly, the NCM classifier compares favorably to the more flexible k-NN classifier, and has comparable performance to linear SVMs. We also study the generalization performance, among others by using the learned metric on the ImageNet-10K dataset, and we obtain competitive performance. Finally, we explore zero-shot classification, and show how the zero-shot model can be combined very effectively with small training datasets.
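The "near-zero cost" claim rests on how an NCM classifier works: each class is represented solely by the mean of its training features, and test points are assigned to the class with the nearest (projected) mean. Below is a minimal NumPy sketch of this idea. The class name `NearestClassMean`, the optional projection matrix `W` (standing in for the low-rank Mahalanobis metric the lecture learns with a multinomial logistic loss), and the `add_class` helper are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class NearestClassMean:
    """Sketch of a nearest class mean (NCM) classifier.

    Each class is summarized by its feature mean, so integrating a
    new class costs only one mean computation. Distances are taken
    as ||W x - W mu_c||^2, i.e. a (low-rank) Mahalanobis metric when
    a learned projection W is supplied; plain Euclidean otherwise.
    """

    def __init__(self, W=None):
        self.W = W          # assumed given; the lecture learns it
        self.means = {}     # class label -> feature mean

    def fit(self, X, y):
        for c in np.unique(y):
            self.means[c] = X[y == c].mean(axis=0)
        return self

    def add_class(self, c, Xc):
        # near-zero cost: just store the new class mean
        self.means[c] = Xc.mean(axis=0)

    def _project(self, X):
        return X if self.W is None else X @ self.W.T

    def predict(self, X):
        classes = sorted(self.means)
        M = self._project(np.stack([self.means[c] for c in classes]))
        Z = self._project(np.atleast_2d(X))
        # squared distances from every point to every class mean
        d2 = ((Z[:, None, :] - M[None, :, :]) ** 2).sum(-1)
        return np.array([classes[i] for i in d2.argmin(axis=1)])
```

Because all classes share the single metric `W`, nothing needs retraining when `add_class` introduces a new category; this is what distinguishes NCM from per-class models such as one-vs-rest linear SVMs.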
Keywords: image classification; classifiers; dataset classification; image training
Source: VideoLectures.NET
Data collected: 2021-08-21: zyk
Last reviewed: 2021-08-28: zyk
Views: 93