What Helps Where - And Why? Semantic Relatedness for Knowledge Transfer
Course URL: http://videolectures.net/cvpr2010_rohrbach_whw/
Lecturer: Marcus Rohrbach
Institution: University of California
Date: 2010-07-19
Language: English

Course description: Remarkable performance has been reported to recognize single object classes. Scalability to large numbers of classes however remains an important challenge for today’s recognition methods. Several authors have promoted knowledge transfer between classes as a key ingredient to address this challenge. However, in previous work the decision which knowledge to transfer has required either manual supervision or at least a few training examples, limiting the scalability of these approaches. In this work we explicitly address the question of how to automatically decide which information to transfer between classes without the need of any human intervention. For this we tap into linguistic knowledge bases to provide the semantic link between sources (what) and targets (where) of knowledge transfer. We provide a rigorous experimental evaluation of different knowledge bases and state-of-the-art techniques from Natural Language Processing which goes far beyond the limited use of language in related work. We also give insights into the applicability (why) of different knowledge sources and similarity measures for knowledge transfer.
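
The central step described above, deciding automatically which known classes to transfer knowledge from, amounts to scoring semantic relatedness between class names. As a rough illustration (not the lecture's own implementation), the sketch below ranks candidate source classes for an unseen target class using WordNet path similarity via NLTK; the class names, the choice of path similarity, and the top-k cutoff are illustrative assumptions, whereas the lecture evaluates several knowledge bases and relatedness measures.

# A minimal sketch, assuming NLTK with the WordNet corpus installed
# (nltk.download('wordnet')); it is not the lecture's implementation.
# It ranks known (source) classes by semantic relatedness to an unseen
# (target) class -- the "where to transfer from" decision the lecture
# automates. Class names, the path-similarity measure, and k are
# illustrative assumptions.
from nltk.corpus import wordnet as wn

def relatedness(class_a: str, class_b: str) -> float:
    """Best WordNet path similarity over all noun-synset pairs of the two names."""
    scores = [
        (s1.path_similarity(s2) or 0.0)
        for s1 in wn.synsets(class_a, pos=wn.NOUN)
        for s2 in wn.synsets(class_b, pos=wn.NOUN)
    ]
    return max(scores, default=0.0)

def rank_sources(target: str, known_classes: list[str], k: int = 3) -> list[str]:
    """Return the k known classes most semantically related to the target class."""
    return sorted(known_classes, key=lambda c: relatedness(target, c), reverse=True)[:k]

if __name__ == "__main__":
    known = ["horse", "tiger", "dolphin", "bicycle", "car"]
    # Semantically close classes (e.g. "horse") should rank ahead of unrelated ones.
    print(rank_sources("zebra", known))

In the lecture's setting, relatedness scores of this kind stand in for the manual supervision or extra training examples that earlier transfer approaches required.
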
Keywords: experimental evaluation; linguistic knowledge bases; knowledge transfer; similarity measures
Source: VideoLectures.NET
Data collected: 2021-05-28: zyk
Last reviewed: 2021-05-28: zyk
Views: 29