
What Helps Where - And Why? Semantic Relatedness for Knowledge Transfer
Course URL: http://videolectures.net/cvpr2010_rohrbach_whw/
Lecturer: Marcus Rohrbach
Institution: University of California, Berkeley
Date: 2010-07-19
Language: English
Course description: Remarkable performance has been reported to recognize single object classes. Scalability to large numbers of classes, however, remains an important challenge for today's recognition methods. Several authors have promoted knowledge transfer between classes as a key ingredient to address this challenge. However, in previous work the decision which knowledge to transfer has required either manual supervision or at least a few training examples, limiting the scalability of these approaches. In this work we explicitly address the question of how to automatically decide which information to transfer between classes without the need of any human intervention. For this we tap into linguistic knowledge bases to provide the semantic link between sources (what) and targets (where) of knowledge transfer. We provide a rigorous experimental evaluation of different knowledge bases and state-of-the-art techniques from Natural Language Processing which goes far beyond the limited use of language in related work. We also give insights into the applicability (why) of different knowledge sources and similarity measures for knowledge transfer.
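The kind of transfer the abstract describes can be sketched, very loosely, as a relatedness-weighted vote: attribute predictions for an unseen target class are borrowed from known source classes, weighted by a semantic relatedness score between the class names. The minimal sketch below uses a hand-filled toy relatedness table and made-up attribute scores; all class names, numbers, and the weighting scheme are illustrative assumptions, not the lecture's actual models or data.

```python
# Minimal sketch of relatedness-weighted attribute transfer.
# The relatedness values stand in for scores mined from a linguistic
# knowledge base (e.g., WordNet); all numbers here are illustrative.
relatedness = {
    ("zebra", "horse"): 0.9,
    ("zebra", "tiger"): 0.6,
    ("zebra", "whale"): 0.1,
}

# Attribute-classifier scores for known (source) classes, also toy values.
source_attributes = {
    "horse": {"striped": 0.1, "hooves": 0.9},
    "tiger": {"striped": 0.9, "hooves": 0.1},
    "whale": {"striped": 0.0, "hooves": 0.0},
}

def transfer_attributes(target, sources, relatedness, source_attributes):
    """Predict attribute scores for an unseen target class as a
    relatedness-weighted average over known source classes."""
    weights = {s: relatedness.get((target, s), 0.0) for s in sources}
    total = sum(weights.values()) or 1.0  # avoid division by zero
    predicted = {}
    for s, w in weights.items():
        for attr, score in source_attributes[s].items():
            predicted[attr] = predicted.get(attr, 0.0) + w * score / total
    return predicted

pred = transfer_attributes("zebra", ["horse", "tiger", "whale"],
                           relatedness, source_attributes)
print(pred)  # "striped" dominated by tiger, "hooves" by horse
```

The point of the weighting is that no human labels which source class to borrow from: the relatedness scores, computed automatically from a language resource, make that decision.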
Keywords: scalability; natural language processing; knowledge transfer
Source: VideoLectures.NET
Last reviewed: 2021-01-28 (nkq)
Views: 14