Can Learning Kernels Help Performance?
Course URL: http://videolectures.net/icml09_cortes_clkh/
Lecturer: Corinna Cortes
Affiliation: Google
Date: 2009-08-26
Lecture language: English
Abstract: Kernel methods combined with large-margin learning algorithms such as SVMs have been used successfully to tackle a variety of learning tasks since their introduction in the early 1990s. However, in the standard framework of these methods, the choice of an appropriate kernel is left to the user, and a poor selection may lead to sub-optimal performance. Instead, sample points can be used to select a kernel function suitable for the task out of a family of kernels fixed by the user. While this is an appealing idea supported by some recent theoretical guarantees, in experiments it has proven surprisingly difficult to consistently and significantly outperform simple fixed combination schemes of kernels. This talk will survey different methods and algorithms for learning kernels and will present novel results that tend to suggest that significant performance improvements can be obtained with a large number of kernels. (Includes joint work with Mehryar Mohri and Afshin Rostamizadeh.)
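The "simple fixed combination schemes" that the abstract says are hard to beat can be illustrated with a short sketch (not taken from the lecture): average several user-fixed base RBF kernels with uniform weights, then train a standard kernel method on the combined Gram matrix. The dataset, kernel widths, and the use of kernel ridge regression below are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """Gram matrix of the Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def uniform_combination(X, Y, gammas):
    """Fixed, uniform-weight combination of base kernels: the simple
    baseline that learned-kernel methods must outperform."""
    return sum(rbf_kernel(X, Y, g) for g in gammas) / len(gammas)

def kernel_ridge_fit(K, y, lam=1e-2):
    """Solve (K + lam * I) alpha = y for the dual coefficients."""
    return np.linalg.solve(K + lam * np.eye(len(K)), y)

# Toy regression problem: recover sin(x) from noiseless samples.
rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, size=(60, 1))
y_train = np.sin(X_train[:, 0])

gammas = [0.1, 1.0, 10.0]  # the kernel family fixed by the user
K = uniform_combination(X_train, X_train, gammas)
alpha = kernel_ridge_fit(K, y_train)

X_test = np.linspace(-3, 3, 50)[:, None]
y_pred = uniform_combination(X_test, X_train, gammas) @ alpha
mse = np.mean((y_pred - np.sin(X_test[:, 0])) ** 2)
```

A learned-kernel method would replace the uniform weights in `uniform_combination` with weights chosen from the sample (e.g., by optimizing alignment with the labels); the lecture's point is that this baseline is already surprisingly strong.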
Keywords: kernel selection; sample points; kernels
Source: VideoLectures.NET
Last reviewed: 2020-06-08 by 吴雨秋 (volunteer course editor)
Views: 36