
A Gaussian Process View on MKL
Course URL: http://videolectures.net/nipsworkshops2010_urtasun_gpv/
Lecturer: Raquel Urtasun
Institution: University of Toronto
Date: 2011-01-12
Language: English
Course description: Gaussian processes (GPs) provide an appealing probabilistic framework for multiple kernel learning (MKL). For more than a decade, it has been common practice to learn the well-known sum of kernels by, for example, maximum likelihood estimation. In this talk, I'll first introduce the sum-of-kernels Gaussian process formulation. I'll then show how to go beyond convex formulations by learning the GP covariance. In particular, I'll first introduce parametric forms of the covariance, drawing connections to co-training. I'll then show how to learn non-parametric covariances via latent spaces. If time permits, I'll talk about multi-task learning as well as multi-output Gaussian processes, and show connections to metric learning. I'll demonstrate the performance of some of these approaches on computer vision tasks such as object recognition.
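The sum-of-kernels GP formulation mentioned in the abstract can be sketched in a few lines: the MKL covariance is a weighted sum of base kernels, and the weights are chosen by maximizing the GP log marginal likelihood. The minimal NumPy sketch below is illustrative only (RBF base kernels, fixed noise level, and the helper names are my own choices, not from the talk):

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale):
    # Squared-exponential (RBF) base kernel matrix between two input sets.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def sum_of_kernels(X, weights, lengthscales):
    # MKL covariance: K = sum_i w_i K_i, used as the GP prior covariance.
    return sum(w * rbf_kernel(X, X, ls)
               for w, ls in zip(weights, lengthscales))

def gp_log_marginal_likelihood(K, y, noise=1e-2):
    # log N(y | 0, K + noise*I); in the MKL view, the kernel weights
    # are learned by maximizing this quantity.
    n = len(y)
    L = np.linalg.cholesky(K + noise * np.eye(n))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * n * np.log(2 * np.pi))
```

For example, evaluating `gp_log_marginal_likelihood` for several candidate weight vectors and keeping the best one is the simplest (grid-search) stand-in for the gradient-based maximum likelihood estimation the talk refers to.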
Keywords: Gaussian processes; multiple kernel learning; probabilistic framework
Source: VideoLectures.NET
Last reviewed: 2019-09-07: lxf
Views: 58