Multitask Learning Using Nonparametrically Learned Predictor Subspaces |
|
Course URL: | http://videolectures.net/nipsworkshops09_rai_mlun/ |
Lecturer: | Piyush Rai |
Institution: | University of Utah |
Date: | 2010-01-19 |
Language: | English |
Abstract: | Given several related learning tasks, we propose a nonparametric Bayesian learning model that captures task relatedness by assuming that the task parameters (i.e., weight vectors) share a latent subspace. More specifically, the intrinsic dimensionality of this subspace is not assumed to be known a priori. We use an infinite latent feature model - the Indian Buffet Process - to automatically infer this number. We also propose extensions of this model in which the subspace learning can incorporate examples (labeled, and additionally unlabeled if available), or in which the task parameters share a mixture of subspaces instead of a single subspace. The latter property can allow learning the nonlinear manifold structure underlying the task parameters, and can also help prevent negative transfer from outlier tasks. |
Keywords: | nonparametric Bayes; weight vectors; nonlinear manifold structure |
Source: | VideoLectures.NET |
Last reviewed: | 2019-09-07:lxf |
Views: | 89 |