Calibrated Multi‑Task Learning
Course URL: http://videolectures.net/kdd2018_hu_calibrated_learning/
Lecturer: Zhanxuan Hu
Institution: Northwestern Polytechnical University
Date: 2018-11-23
Language: English
Description: This paper proposes a novel algorithm, named Non-Convex Calibrated Multi-Task Learning (NC-CMTL), for learning multiple related regression tasks jointly. Instead of utilizing the nuclear norm, NC-CMTL adopts a non-convex low-rank regularizer to explore the shared information among different tasks. In addition, considering that the regularization parameter for each regression task depends on its noise level, we replace the least-squares loss function with the square-root loss function. Computationally, as the proposed model has a non-smooth loss function and a non-convex regularization term, we construct an efficient re-weighted method to optimize it. Theoretically, we first present the convergence analysis of the constructed method, and then prove that the derived solution is a stationary point of the original problem. In particular, the regularizer and optimization method used in this paper are also suitable for other rank-minimization problems. Numerical experiments on both synthetic and real data illustrate the advantages of NC-CMTL over several state-of-the-art methods.
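Note: The description names the model's ingredients but gives no formula. For orientation, the implied objective is roughly \min_W \sum_{t=1}^{T} \|X_t w_t - y_t\|_2 + \lambda R(W), where the unsquared (square-root) L2 loss keeps the appropriate \lambda from scaling with each task's noise level, and R(W) is a non-convex low-rank surrogate used in place of the nuclear norm. The NumPy sketch below evaluates such an objective under stated assumptions: the function name, the Schatten-p quasi-norm stand-in for R, and the smoothing constant eps are illustrative choices, not the paper's exact formulation.

    import numpy as np

    def nc_cmtl_objective(W, Xs, ys, lam, p=0.5, eps=1e-8):
        """Illustrative sketch of a calibrated multi-task objective.

        Square-root (unsquared L2) loss per task plus a Schatten-p
        quasi-norm as a stand-in non-convex low-rank penalty; the
        paper's exact regularizer may differ.
        """
        # Calibrated data-fitting term: ||X_t w_t - y_t||_2 per task
        # (not squared), so the suitable lambda does not scale with
        # each task's noise level.
        loss = sum(np.linalg.norm(X @ w - y)
                   for X, y, w in zip(Xs, ys, W.T))
        # Non-convex low-rank surrogate: singular values of the
        # weight matrix raised to a power p < 1.
        sv = np.linalg.svd(W, compute_uv=False)
        penalty = np.sum((sv + eps) ** p)
        return loss + lam * penalty

    # Toy usage: 3 related regression tasks over 10 shared features.
    rng = np.random.default_rng(0)
    Xs = [rng.normal(size=(50, 10)) for _ in range(3)]
    W_true = rng.normal(size=(10, 3))
    ys = [X @ w for X, w in zip(Xs, W_true.T)]
    print(nc_cmtl_objective(W_true, Xs, ys, lam=0.1))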
Keywords: non-convex calibrated multi-task learning; joint learning of multiple related regression tasks; efficient re-weighted method
Source: VideoLectures.NET
Data collected: 2023-02-03 (cyh)
Last revised: 2023-02-03 (cyh)