


Multi-Stage Multi-Task Feature Learning
Course URL: http://videolectures.net/machine_zhang_learning/
Lecturer: Changshui Zhang
Institution: Tsinghua University
Date: 2013-01-15
Language: English
Course description: Multi-task sparse feature learning aims to improve the generalization performance by exploiting the shared features among tasks. It has been successfully applied to many applications including computer vision and biomedical informatics. Most of the existing multi-task sparse feature learning algorithms are formulated as a convex sparse regularization problem, which is usually suboptimal due to its looseness in approximating an ℓ0-type regularizer. In this paper, we propose a non-convex formulation for multi-task sparse feature learning based on a novel regularizer. To solve the non-convex optimization problem, we propose a Multi-Stage Multi-Task Feature Learning (MSMTFL) algorithm. Moreover, we present a detailed theoretical analysis showing that MSMTFL achieves a better parameter estimation error bound than the convex formulation. Empirical studies on both synthetic and real-world data sets demonstrate the effectiveness of MSMTFL in comparison with state-of-the-art multi-task sparse feature learning algorithms.
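The description above only outlines the multi-stage scheme at a high level. As a rough illustration, the sketch below assumes the capped-ℓ1 row regularizer sum_j min(||W[j, :]||_1, theta) associated with MSMTFL and a per-task least-squares loss: each stage re-solves a weighted ℓ1 problem whose per-feature weights are derived from the previous stage's solution (rows whose ℓ1 norm already exceeds theta are no longer penalized). The proximal-gradient solver and all parameter names (lam, theta, n_stages) are illustrative assumptions, not the authors' reference implementation.

# Minimal sketch of a multi-stage reweighted scheme in the spirit of MSMTFL.
# Assumptions: squared loss per task, capped-l1,l1 regularizer, proximal
# gradient as the inner solver. Not the paper's reference implementation.
import numpy as np

def prox_weighted_l1(W, weights, step):
    """Soft-thresholding: prox of sum_j weights[j] * ||W[j, :]||_1."""
    thresh = step * weights[:, None]
    return np.sign(W) * np.maximum(np.abs(W) - thresh, 0.0)

def weighted_l1_mtl(Xs, ys, weights, n_iter=500):
    """min_W sum_t ||X_t w_t - y_t||^2 / (2 m_t) + sum_j weights[j] * ||W[j, :]||_1,
    solved by proximal gradient descent (one column of W per task)."""
    d, T = Xs[0].shape[1], len(Xs)
    W = np.zeros((d, T))
    # Conservative step size from the largest per-task Lipschitz constant.
    step = 1.0 / max(np.linalg.norm(X, 2) ** 2 / X.shape[0] for X in Xs)
    for _ in range(n_iter):
        grad = np.column_stack([
            X.T @ (X @ W[:, t] - y) / X.shape[0]
            for t, (X, y) in enumerate(zip(Xs, ys))
        ])
        W = prox_weighted_l1(W - step * grad, weights, step)
    return W

def msmtfl(Xs, ys, lam=0.1, theta=1.0, n_stages=5):
    """Multi-stage loop: stage 1 is a plain l1 problem; later stages drop the
    penalty on rows whose l1 norm already exceeds theta, so strongly shared
    features are no longer shrunk."""
    d = Xs[0].shape[1]
    weights = np.full(d, lam)
    W = None
    for _ in range(n_stages):
        W = weighted_l1_mtl(Xs, ys, weights)
        row_norms = np.abs(W).sum(axis=1)
        weights = lam * (row_norms < theta)  # reweighting rule between stages
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, T, m = 50, 4, 100
    W_true = np.zeros((d, T))
    W_true[:5, :] = rng.normal(size=(5, T))      # 5 features shared across tasks
    Xs = [rng.normal(size=(m, d)) for _ in range(T)]
    ys = [X @ W_true[:, t] + 0.01 * rng.normal(size=m) for t, X in enumerate(Xs)]
    W_hat = msmtfl(Xs, ys)
    print("recovered nonzero rows:", np.flatnonzero(np.abs(W_hat).sum(axis=1) > 1e-3))

The intent of the multi-stage reweighting is to reduce the bias that a single convex ℓ1-type pass introduces on large shared coefficients, which is the looseness the abstract attributes to convex formulations.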
Keywords: multi-task sparsity; shared features; generalization performance
Source: 视频讲座网 (VideoLectures.NET)
Last reviewed: 2021-01-29:nkq
Views: 103