


Learning Incoherent Sparse and Low-Rank Patterns from Multiple Tasks
Course URL: http://videolectures.net/kdd2010_chen_lislrpmt/
Lecturer: Jianhui Chen
Institution: Arizona State University
Date: 2010-10-01
Language: English
Course description: We consider the problem of learning incoherent sparse and low-rank patterns from multiple tasks. Our approach is based on a linear multi-task learning formulation, in which the sparse and low-rank patterns are induced by a cardinality regularization term and a low-rank constraint, respectively. This formulation is non-convex; we convert it into its convex surrogate, which can be routinely solved via semidefinite programming for small-size problems. We propose to employ the general projected gradient scheme to efficiently solve such a convex surrogate; however, in the optimization formulation, the objective function is non-differentiable and the feasible domain is non-trivial. We present procedures for computing the projected gradient and for ensuring the global convergence of the projected gradient scheme. Computing the projected gradient involves a constrained optimization problem; we show that the optimal solution to this problem can be obtained by solving an unconstrained optimization subproblem and a Euclidean projection subproblem. In addition, we present two projected gradient algorithms and discuss their rates of convergence. Experimental results on benchmark data sets demonstrate the effectiveness of the proposed multi-task learning formulation and the efficiency of the proposed projected gradient algorithms.
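The abstract states that each projected-gradient step reduces to an unconstrained subproblem plus a Euclidean projection subproblem. A minimal sketch of what such a step could look like is below; this is not the authors' code, and the function names, the ℓ1 norm as the cardinality surrogate, and the trace-norm ball as the low-rank surrogate are assumptions based on the standard convex relaxations the abstract alludes to.

```python
import numpy as np

def soft_threshold(X, lam):
    """Closed-form solution of the unconstrained subproblem for an
    l1-regularized term: the proximal operator of lam * ||X||_1."""
    return np.sign(X) * np.maximum(np.abs(X) - lam, 0.0)

def project_l1_ball(v, tau):
    """Euclidean projection of a nonnegative vector v onto
    {x : x >= 0, sum(x) <= tau} via the standard sort-based method."""
    if v.sum() <= tau:
        return v
    u = np.sort(v)[::-1]                      # sort descending
    css = np.cumsum(u)
    # largest index rho with u_rho - (css_rho - tau)/(rho+1) > 0
    rho = np.nonzero(u * np.arange(1, len(u) + 1) > (css - tau))[0][-1]
    theta = (css[rho] - tau) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def project_trace_norm_ball(Q, tau):
    """Euclidean projection onto {Q : ||Q||_* <= tau}: project the
    singular values of Q onto the l1 ball of radius tau."""
    U, s, Vt = np.linalg.svd(Q, full_matrices=False)
    return U @ np.diag(project_l1_ball(s, tau)) @ Vt
```

Under this reading, one projected-gradient iteration would take a gradient step on the smooth loss, apply `soft_threshold` to the sparse component, and apply `project_trace_norm_ball` to keep the low-rank component feasible.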
Keywords: multi-task learning formulation; projected gradient algorithms; optimization formulation
Source: VideoLectures.NET
Last reviewed: 2020-06-03 by Zhang Ying (volunteer course editor)
Views: 37