On the Consistency of Multi-Label Learning
Course URL: http://videolectures.net/colt2011_gao_consistency/
Lecturer: Wei Gao
Institution: Nanjing University
Date: 2011-08-02
Language: English
Abstract: Multi-label learning has attracted much attention during the past few years. Many multi-label learning approaches have been developed, mostly working with surrogate loss functions, since multi-label loss functions are usually difficult to optimize directly owing to non-convexity and discontinuity. Though these approaches are effective, to the best of our knowledge, there is no theoretical result on the convergence of the risk of the learned functions to the Bayes risk. In this paper, focusing on two well-known multi-label loss functions, namely ranking loss and Hamming loss, we prove a necessary and sufficient condition for the consistency of multi-label learning based on surrogate loss functions. Our results disclose that, surprisingly, no convex surrogate loss is consistent with the ranking loss. Inspired by this finding, we introduce the partial ranking loss, with which some surrogate functions are consistent. For Hamming loss, we show that some recent multi-label learning approaches are inconsistent even for deterministic multi-label classification, and give a surrogate loss function which is consistent for the deterministic case. Finally, we discuss the consistency of learning approaches which address multi-label learning by decomposing it into a set of binary classification problems.
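The abstract centers on two multi-label loss functions: Hamming loss (the fraction of label positions predicted incorrectly) and ranking loss (the fraction of relevant/irrelevant label pairs ordered wrongly by a real-valued score, with ties counted as half an error). As a hedged sketch using the standard textbook definitions (not the paper's own notation), the two losses can be computed on a toy example like this:

```python
def hamming_loss(y_true, y_pred):
    """Fraction of label positions where binary prediction and truth disagree."""
    n, m = len(y_true), len(y_true[0])
    return sum(t != p
               for row_t, row_p in zip(y_true, y_pred)
               for t, p in zip(row_t, row_p)) / (n * m)

def ranking_loss(y_true, scores):
    """Average fraction of (relevant, irrelevant) label pairs that the
    real-valued scores order incorrectly; ties count as half an error."""
    total = 0.0
    for truth, s in zip(y_true, scores):
        pos = [j for j, t in enumerate(truth) if t == 1]
        neg = [j for j, t in enumerate(truth) if t == 0]
        if not pos or not neg:
            continue  # loss undefined when all labels are relevant/irrelevant
        errors = sum(1.0 if s[p] < s[q] else 0.5 if s[p] == s[q] else 0.0
                     for p in pos for q in neg)
        total += errors / (len(pos) * len(neg))
    return total / len(y_true)

# Toy data: 2 instances, 3 labels each (values here are illustrative only).
y_true = [[1, 0, 1], [0, 1, 0]]
y_pred = [[1, 1, 1], [0, 0, 0]]              # binary predictions -> Hamming loss
scores = [[0.9, 0.8, 0.2], [0.1, 0.7, 0.3]]  # real-valued scores -> ranking loss
print(hamming_loss(y_true, y_pred))  # 2 mismatches over 6 labels = 0.333...
print(ranking_loss(y_true, scores))  # one mis-ordered pair in instance 1 = 0.25
```

The paper's point is that learning proceeds by minimizing a convex surrogate of these losses; its main result is that for ranking loss no such convex surrogate is consistent.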
Keywords: multi-label learning; surrogate loss functions; Bayes risk
Source: VideoLectures.NET
Last reviewed: 2019-03-05 (lxf)
Views: 123