

Effective Multi-Label Active Learning for Text Classification
Course URL: http://videolectures.net/kdd09_yang_emlaltc/
Lecturer: Bishan Yang
Institution: Peking University
Date: 2009-09-14
Language: English
Course description: Labeling text data is quite time-consuming but essential for automatic text classification. In particular, manually creating multiple labels for each document may become impractical when a very large amount of data is needed for training multi-label text classifiers. To minimize the human labeling effort, we propose a novel multi-label active learning approach that can reduce the amount of labeled data required without sacrificing classification accuracy. Traditional active learning algorithms can only handle single-label problems, that is, each data point is restricted to a single label. Our approach takes the multi-label information into account and selects the unlabeled data that lead to the largest reduction of the expected model loss. Specifically, the model loss is approximated by the size of the version space, and the reduction rate of the version space is optimized with Support Vector Machines (SVM). An effective label prediction method is designed to predict the possible labels of each unlabeled data point, and the expected loss for multi-label data is approximated by summing the losses over all labels according to the most confident label prediction. Experiments on several publicly available real-world data sets demonstrate that our approach can obtain promising classification results with far less labeled data than state-of-the-art methods.
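
The selection criterion outlined above can be illustrated with a short sketch. The snippet below is a simplified, assumption-laden illustration rather than the lecture's actual algorithm: it trains one linear SVM per label, uses the margin quantity max(0, 1 - |f_j(x)|) as a rough proxy for how much labeling a document would shrink the version space of label j, and sums this loss over the labels the current classifiers most confidently predict. The function name select_batch, the batch setting, and the scikit-learn interface are assumptions made for illustration.

# Simplified sketch only (assumed interfaces); assumes each label column of
# Y_labeled contains both positive and negative examples.
import numpy as np
from sklearn.svm import LinearSVC

def select_batch(X_labeled, Y_labeled, X_unlabeled, batch_size=10):
    """Return indices of unlabeled documents expected to reduce model loss most."""
    n_labels = Y_labeled.shape[1]
    # One binary SVM per label (one-vs-rest), trained on the current labeled pool.
    models = [LinearSVC().fit(X_labeled, Y_labeled[:, j]) for j in range(n_labels)]

    # SVM decision values for every unlabeled document and every label.
    scores = np.column_stack([m.decision_function(X_unlabeled) for m in models])

    # Margin-based proxy for version-space reduction: documents near or inside
    # the margin (|f_j(x)| < 1) would constrain the classifier the most.
    losses = np.maximum(0.0, 1.0 - np.abs(scores))

    # "Most confident" multi-label prediction: treat the current positive
    # predictions as the document's labels; if none is positive, fall back
    # to the single highest-scoring label.
    predicted = scores > 0
    for i in np.flatnonzero(~predicted.any(axis=1)):
        predicted[i, np.argmax(scores[i])] = True

    # Expected loss of each document = sum of per-label losses over its
    # predicted labels; query the documents with the largest expected loss.
    expected_loss = (losses * predicted).sum(axis=1)
    return np.argsort(-expected_loss)[:batch_size]

In a full active-learning loop, this selection step would alternate with querying an oracle for the chosen documents' complete label sets and retraining the per-label SVMs on the enlarged labeled pool.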
Keywords: text data; active learning; classification accuracy; Support Vector Machines (SVM)
Course source: 视频讲座网 (VideoLectures)
Last reviewed: 2021-01-29 (nkq)
Views: 121