

A Unified Architecture for Natural Language Processing: Deep Neural Networks with Multitask Learning
Course URL: http://videolectures.net/icml08_collobert_uanl/
Lecturer: Ronan Collobert
Institution: NEC Laboratories America
Date: 2008-07-30
Language: English
Course description: We describe a single convolutional neural network architecture that, given a sentence, outputs a host of language processing predictions: part-of-speech tags, chunks, named entity tags, semantic roles, semantically similar words and the likelihood that the sentence makes sense (grammatically and semantically) using a language model. The entire network is trained jointly on all these tasks using weight-sharing, an instance of multitask learning. All the tasks use labeled data except the language model, which is learnt from unlabeled text and represents a novel way of performing semi-supervised learning for the shared tasks. We show how both multitask learning and semi-supervised learning improve the generalization of the shared tasks, resulting in a learnt model with state-of-the-art performance.
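The weight-sharing scheme described above can be sketched in code: a single embedding-plus-convolution encoder whose parameters are shared by every task, a small task-specific output layer for each labeled task (part-of-speech, chunking, named entities, semantic roles), and a language-model head that can be trained on unlabeled text. The sketch below is a minimal PyTorch illustration; the layer sizes, label counts, and class and function names are assumptions for illustration, not the configuration reported in the lecture or the underlying paper.

```python
import torch
import torch.nn as nn


class SharedEncoder(nn.Module):
    """Shared layers: word embeddings followed by a 1-D convolution over the sentence."""

    def __init__(self, vocab_size: int, embed_dim: int = 50, hidden_dim: int = 200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Convolution over a 5-word window, producing one feature vector per position.
        self.conv = nn.Conv1d(embed_dim, hidden_dim, kernel_size=5, padding=2)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len) word indices -> (batch, seq_len, hidden_dim) features
        x = self.embed(tokens).transpose(1, 2)           # (batch, embed_dim, seq_len)
        return torch.relu(self.conv(x)).transpose(1, 2)


class MultitaskTagger(nn.Module):
    """One shared encoder, a linear head per labeled task, plus a language-model scorer."""

    def __init__(self, vocab_size: int, task_label_counts: dict):
        super().__init__()
        self.encoder = SharedEncoder(vocab_size)
        self.heads = nn.ModuleDict(
            {task: nn.Linear(200, n) for task, n in task_label_counts.items()}
        )
        self.lm_head = nn.Linear(200, 1)  # scores how plausible each position's context is

    def forward(self, tokens: torch.Tensor, task: str) -> torch.Tensor:
        features = self.encoder(tokens)                  # same shared weights for every task
        if task == "lm":
            return self.lm_head(features).squeeze(-1)    # (batch, seq_len) plausibility scores
        return self.heads[task](features)                # (batch, seq_len, n_labels)


# Joint training would alternate minibatches from the different tasks, so gradients
# from POS/chunk/NER/SRL and from the language model all update the shared encoder.
model = MultitaskTagger(
    vocab_size=30000,
    task_label_counts={"pos": 45, "chunk": 23, "ner": 9, "srl": 67},  # illustrative sizes
)
tokens = torch.randint(0, 30000, (2, 12))   # two toy "sentences" of 12 word ids each
pos_logits = model(tokens, task="pos")      # (2, 12, 45)
lm_scores = model(tokens, task="lm")        # (2, 12)
```

Because only the output layers differ between tasks, a minibatch from any task updates the shared encoder, which is the multitask-learning instance the description refers to; the language-model head requires no labels, so it can be trained on raw text, which is the semi-supervised component.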
Keywords: convolutional neural network architecture; language model; multitask learning and semi-supervised learning
Source: VideoLectures.NET
Last reviewed: 2021-05-14 (yumf)