Beyond Seq2Seq with Augmented RNNs
Course URL: | http://videolectures.net/deeplearning2016_grefenstette_augmented_...
Lecturer: | Edward Grefenstette
Institution: | Google
Date: | 2016-08-23
Language: | English
Course description: | Sequence to sequence models in their most basic form, following an encoder-decoder paradigm, compressively encode source sequence representations into a single vector representation and decode this representation into a target sequence. This lecture will discuss the problems with this compressive approach, some solutions involving attention and external differentiable memory, and issues faced by these extensions. Motivating examples from the field of natural language understanding will be provided throughout.
Keywords: | sequence-to-sequence models; vector representations; natural language understanding
Course source: | VideoLectures.NET
Data collected: | 2022-11-04: chenjy
Last reviewed: | 2023-05-13: chenjy
Views: | 28
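
As a companion to the description above, here is a minimal NumPy sketch of dot-product attention over encoder states, the kind of mechanism the lecture contrasts with compressing the whole source sequence into a single vector. It is not taken from the lecture; the function names, shapes, and toy data are illustrative assumptions.

# Minimal sketch (illustrative, not from the lecture): dot-product attention
# over encoder states, instead of relying on one fixed-size summary vector.
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(encoder_states, decoder_state):
    """Return a context vector as an attention-weighted sum of encoder states.

    encoder_states: (T, d) array, one row per source position.
    decoder_state:  (d,) array, the current decoder hidden state.
    """
    scores = encoder_states @ decoder_state   # (T,) similarity of each source position
    weights = softmax(scores)                 # attention distribution over the source
    return weights @ encoder_states           # (d,) context vector for this decoding step

# Toy usage: 5 source positions, hidden size 4.
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 4))   # stand-in for RNN encoder states
s = rng.standard_normal(4)        # stand-in for the current decoder state
context = attend(H, s)
print(context.shape)              # (4,)

Because attention re-reads all encoder states at every decoding step, the model no longer has to squeeze the entire source into one fixed-size vector; the external differentiable memories discussed in the lecture generalize this idea to reads (and writes) against a separate memory structure.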