Learning multi-scale temporal dynamics with recurrent neural networks |
|
Course URL: | http://videolectures.net/netadis2015_taylor_neural_networks/ |
Lecturer: | Graham Taylor |
Institution: | University of Guelph |
Date: | 2016-03-07 |
Language: | English |
Abstract: | The last three years have seen an explosion of activity studying recurrent neural networks (RNNs), a generalization of feedforward neural networks that can map sequences to sequences. Training RNNs using backpropagation through time can be difficult, and until recently was thought to be hopeless due to the vanishing and exploding gradients that arise in training. Recent advances in optimization methods and architectures have led to impressive results in modeling speech, handwriting and language. Applications to other areas are emerging. In this talk, I will review some recent progress on RNNs and discuss our work on extending and improving the Clockwork RNN (Koutník et al.), a simple yet powerful model that partitions its hidden units to model specific temporal scales. Our "Dense clockworks" are a shift-invariant form of the architecture which we show to be more efficient and effective than their predecessor. I will also describe a recent collaboration with Google in which we apply Dense clockworks to authenticating mobile phone users based on the movement of the device as captured by the accelerometer and gyroscope. |
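The abstract's core idea, partitioning hidden units so each module runs at its own temporal scale, can be sketched in a few lines. This is a minimal illustrative sketch of the Clockwork RNN update (Koutník et al., 2014), not the speaker's implementation, and it omits the block-triangular recurrent connectivity of the original model; all names and shapes here are assumptions.

```python
import numpy as np

def clockwork_rnn_step(h, x, W_h, W_x, periods, t):
    """One Clockwork RNN update (simplified sketch).

    The hidden state h is split into len(periods) equal modules.
    Module i is updated only at timesteps where t % periods[i] == 0,
    so fast modules (small period) track fine detail while slow
    modules (large period) retain long-range context between updates.
    """
    n = len(h)
    k = n // len(periods)  # units per module (assumes an even split)
    h_candidate = np.tanh(W_h @ h + W_x @ x)
    active = np.zeros(n, dtype=bool)
    for i, period in enumerate(periods):
        if t % period == 0:
            active[i * k:(i + 1) * k] = True
    # Inactive modules simply carry their previous state forward.
    return np.where(active, h_candidate, h)
```

At an odd timestep, a module with period 2 is skipped entirely, which is what lets slow modules act as a memory over longer horizons.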
Keywords: | recurrent neural networks; RNN; Dense clockworks |
Source: | VideoLectures.NET |
Data collected: | 2021-07-07:liyy |
Last reviewed: | 2021-07-09:liyy |
Views: | 56 |