Why Does Unsupervised Pre-training Help Deep Discriminant Learning?
Course URL: http://videolectures.net/nipsworkshops09_erhan_wduphddl/
Lecturer: Dumitru Erhan
Institution: Université de Montréal
Date: 2010-03-26
Language: English
Course description: Recent research has been devoted to learning algorithms for deep architectures such as Deep Belief Networks and stacks of auto-encoder variants, with impressive results obtained in several areas. The best results obtained on supervised learning tasks involve an unsupervised learning component, usually in an unsupervised pre-training phase, with a generative model. Even though these new algorithms have enabled training deep models fine-tuned with a discriminant criterion, many questions remain as to the nature of this difficult learning problem. The main question investigated here is the following: why does unsupervised pre-training work, and why does it work so well? Answering these questions is important if learning in deep architectures is to be further improved. We propose several explanatory hypotheses and test them through extensive simulations. We empirically show the influence of unsupervised pre-training with respect to architecture depth, model capacity, and number of training examples. The experiments confirm and clarify the advantage of unsupervised pre-training. The results suggest that unsupervised pre-training guides the learning towards basins of attraction of minima that are better in terms of the underlying data distribution; the evidence from these results supports an unusual regularization explanation for the effect of pre-training.
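
To make the procedure under study concrete, here is a minimal sketch of the two-phase pipeline the abstract describes: greedy layer-wise unsupervised pre-training of a stack of auto-encoders, followed by supervised fine-tuning of the whole network with a discriminant (cross-entropy) criterion. This is an illustrative reconstruction, not the authors' experimental code; PyTorch, the synthetic data, and all layer sizes and hyperparameters are assumptions.

```python
# Sketch (assumptions: PyTorch; synthetic data; illustrative sizes/hyperparameters)
# of layer-wise unsupervised pre-training followed by discriminant fine-tuning.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic stand-in data: 256 examples, 64 input features, 10 classes.
X = torch.rand(256, 64)
y = torch.randint(0, 10, (256,))

layer_sizes = [64, 32, 16]  # stack depth is illustrative
encoders = [nn.Linear(i, o) for i, o in zip(layer_sizes[:-1], layer_sizes[1:])]

def pretrain(encoder, data, epochs=10, lr=0.1):
    """Unsupervised phase: train one auto-encoder layer to reconstruct its input."""
    decoder = nn.Linear(encoder.out_features, encoder.in_features)
    opt = torch.optim.SGD(list(encoder.parameters()) + list(decoder.parameters()), lr=lr)
    for _ in range(epochs):
        recon = decoder(torch.sigmoid(encoder(data)))
        loss = nn.functional.mse_loss(recon, data)
        opt.zero_grad()
        loss.backward()
        opt.step()
    # The learned codes become the training input for the next layer.
    return torch.sigmoid(encoder(data)).detach()

# Greedy layer-wise pre-training: each layer learns on the previous layer's codes.
h = X
for enc in encoders:
    h = pretrain(enc, h)

# Supervised phase: stack the pre-trained encoders, add a classifier head, and
# fine-tune the whole network with a discriminant (cross-entropy) criterion.
model = nn.Sequential(
    *[nn.Sequential(enc, nn.Sigmoid()) for enc in encoders],
    nn.Linear(layer_sizes[-1], 10),
)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
for _ in range(20):
    loss = nn.functional.cross_entropy(model(X), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In the lecture's terms, the unsupervised phase determines where in parameter space the discriminant fine-tuning starts, i.e. which basin of attraction gradient descent falls into, which is how pre-training can act as a regularizer.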
Keywords: unsupervised learning; learning algorithms; unsupervised pre-training
Source: VideoLectures.NET
Last reviewed: 2020-06-27 (zyk)
Views: 55