Dropout: A simple and effective way to improve neural networks
Course URL: http://videolectures.net/nips2012_hinton_networks/
Lecturer: Geoffrey E. Hinton
Institution: University of Toronto
Date: 2013-06-16
Language: English
Course description: In a large feedforward neural network, overfitting can be greatly reduced by randomly omitting half of the hidden units on each training case. This prevents complex co-adaptations in which a feature detector is only helpful in the context of several other specific feature detectors. Instead, each neuron learns to detect a feature that is generally helpful for producing the correct answer given the combinatorially large variety of internal contexts in which it must operate. Random "dropout" gives big improvements on many benchmark tasks and sets new records for object recognition and molecular activity prediction. The Merck Molecular Activity Challenge was a contest hosted by Kaggle and sponsored by the pharmaceutical company Merck. The goal of the contest was to predict whether molecules were highly active towards a given target molecule. The competition data included a large number of numerical descriptors generated from the chemical structures of the input molecules and activity data for fifteen different biologically relevant targets. An accurate model has numerous applications in the drug discovery process. George will discuss his team's first-place solution based on neural networks trained with dropout.
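The mechanism the description outlines, randomly omitting each hidden unit with probability one half on every training case, can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the lecture: it uses the common "inverted dropout" variant, which scales the surviving activations by 1/(1 - p) at training time so that no rescaling is needed at test time (the lecture's formulation instead halves the outgoing weights when testing); the function name and parameters are illustrative.

```python
import numpy as np

def dropout_forward(activations, drop_prob=0.5, rng=None):
    """Zero each hidden unit independently with probability drop_prob.

    Survivors are scaled by 1/(1 - drop_prob) ("inverted dropout") so
    the expected activation matches the unmodified test-time forward pass.
    """
    rng = np.random.default_rng(rng)
    mask = rng.random(activations.shape) >= drop_prob  # True = unit kept
    return activations * mask / (1.0 - drop_prob)

# Training time: apply a fresh random mask on every training case.
h = np.ones((4, 8))                       # a batch of hidden activations
h_train = dropout_forward(h, drop_prob=0.5, rng=0)

# Test time: use the activations unchanged (no mask, no scaling needed).
h_test = h
```

Because each unit must produce something useful regardless of which other units happen to survive the mask, the network is discouraged from the fragile co-adaptations the description mentions.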
Keywords: neural networks; improvement methods; dropout
Source: VideoLectures.NET
Data collected: 2020-11-27:yxd
Last reviewed: 2020-11-27:yxd
Views: 52