

Dropout: A simple and effective way to improve neural networks
Course URL: http://videolectures.net/nips2012_hinton_networks/
Lecturer: Geoffrey E. Hinton
Institution: University of Toronto
Date: 2013-01-16
Language: English
Course description: In a large feedforward neural network, overfitting can be greatly reduced by randomly omitting half of the hidden units on each training case. This prevents complex co-adaptations in which a feature detector is only helpful in the context of several other specific feature detectors. Instead, each neuron learns to detect a feature that is generally helpful for producing the correct answer given the combinatorially large variety of internal contexts in which it must operate. Random “dropout” gives big improvements on many benchmark tasks and sets new records for object recognition and molecular activity prediction. The Merck Molecular Activity Challenge was a contest hosted by Kaggle and sponsored by the pharmaceutical company Merck. The goal of the contest was to predict whether molecules were highly active towards a given target molecule. The competition data included a large number of numerical descriptors generated from the chemical structures of the input molecules and activity data for fifteen different biologically relevant targets. An accurate model has numerous applications in the drug discovery process. George will discuss his team's first place solution based on neural networks trained with dropout.
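The core mechanism described above, zeroing a random half of the hidden activations on each training case, can be sketched in a few lines of NumPy. This is an illustrative sketch, not the lecture's original code; it uses the now-common "inverted dropout" scaling (dividing kept activations by the keep probability during training) rather than the talk's formulation of halving the weights at test time, and the function name `dropout_forward` is an assumption for this example.

```python
import numpy as np

def dropout_forward(h, p_drop=0.5, training=True, rng=None):
    """Randomly zero a fraction p_drop of hidden activations.

    Illustrative sketch: "inverted" scaling keeps the expected
    activation unchanged, so no rescaling is needed at test time.
    """
    if not training or p_drop == 0.0:
        return h
    if rng is None:
        rng = np.random.default_rng()
    # Keep each unit independently with probability 1 - p_drop.
    mask = rng.random(h.shape) >= p_drop
    # Zero the dropped units; scale the survivors by 1 / (1 - p_drop).
    return h * mask / (1.0 - p_drop)

# Training mode: roughly half of the hidden activations are dropped per case.
h = np.ones((4, 8))
out = dropout_forward(h, p_drop=0.5, rng=np.random.default_rng(0))

# Test mode: activations pass through unchanged.
assert np.array_equal(dropout_forward(h, training=False), h)
```

Because each training case sees a different random mask, a unit cannot rely on the presence of any particular other unit, which is exactly the co-adaptation the talk argues dropout prevents.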
Keywords: neural networks; neurons; molecular activity
Source: VideoLectures.NET
Last reviewed: 2019-09-07: lxf
Views: 58