

Dropout: A simple and effective way to improve neural networks
Course URL: http://videolectures.net/nips2012_hinton_networks/
Lecturer: Geoffrey E. Hinton
Institution: University of Toronto
Date: 2013-01-16
Lecture language: English

Course description: In a large feedforward neural network, overfitting can be greatly reduced by randomly omitting half of the hidden units on each training case. This prevents complex co-adaptations in which a feature detector is only helpful in the context of several other specific feature detectors. Instead, each neuron learns to detect a feature that is generally helpful for producing the correct answer given the combinatorially large variety of internal contexts in which it must operate. Random "dropout" gives big improvements on many benchmark tasks and sets new records for object recognition and molecular activity prediction. The Merck Molecular Activity Challenge was a contest hosted by Kaggle and sponsored by the pharmaceutical company Merck. The goal of the contest was to predict whether molecules were highly active towards a given target molecule. The competition data included a large number of numerical descriptors generated from the chemical structures of the input molecules and activity data for fifteen different biologically relevant targets. An accurate model has numerous applications in the drug discovery process. George will discuss his team's first place solution based on neural networks trained with dropout.
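The dropout procedure described above can be sketched in a few lines. This is a minimal, illustrative NumPy implementation, not the lecture's actual code: the function name `dropout_forward` and the test-time convention of scaling activations by the keep probability (as in the original formulation; many modern libraries instead use "inverted" dropout) are choices made here for the example.

```python
import numpy as np

def dropout_forward(h, p=0.5, training=True, rng=None):
    """Apply dropout to a batch of hidden activations h.

    During training, each unit is omitted (zeroed) independently with
    probability p -- with p = 0.5 this is the "randomly omitting half
    of the hidden units on each training case" from the abstract.
    At test time all units are kept, but activations are scaled by
    (1 - p) so their expected magnitude matches what the next layer
    saw during training.
    """
    if not training:
        return h * (1.0 - p)
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(h.shape) >= p  # keep each unit with prob 1 - p
    return h * mask

# Example: a batch of 4 cases with 8 hidden units each
h = np.ones((4, 8))
dropped = dropout_forward(h, p=0.5, training=True,
                          rng=np.random.default_rng(0))
# Roughly half the entries are zeroed; the survivors keep their value
```

Because a different random mask is drawn for every training case, the network is effectively sampling a different thinned architecture each time, which is what discourages the complex co-adaptations the abstract mentions.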
Keywords: computer science; machine learning; neural networks
Source: VideoLectures.NET
Data collected: 2020-09-21: wuyq
Last reviewed: 2020-09-22: zyk
Views: 20