Learning using Many Examples
Course URL: http://videolectures.net/mmdss07_bottou_lume/
Lecturer: Léon Bottou
Institution: NEC Laboratories America
Date: 2007-11-26
Language: English
Course description: Statistical learning theory suggests choosing large-capacity models that barely avoid over-fitting the training data. From that perspective, all datasets are small. Things become more complicated when one considers the computational cost of processing large datasets. Computationally challenging training sets appear when one wants to emulate intelligence: biological brains learn quite efficiently from the continuous streams of perceptual data generated by our six senses, using limited amounts of sugar as a source of power. Computationally challenging training sets also appear when one wants to analyze the masses of data that describe the life of our computerized society. The more data we understand, the greater the competitive advantage we enjoy. – The first part of the tutorial clarifies the relation between statistical efficiency, the design of learning algorithms, and their computational cost. – The second part explores specific learning algorithms and their implementation in detail, with both simple and complex examples. – The third part considers algorithms that learn with a single pass over the data; certain algorithms have optimal properties but are often too costly, and workarounds are discussed. – Finally, the fourth part shows how active example selection provides greater speed and reduces the feedback pressure that constrains parallel implementations.
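
The third and fourth parts concern single-pass (online) learning and active example selection. As a rough illustration only, here is a minimal sketch, assuming a linear hinge-loss classifier trained by stochastic gradient descent in one pass over a data stream, where each update uses the worst-handled example from a small candidate pool as a crude stand-in for active example selection; the function name, parameters, and synthetic data are illustrative and not taken from the lecture.

```python
import numpy as np

def single_pass_sgd(stream, dim, lr=0.1, pool=4):
    """One pass over a data stream with SGD on a linear hinge-loss classifier.

    Each step inspects a small pool of candidate examples and updates on the
    one the current model handles worst (a crude stand-in for active example
    selection); the remaining candidates are discarded, so every example is
    seen at most once.
    """
    w = np.zeros(dim)
    stream = iter(stream)
    while True:
        candidates = [ex for _, ex in zip(range(pool), stream)]
        if not candidates:
            break
        # hinge loss of each candidate under the current weights
        x, y = max(candidates,
                   key=lambda ex: max(0.0, 1.0 - ex[1] * (w @ ex[0])))
        if y * (w @ x) < 1.0:      # margin violated: take one gradient step
            w += lr * y * x
    return w

# Illustrative synthetic stream: labels come from a fixed "true" direction.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
stream = ((x, np.sign(x @ true_w)) for x in rng.normal(size=(5000, 3)))
w = single_pass_sgd(stream, dim=3)
print("learned direction:", w / np.linalg.norm(w))
```

In this sketch the pool size trades extra per-step evaluation cost for fewer, more informative updates, which is the kind of speed-versus-feedback trade-off the abstract alludes to.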
Keywords: Computer Science; Machine Learning; Statistical Learning
Source: VideoLectures.NET
Last reviewed: 2021-05-14: yumf
Views: 48