The Sample-Computational Tradeoff
Course URL: http://videolectures.net/nipsworkshops2012_shalev_shwartz_tradeof...  
Lecturer: Shai Shalev-Shwartz
Institution: The Hebrew University of Jerusalem
Date: 2013-01-16
Language: English
Course description: When analyzing the error of a learning algorithm, it is common to decompose the error into approximation error (measuring how well the hypothesis class fits the problem) and estimation error (due to the fact that we only receive a finite training set). In practice, we usually pay an additional error, called optimization error, because we have limited computational power. I will describe this triple tradeoff and will demonstrate how more training examples can lead to more efficient learning algorithms.
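The triple decomposition described above can be written compactly. The notation below is an assumption for illustration, not taken from the lecture itself: let L(h) denote the true (population) risk, h the hypothesis actually returned by the algorithm, ĥ the empirical risk minimizer over the hypothesis class H, h* the best predictor in H, and L* the Bayes-optimal risk.

```latex
% Sketch of the standard three-term error decomposition (notation assumed):
% total excess risk splits into optimization, estimation, and approximation terms.
\begin{aligned}
L(h) - L^{*}
  &= \underbrace{L(h) - L(\hat{h})}_{\text{optimization error}}
   + \underbrace{L(\hat{h}) - L(h^{*})}_{\text{estimation error}}
   + \underbrace{L(h^{*}) - L^{*}}_{\text{approximation error}}
\end{aligned}
```

More training samples shrink the estimation term, which can leave room to use a faster but cruder optimizer (a larger optimization term) while keeping the total error fixed; this is the sense in which more data can buy computational efficiency.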
Keywords: learning algorithms; approximation error; estimation error
Source: VideoLectures.NET
Last reviewed: 2019-09-08 (lxf)
Views: 51