Active and passive learning of linear separators under log-concave distributions
Course URL: http://videolectures.net/colt2013_balcan_learning/
Lecturer: Maria-Florina Balcan
Institution: Georgia Institute of Technology
Date: 2013-08-09
Language: English
Course description: We prove that active learning provides an exponential improvement over PAC (passive) learning of homogeneous linear separators under nearly log-concave distributions. Building on this, we provide a computationally efficient PAC algorithm with optimal (up to a constant factor) sample complexity for such problems. This resolves an open question of (Long, 1995, 2003; Bshouty et al., 2009) concerning the sample complexity of efficient PAC algorithms under the uniform distribution in the unit ball. Moreover, it provides the first bound for a polynomial-time PAC algorithm that is tight for an interesting infinite class of hypothesis functions under a general class of data distributions, providing significant progress towards a long-standing open question of (Ehrenfeucht et al., 1989; Blumer et al., 1989). We also provide new bounds for active and passive learning in the case that the data might not be linearly separable, both in the agnostic case and under the Tsybakov low-noise condition. To derive our results, we provide new structural results for (nearly) log-concave distributions, which might be of independent interest as well.
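The label savings described above come from querying only informative points. As an illustration (not the paper's algorithm), the following minimal sketch contrasts passive labeling with a margin-based active strategy for a homogeneous linear separator: the learner fits a simple perceptron, then repeatedly requests the label of the unlabeled point closest to its current decision boundary. The Gaussian data distribution, the perceptron learner, and the label budget of 60 are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
w_star = np.zeros(d)
w_star[0] = 1.0  # true homogeneous separator (illustrative choice)

# Data drawn from a standard Gaussian, a log-concave distribution.
X = rng.standard_normal((2000, d))
y = np.sign(X @ w_star)

def train_perceptron(X, y, epochs=10):
    # Plain perceptron on the labeled subset (stand-in learner,
    # not the efficient PAC algorithm from the lecture).
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:
                w += yi * xi
    return w

# Margin-based active learning sketch: seed with a few random labels,
# then spend the remaining label budget on points nearest the boundary.
budget = 60  # far fewer labels than the 2000 available points
idx = list(rng.choice(len(X), 10, replace=False))
w = train_perceptron(X[idx], y[idx])
while len(idx) < budget:
    margins = np.abs(X @ w) / (np.linalg.norm(w) + 1e-12)
    margins[idx] = np.inf  # never re-query an already-labeled point
    idx.append(int(np.argmin(margins)))
    w = train_perceptron(X[idx], y[idx])

acc = np.mean(np.sign(X @ w) == y)
```

Because most random Gaussian points lie far from the boundary and carry little information, concentrating the label budget near the boundary is what drives the exponential improvement in label complexity that the lecture establishes rigorously.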
Keywords: log-concave distributions; linear separators; optimal sample complexity; algorithms; linear computation
Source: VideoLectures.NET open course
Last reviewed: 2019-05-26:cwx
Views: 65