On the Complexity of Learning with Kernels
Course URL: https://videolectures.net/videos/colt2015_shamir_complexity_learn...
Lecturer: Ohad Shamir
Affiliation: Not specified.
Date posted: 2025-02-04
Language: English
Course description: A well-recognized limitation of kernel learning is the requirement to handle a kernel matrix, whose size is quadratic in the number of training examples. Many methods have been proposed to reduce this computational cost, mostly by using a subset of the kernel matrix entries, or some form of low-rank matrix approximation, or a random projection method. In this paper, we study lower bounds on the error attainable by such methods as a function of the number of entries observed in the kernel matrix or the rank of an approximate kernel matrix. We show that there are kernel learning problems where no such method will lead to non-trivial computational savings. Our results also quantify how the problem difficulty depends on parameters such as the nature of the loss function, the regularization parameter, the norm of the desired predictor, and the kernel matrix rank. Our results also suggest cases where more efficient kernel learning might be possible.
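To make the setting concrete: the description mentions methods that observe only a subset of kernel matrix entries or build a low-rank approximation. The sketch below, which is illustrative only and not taken from the talk, computes a full RBF kernel matrix (the quadratic-size object) and then a Nyström approximation that observes only the columns for m landmark points; the RBF kernel, the bandwidth `gamma`, and the sizes `n`, `d`, `m` are all assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel.
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
n, d, m = 200, 5, 20            # m landmark points, m << n (illustrative sizes)
X = rng.standard_normal((n, d))

# Full kernel matrix: n^2 entries -- the quadratic cost the lower bounds address.
K = rbf_kernel(X, X)

# Nystrom approximation: observe only the n*m entries K[:, :m]
# (which include the m*m landmark block), then build a rank-<=m surrogate.
C = K[:, :m]                    # n x m block of observed entries
W = K[:m, :m]                   # m x m block among the landmarks
K_approx = C @ np.linalg.pinv(W) @ C.T

err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
print(f"relative Frobenius error with m={m} observed columns: {err:.3f}")
```

The paper's question, in these terms, is how small `err` (and the resulting learning error) can possibly be as a function of the number of observed entries or of the rank m, regardless of which approximation scheme is used.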
Keywords: kernel matrix; low-rank matrix; regularization function
Course source: VideoLectures.NET
Data collected: 2025-03-27: zsp
Last reviewed: 2025-03-28: liyy
Views: 4