

Sparsity regret bounds for individual sequences in online linear regression
Course URL: http://videolectures.net/colt2011_gerchinovitz_linear/
Lecturer: Sébastien Gerchinovitz
Institution: École Normale Supérieure, Paris
Date: 2011-08-02
Language: English
Abstract: We consider the problem of online linear regression on arbitrary deterministic sequences when the ambient dimension d can be much larger than the number of time rounds T. We introduce the notion of a sparsity regret bound, which is a deterministic online counterpart of recent risk bounds derived in the stochastic setting under a sparsity scenario. We prove such regret bounds for an online-learning algorithm called SeqSEW, based on exponential weighting and data-driven truncation. In the second part, we apply a parameter-free version of this algorithm to i.i.d. data and derive risk bounds of the same flavor as in Dalalyan and Tsybakov (2008, 2011), but which solve two questions left open therein. In particular, our risk bounds are adaptive (up to a logarithmic factor) to the unknown variance of the noise if the latter is Gaussian.
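The abstract mentions that SeqSEW combines exponential weighting with data-driven truncation. As a rough illustration of the first ingredient only, the following is a minimal sketch of an exponentially weighted average forecaster for online regression over a finite grid of candidate weight vectors, with predictions truncated to a fixed range. The grid, the learning rate `eta`, and the truncation level `B` are all illustrative assumptions; this is not the authors' SeqSEW algorithm, whose truncation is data-driven rather than fixed.

```python
import numpy as np

# Illustrative sketch only: exponential weighting over a finite grid of
# candidate weight vectors, with predictions clipped to [-B, B].
# Hypothetical parameters (grid, eta, B) chosen for the toy example;
# this is NOT the SeqSEW algorithm from the talk.

def exponential_weights_regression(X, y, grid, eta=0.5, B=1.0):
    """Predict each y_t as an exponentially weighted average of the
    truncated predictions of candidate weight vectors in `grid`."""
    T, d = X.shape
    K = grid.shape[0]
    losses = np.zeros(K)       # cumulative squared loss of each candidate
    preds = np.zeros(T)
    for t in range(T):
        # Numerically stable exponential weights on cumulative losses.
        w = np.exp(-eta * (losses - losses.min()))
        w /= w.sum()
        cand = np.clip(grid @ X[t], -B, B)   # truncated candidate forecasts
        preds[t] = w @ cand                  # weighted-average prediction
        losses += (cand - y[t]) ** 2         # update losses after y_t is revealed
    return preds

# Toy usage: a sparse target among a small grid of sparse candidates.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
theta = np.array([1.0, 0.0, 0.0])                 # sparse ground truth
y = np.clip(X @ theta, -1.0, 1.0)
grid = np.vstack([np.eye(3), -np.eye(3), np.zeros((1, 3))])
p = exponential_weights_regression(X, y, grid, eta=0.5, B=1.0)
print(np.mean((p - y) ** 2))
```

Since one grid element matches the target exactly, its cumulative loss stays at zero and the weights concentrate on it, so the average squared error should fall well below that of the trivial zero forecaster.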
Keywords: sequences; linear regression; algorithms
Source: VideoLectures.NET
Last reviewed: 2019-02-23 (chenxin)
Views: 82