Minimax rates for memory-bounded sparse linear regression
Course URL: https://videolectures.net/videos/colt2015_steinhardt_linear_regre...
Lecturer: Jacob Steinhardt
Institution: Not specified.
Date: 2025-02-04
Language: English
Abstract: We establish a minimax lower bound of $\Omega\!\left(\frac{kd}{b\epsilon}\right)$ on the number of samples needed to estimate the parameters in a $k$-sparse linear regression of dimension $d$ given a memory bound of $b$ bits, where $\epsilon$ is the $L_2$ parameter error. When the covariance of the regressors is the identity matrix, we also provide an algorithm that uses $\tilde{O}(b+k)$ bits and requires $\tilde{O}\!\left(\frac{kd}{b\epsilon^2}\right)$ samples to achieve error $\epsilon$. Our lower bound also holds in the more general communication-bounded setting, where instead of a memory bound, at most $b$ bits of information are allowed to be (adaptively) communicated about each sample.
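To make the stated memory/sample tradeoff concrete, the short Python sketch below simply evaluates the two bounds from the abstract at example parameter values. The specific numbers chosen for $k$, $d$, $b$, and $\epsilon$ are illustrative assumptions, not taken from the talk, and the constants and log factors hidden by $\Omega(\cdot)$ and $\tilde{O}(\cdot)$ are dropped.

def lower_bound_samples(k, d, b, eps):
    # Minimax lower bound Omega(k*d / (b*eps)) on the sample count,
    # with the constant hidden by Omega(.) set to 1.
    return k * d / (b * eps)

def upper_bound_samples(k, d, b, eps):
    # Sample count O~(k*d / (b*eps^2)) sufficient for the identity-covariance
    # algorithm mentioned in the abstract, log factors and constants dropped.
    return k * d / (b * eps ** 2)

# Illustrative (assumed) problem size: 10-sparse regression in 10,000 dimensions.
k, d, eps = 10, 10_000, 0.1
for b in (100, 1_000, 10_000):  # memory budget in bits
    lo = lower_bound_samples(k, d, b, eps)
    hi = upper_bound_samples(k, d, b, eps)
    print(f"b = {b:>6} bits: >= {lo:,.0f} samples needed, ~{hi:,.0f} suffice")

Note that in both expressions the sample count scales inversely with $b$: doubling the memory (or per-sample communication) budget halves both bounds, which is the tradeoff the talk quantifies.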
Keywords: minimax lower bound; sparse linear regression; covariance
Source: VideoLectures.NET
Data collected: 2025-03-28 (zsp)
Last reviewed: 2025-03-28 (zsp)
Views: 5