Learning Overcomplete Latent Variable Models through Tensor Methods
Course URL: https://videolectures.net/videos/colt2015_anandkumar_tensor_metho...
Lecturer: Animashree Anandkumar
Institution: Not specified.
Course date: 2025-02-04
Language: English
Course description: We provide guarantees for learning latent variable models, emphasizing the overcomplete regime, where the dimensionality of the latent space exceeds the observed dimensionality. In particular, we consider multiview mixtures, ICA, and sparse coding models. Our main tool is a new algorithm for tensor decomposition that works in the overcomplete regime. In the semi-supervised setting, we exploit label information to get a rough estimate of the model parameters, and then refine it using the tensor method on unlabeled samples. We establish learning guarantees when the number of components scales as $k=o(d^{p/2})$, where $d$ is the observed dimension and $p$ is the order of the observed moment employed in the tensor method (usually $p=3,4$). In the unsupervised setting, a simple initialization algorithm based on the SVD of tensor slices is proposed, and guarantees are provided under the stricter condition $k\le \beta d$ (where the constant $\beta$ can be larger than $1$). For the learning applications, we provide tight sample complexity bounds through novel covering arguments.
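
As a rough illustration of the unsupervised pipeline sketched in the description (initialize from the SVD of a tensor slice, then refine with tensor power iterations), below is a minimal NumPy sketch on a synthetic, noiseless, symmetric third-order moment tensor. The toy setup (undercomplete $k<d$, unit-norm components, the particular slice contraction, and the helper names slice_svd_init and tensor_power_iteration) is an illustrative assumption, not the algorithm analyzed in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- toy symmetric third-order moment tensor (hypothetical data) ---
d, k = 20, 5                       # observed dimension d, number of components k
A = rng.standard_normal((d, k))
A /= np.linalg.norm(A, axis=0)     # unit-norm components a_1, ..., a_k
w = rng.uniform(0.5, 1.0, size=k)  # positive component weights
T = np.einsum('j,ij,kj,lj->ikl', w, A, A, A)  # T = sum_j w_j a_j (x) a_j (x) a_j

def slice_svd_init(T, n_trials=10):
    """Pick an initial vector from the top singular vector of a random
    weighted slice M = T(I, I, theta); keep the trial with the largest
    tensor value T(v, v, v)."""
    best_v, best_val = None, 0.0
    for _ in range(n_trials):
        theta = rng.standard_normal(T.shape[2])
        M = np.einsum('ikl,l->ik', T, theta)       # combine slices along mode 3
        v = np.linalg.svd(M)[0][:, 0]              # top left singular vector
        val = np.einsum('ikl,i,k,l->', T, v, v, v)
        if abs(val) > abs(best_val):
            best_v, best_val = np.sign(val) * v, val
    return best_v

def tensor_power_iteration(T, v, n_iter=50):
    """Refine the estimate via the map v <- T(I, v, v) / ||T(I, v, v)||."""
    for _ in range(n_iter):
        v = np.einsum('ikl,k,l->i', T, v, v)
        v /= np.linalg.norm(v)
    return v

v_hat = tensor_power_iteration(T, slice_svd_init(T))
# cosine similarity with the best-matching true component; near 1 on success
print(np.max(np.abs(A.T @ v_hat)))
```

In practice one would deflate the recovered rank-one term and repeat to estimate all $k$ components; the overcomplete analysis in the talk additionally handles $k$ larger than $d$, which this simplified toy setup does not attempt.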
Keywords: multiview; tensor decomposition algorithm; tensor slices
Course source: VideoLectures.NET
Data collected: 2025-03-28: zsp
Last reviewed: 2025-03-28: zsp
Views: 2