

Disturbance Grassmann Kernels for Subspace-Based Learning
Course URL: http://videolectures.net/kdd2018_hong_disturbance_grassmann/
Lecturer: Junyuan Hong
Institution: University of Science and Technology of China
Date: 2018-11-23
Language: English
Course description: In this paper, we focus on subspace-based learning problems, where the data elements are linear subspaces rather than vectors. To handle this kind of data, Grassmann kernels were proposed to measure the space structure and are used with classifiers such as Support Vector Machines (SVMs). However, existing discriminative algorithms mostly ignore the instability of subspaces, which can cause classifiers to be misled by disturbed instances. We therefore propose accounting for all potential disturbances of the subspaces during learning, in order to obtain more robust classifiers. First, we derive the dual optimization of linear classifiers under disturbances drawn from a known distribution, which yields a new kernel, the Disturbance Grassmann (DG) kernel. Second, we investigate two kinds of disturbance, related to the subspace matrix and to the singular values of its bases, and use them to extend the Projection kernel on Grassmann manifolds to two new kernels. Experiments on action data show that the proposed kernels outperform state-of-the-art subspace-based methods, even in harsher environments.
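For context, the Projection kernel that the lecture's DG kernels extend is a standard Grassmann kernel: for two subspaces with orthonormal bases X and Y, k(X, Y) = ||XᵀY||²_F. A minimal NumPy sketch (function names are illustrative, not from the paper's code; subspaces are extracted from data matrices via SVD, as is common in subspace-based learning):

```python
import numpy as np

def subspace_basis(data, dim):
    # Orthonormal basis spanning the top-`dim` left singular vectors of `data`.
    u, _, _ = np.linalg.svd(data, full_matrices=False)
    return u[:, :dim]

def projection_kernel(x_basis, y_basis):
    # Projection kernel on the Grassmann manifold:
    # k(X, Y) = ||X^T Y||_F^2 for orthonormal bases X, Y.
    return np.linalg.norm(x_basis.T @ y_basis, ord="fro") ** 2

rng = np.random.default_rng(0)
a = subspace_basis(rng.standard_normal((10, 30)), 3)
b = subspace_basis(rng.standard_normal((10, 30)), 3)
print(projection_kernel(a, a))  # equals the subspace dimension (3) for identical subspaces
print(projection_kernel(a, b))  # between 0 and 3, larger for closer subspaces
```

The resulting Gram matrix is positive semi-definite and can be passed directly to a kernel SVM; the DG kernels described in the lecture additionally integrate over a disturbance distribution on the subspaces, which this sketch does not model.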
Keywords: subspace-based learning; linear subspaces; subspace matrix and singular values of bases; dual optimization of linear classifiers
Source: VideoLectures.NET
Data collected: 2023-03-09:cyh
Last reviewed: 2023-05-15:cyh
Views: 24