
Linear Discriminant Dimensionality Reduction
Course URL: http://videolectures.net/ecmlpkdd2011_han_dimensionality/
Lecturer: Jiawei Han
Institution: University of Illinois
Date: 2011-11-30
Language: English
Abstract: The Fisher criterion has achieved great success in dimensionality reduction. Two representative methods based on the Fisher criterion are Fisher Score and Linear Discriminant Analysis (LDA): the former was developed for feature selection, while the latter was designed for subspace learning. In the past decade, these two approaches were often studied independently. In this paper, based on the observation that Fisher Score and LDA are complementary, we propose to integrate them in a unified framework, namely Linear Discriminant Dimensionality Reduction (LDDR). We aim to find a subset of features on which the linear transformation learnt via LDA maximizes the Fisher criterion. LDDR inherits the advantages of Fisher Score and LDA and is able to perform feature selection and subspace learning simultaneously; both Fisher Score and LDA can be seen as special cases of the proposed method. The resulting optimization problem is a mixed integer program, which is difficult to solve. It is relaxed into an L2,1-norm constrained least squares problem and solved by an accelerated proximal gradient descent algorithm. Experiments on benchmark face recognition data sets show that the proposed method arguably outperforms state-of-the-art methods.
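The L2,1-norm relaxation mentioned in the abstract works because the proximal step of the L2,1 penalty zeroes out entire rows of the transformation matrix, which corresponds to discarding features. The sketch below (an illustrative NumPy implementation of the standard row-wise proximal operator, not code from the paper) shows that mechanism; the function name `prox_l21` and the toy matrix are assumptions for demonstration.

```python
import numpy as np

def prox_l21(W, t):
    """Proximal operator of t * ||W||_{2,1}, where
    ||W||_{2,1} = sum_i ||W[i, :]||_2.
    Each row is shrunk by t in L2 norm (group soft-thresholding);
    rows whose norm falls below t are set to zero, which is how the
    L2,1 penalty performs feature selection."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12))
    return scale * W

# Toy example with threshold t = 1:
W = np.array([[3.0, 4.0],    # row norm 5 -> shrunk by factor 0.8
              [0.3, 0.4]])   # row norm 0.5 < 1 -> row zeroed out
P = prox_l21(W, 1.0)
```

In a proximal gradient loop, this operator is applied after each gradient step on the least squares term; the accelerated variant referenced in the abstract additionally uses a momentum extrapolation between iterations.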
Keywords: computer science; machine learning; linear models
Source: VideoLectures.NET
Last reviewed: 2020-07-14:yumf
Views: 98