Robust Near-Separable Nonnegative Matrix Factorization Using Linear Optimization
Course URL: | http://videolectures.net/roks2013_gillis_optimization/
Lecturer: | Nicolas Gillis
Institution: | Université catholique de Louvain
Date: | 2013-08-26
Language: | English
Abstract: | Nonnegative matrix factorization (NMF) has recently been shown to be tractable under the separability assumption, under which every column of the input data matrix belongs to the convex cone generated by a small number of these columns. Bittorf, Recht, Ré and Tropp (‘Factoring nonnegative matrices with linear programs’, NIPS 2012) proposed a linear programming (LP) model, referred to as HottTopixx, which is robust under any small perturbation of the input matrix. However, HottTopixx has two important drawbacks: (i) the input matrix has to be normalized, and (ii) the factorization rank has to be known in advance. In this talk, we generalize HottTopixx in order to resolve these two drawbacks; that is, we propose a new LP model which does not require normalization and detects the factorization rank automatically. Moreover, the new LP model is more flexible, significantly more tolerant to noise, and can easily be adapted to handle outliers and other noise models. We show on several synthetic datasets that it outperforms HottTopixx while competing favorably with two state-of-the-art methods. (A small numerical sketch of the separability assumption is given after this record.)
Keywords: | Linear optimization; Matrix factorization; Factorization
Source: | VideoLectures.NET
Data collected: | 2023-07-20: chenxin01
Last reviewed: | 2023-07-20: chenxin01
Views: | 32
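To make the separability assumption in the abstract concrete, here is a minimal synthetic sketch in Python (NumPy/SciPy). All sizes, variable names, and the noise level are illustrative assumptions, and this is neither the HottTopixx LP nor the generalized LP model from the talk: the index set K of "pure" columns is known by construction here, whereas identifying K from a noisy input is precisely what those LP models are designed to do.

```python
# Sketch of the separability assumption: every column of the data matrix M lies
# (up to a small perturbation) in the convex cone generated by a few of its own
# columns, indexed by K. Not the speaker's method; K is known by construction.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

m, r, n = 30, 5, 50          # ambient dimension, factorization rank, number of columns
W = rng.random((m, r))       # r "pure" columns (generators of the cone)
H = rng.random((r, n - r))   # nonnegative mixing weights for the remaining columns

# Near-separable data: the first r columns of M are the columns of W, every other
# column is a nonnegative combination of them, plus a small perturbation.
M = np.hstack([W, W @ H]) + 0.01 * rng.standard_normal((m, n))
K = list(range(r))           # index set of the pure columns (known here by construction)

# Under separability, each column M[:, j] is approximately a nonnegative combination
# of the columns M[:, K]; recover the weights by nonnegative least squares.
H_hat = np.zeros((r, n))
residuals = np.zeros(n)
for j in range(n):
    H_hat[:, j], residuals[j] = nnls(M[:, K], M[:, j])

print("max column residual:", residuals.max())  # small, of the order of the added noise
```

With zero noise the residuals would be exactly zero; with a small perturbation they stay of the same order, which is the "near-separable" regime addressed in the talk.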