


A Hilbert-Schmidt Dependence Maximization Approach to Unsupervised Structure Discovery
Course URL: http://videolectures.net/mlg08_gretton_ahsdma/
Lecturer: Arthur Gretton
Institution: University College London
Date: 2008-08-25
Language: English
Course description: In recent work by Song et al. (2007), it has been proposed to perform clustering by maximizing a Hilbert-Schmidt independence criterion (HSIC) with respect to a predefined cluster structure Y, by solving for the partition matrix. We extend this approach here to the case where the cluster structure Y is not fixed, but is a quantity to be optimized, and we use an independence criterion which has been shown to be more sensitive at small sample sizes (the Hilbert-Schmidt Normalized Information Criterion, or HSNIC; Fukumizu et al., 2008). We demonstrate the use of this framework in two scenarios. In the first, we adopt a cluster structure selection approach in which the HSNIC is used to select a structure from several candidates. In the second, we consider the case where we discover structure by directly optimizing Y.
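The description above outlines the dependence-maximization view of clustering only at a high level. The sketch below illustrates the kind of objective involved, assuming the CLUHSIC-style score tr(HKH · ΠAΠᵀ) from Song et al. (2007) rather than the HSNIC variant discussed in the talk; the function names and toy data are illustrative, not the speaker's implementation.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian RBF kernel matrix for the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic_objective(K, Pi, A):
    """HSIC-style clustering score tr(H K H . Pi A Pi^T) / (n-1)^2.

    K  : (n, n) kernel matrix on the observations
    Pi : (n, k) 0/1 partition matrix (one cluster per row)
    A  : (k, k) kernel on cluster labels encoding the structure Y
    """
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    L = Pi @ A @ Pi.T                     # label-side kernel induced by Y
    return np.trace(H @ K @ H @ L) / (n - 1) ** 2

# Toy usage: two well-separated Gaussian blobs, scored under two partitions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
K = rbf_kernel(X, sigma=1.0)
A = np.eye(2)                             # "independent clusters" structure

Pi_good = np.zeros((40, 2))               # partition matching the blobs
Pi_good[:20, 0] = 1
Pi_good[20:, 1] = 1
Pi_bad = np.zeros((40, 2))                # partition ignoring the blobs
Pi_bad[::2, 0] = 1
Pi_bad[1::2, 1] = 1

print(hsic_objective(K, Pi_good, A))      # larger score
print(hsic_objective(K, Pi_bad, A))       # smaller score
```

In the structure-selection scenario mentioned in the description, one would evaluate such a score for each candidate structure matrix A (e.g., independent clusters, a chain, a ring) and keep the candidate with the highest value; the talk's second scenario instead treats the structure itself as a variable to be optimized.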
Keywords: clustering; cluster structure maximization; partition matrix
Source: VideoLectures.NET
Last reviewed: 2020-10-22 (chenxin)
Views: 61