An Information Theoretic Approach to Learning Generative Graph Prototypes
Course URL: http://videolectures.net/simbad2011_hancock_generative/
Lecturer: Edwin Hancock
Institution: University of York
Date: 2011-10-17
Language: English
Abstract: We present a method for constructing a generative model for sets of graphs by adopting a minimum description length approach. The method is posed in terms of learning a generative supergraph model from which new samples can be obtained by an appropriate sampling mechanism. We commence by constructing a probability distribution for the occurrence of nodes and edges over the supergraph. We encode the complexity of the supergraph using the von Neumann entropy. A variant of the EM algorithm is developed to minimize the description length criterion, in which the node correspondences between the sample graphs and the supergraph are treated as missing data. The maximization step involves updating both the node correspondence information and the structure of the supergraph using graduated assignment. In the experimental part, we demonstrate the practical utility of the proposed algorithm and show that our generative model gives good graph classification results. In addition, we show how to perform graph clustering with the Jensen-Shannon kernel and generate new sample graphs.
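As a rough illustration of two ingredients mentioned in the abstract, the sketch below computes the von Neumann entropy of a graph from its symmetric normalized Laplacian, and a classical Jensen-Shannon divergence between two discrete distributions (e.g. padded degree histograms), from which a kernel value can be formed as exp(-JSD). This is a minimal sketch of the general techniques under common conventions, not the authors' exact construction; the function names are hypothetical.

```python
import numpy as np

def von_neumann_entropy(adj):
    """Von Neumann entropy -sum(lam * ln(lam)), where lam are the
    eigenvalues of the density matrix rho = L / trace(L) built from
    the symmetric normalized Laplacian L of the graph."""
    adj = np.asarray(adj, dtype=float)
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    n = adj.shape[0]
    lap = np.eye(n) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    rho = lap / np.trace(lap)
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]          # drop zero eigenvalues (0 * ln 0 = 0)
    return float(-(lam * np.log(lam)).sum())

def js_divergence(p, q):
    """Classical Jensen-Shannon divergence between two discrete
    distributions of equal length (e.g. padded degree histograms)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    def kl(a, b):
        nz = a > 0                  # terms with a == 0 contribute 0
        return float((a[nz] * np.log(a[nz] / b[nz])).sum())
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Example: the triangle K3 has entropy ln(2); a kernel value between two
# graphs can then be formed as np.exp(-js_divergence(p, q)).
k3 = np.ones((3, 3)) - np.eye(3)
print(von_neumann_entropy(k3))      # ~0.6931
```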
Keywords: supergraph model; sampling mechanism; von Neumann entropy
Source: VideoLectures.NET
Last reviewed: 2019-09-21:cwx
Views: 22