

Kullback-Leibler Divergence Estimation of Continuous Distributions
Course URL: http://videolectures.net/ripd07_cruz_kld/
Lecturer: Fernando Perez-Cruz
Institution: Princeton University
Date: 2008-02-25
Language: English
Abstract: We present a universal method for estimating the KL divergence between continuous densities, and we prove that it converges almost surely. Divergence estimation is typically solved by estimating the densities first. Our main result shows that this intermediate step is unnecessary, and that the divergence can be estimated using either the empirical cdf or a k-nearest-neighbour density estimate, which does not converge to the true measure for finite k. The convergence proof is based on describing the statistics of our estimator using waiting-time distributions, such as the exponential or Erlang. We illustrate the proposed estimators, show how they compare to existing methods based on density estimation, and outline how our divergence estimators can be used for solving the two-sample problem.
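
As a rough illustration of the k-nearest-neighbour construction the abstract refers to, the sketch below implements one commonly cited form of this estimator: D(P||Q) is approximated as (d/n) * sum_i log(nu_k(x_i) / rho_k(x_i)) + log(m / (n - 1)), where rho_k(x_i) is the distance from x_i to its k-th nearest neighbour among the other samples from P, and nu_k(x_i) is the distance from x_i to its k-th nearest neighbour among the samples from Q. This is a sketch only; the function name knn_kl_divergence is ours, and the exact bias-correction constants should be checked against the talk and the accompanying paper.

import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    # x: (n, d) array of samples from P; y: (m, d) array of samples from Q.
    n, d = x.shape
    m = y.shape[0]
    # rho_k: distance from each x_i to its k-th nearest neighbour in x,
    # excluding x_i itself, so query k+1 neighbours and take the last one.
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # nu_k: distance from each x_i to its k-th nearest neighbour in y.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    return (d / n) * np.sum(np.log(nu / rho)) + np.log(m / (n - 1.0))

A quick sanity check on two unit-variance Gaussians, where the true divergence D(N(0,1) || N(1,1)) = 0.5:

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(5000, 1))
y = rng.normal(1.0, 1.0, size=(5000, 1))
print(knn_kl_divergence(x, y, k=1))  # should be close to 0.5

Note that, as the abstract stresses, the k-NN density estimate itself does not converge for finite k; the point of the talk is that the plug-in divergence estimate converges almost surely anyway.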
Keywords: continuous densities; KL divergence; divergence estimation
Source: VideoLectures.NET
Last reviewed: 2019-09-14 (lxf)
Views: 35