Information Geometry
| Course URL: | http://videolectures.net/mlss05us_dasgupta_ig/ |
| Lecturer: | Sanjoy Dasgupta |
| Institution: | University of California, San Diego |
| Date: | 2007-02-25 |
| Language: | English |
| Course description: | This tutorial will focus on entropy, exponential families, and information projection. We'll start by seeing the sense in which entropy is the only reasonable definition of randomness. We will then use entropy to motivate exponential families of distributions — which include the ubiquitous Gaussian, Poisson, and Binomial distributions, but also very general graphical models. The task of fitting such a distribution to data is a convex optimization problem with a geometric interpretation as an "information projection": the projection of a prior distribution onto a linear subspace (defined by the data) so as to minimize a particular information-theoretic distance measure. This projection operation, which is more familiar in other guises, is a core optimization task in machine learning and statistics. We'll study the geometry of this problem and discuss two popular iterative algorithms for it. |
| Keywords: | Statistics; Mathematics; Information Geometry |
| Source: | VideoLectures.NET open course |
| Last reviewed: | 2020-06-01: 王勇彬 (volunteer course editor) |
| Views: | 327 |
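The "information projection" the description mentions can be illustrated concretely: projecting a prior distribution onto the set of distributions matching an observed feature average, minimizing KL divergence (the standard information-theoretic distance in this setting), yields an exponentially tilted member of the exponential family through the prior. The sketch below is our own toy example, not material from the lecture — the function name, the uniform prior on {0,...,5}, and the use of bisection (rather than the iterative-scaling algorithms the lecture discusses) are illustrative assumptions.

```python
import numpy as np

def information_projection(prior, feature, target, tol=1e-10, max_iter=200):
    """I-project `prior` onto {p : E_p[feature] = target}, minimizing
    KL(p || prior).  The minimizer has the exponential-family form
    p(x) ∝ prior(x) * exp(lam * feature(x)); since E_p[feature] is
    monotone increasing in lam, we solve for lam by bisection.
    (Illustrative sketch only; not the lecture's algorithm.)"""
    lo, hi = -50.0, 50.0                     # bracket for the tilt parameter

    def tilt(lam):
        w = prior * np.exp(lam * feature)    # exponential tilting
        p = w / w.sum()                      # renormalize
        return p, p @ feature                # distribution and its mean

    p, _ = tilt(0.0)
    for _ in range(max_iter):
        lam = 0.5 * (lo + hi)
        p, m = tilt(lam)
        if abs(m - target) < tol:            # constraint satisfied
            break
        if m < target:
            lo = lam
        else:
            hi = lam
    return p

prior = np.ones(6) / 6.0          # uniform prior on {0, ..., 5}
x = np.arange(6, dtype=float)     # feature: identity, so constraint is the mean
p = information_projection(prior, x, target=1.5)
```

The projected distribution `p` matches the target mean 1.5 while staying as close as possible (in KL) to the uniform prior; the convexity of this problem is what the description's geometric picture refers to.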
