

Gaussian Processes for Monocular 3D People Tracking
Course URL: http://videolectures.net/gpip06_urtasun_gpm3p/
Lecturer: Raquel Urtasun
Institution: University of Toronto
Date: 2007-02-25
Language: English
Course description: We advocate the use of Gaussian Processes (GPs) to learn prior models of human pose and motion for 3D people tracking. The Gaussian Process Latent Variable Model (GPLVM) provides a low-dimensional embedding of the human pose and defines a density function that gives higher probability to poses close to the training data. The Gaussian Process Dynamical Model (GPDM) also provides a complex dynamical model in terms of another GP. With the use of Bayesian model averaging, both the GPLVM and the GPDM can be learned from relatively small amounts of training data, and they generalize gracefully to motions outside the training set. We show that such priors are effective for tracking a range of human walking styles, despite weak and noisy image measurements and a very simple image likelihood. Tracking is formulated as a MAP estimator over short pose sequences within a sliding temporal window.
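To make the idea concrete, below is a minimal, hypothetical Python sketch (not code from the lecture or the underlying paper) of a sliding-window MAP tracker in this spirit: a GP maps low-dimensional latent points to poses, as in a GPLVM-style prior; a simple squared-error term stands in for the weak image likelihood; and a crude smoothness penalty stands in for the GPDM dynamics prior. All names (GPPosePrior, map_track_window) and the toy walking data are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize


def rbf(X, Y, ell=1.0, sf2=1.0):
    """Squared-exponential kernel between rows of X (N, d) and Y (M, d)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return sf2 * np.exp(-0.5 * d2 / ell ** 2)


class GPPosePrior:
    """GPLVM-style pose prior: a GP maps low-dimensional latent points to poses."""

    def __init__(self, Z_train, Y_train, noise=1e-2):
        self.Z, self.Y, self.noise = Z_train, Y_train, noise
        K = rbf(Z_train, Z_train) + noise * np.eye(len(Z_train))
        self.Kinv = np.linalg.inv(K)

    def decode(self, z):
        """Mean pose predicted by the GP at latent point z."""
        k = rbf(z[None, :], self.Z)
        return (k @ self.Kinv @ self.Y).ravel()

    def neg_log_prior(self, z):
        """Cost that grows with the GP predictive variance, so latent points far
        from the training data (and the poses they decode to) are penalized."""
        k = rbf(z[None, :], self.Z)
        var = 1.0 + self.noise - float(k @ self.Kinv @ k.T)
        # Variance term, plus a simple unit-Gaussian prior on the latent point.
        return 0.5 * np.log(max(var, 1e-12)) + 0.5 * float(z @ z)


def image_neg_log_lik(pose, obs):
    """Stand-in for a weak image likelihood: squared error to a noisy measurement."""
    return 0.5 * np.sum((pose - obs) ** 2)


def map_track_window(prior, observations, z_init):
    """MAP estimate of the latent trajectory over a short sliding temporal window."""
    T, d = z_init.shape

    def objective(z_flat):
        Z = z_flat.reshape(T, d)
        cost = 0.0
        for t in range(T):
            cost += prior.neg_log_prior(Z[t])
            cost += image_neg_log_lik(prior.decode(Z[t]), observations[t])
        # Crude first-order smoothness term standing in for the GPDM dynamics prior.
        cost += 0.5 * np.sum(np.diff(Z, axis=0) ** 2)
        return cost

    res = minimize(objective, z_init.ravel(), method="L-BFGS-B")
    return res.x.reshape(T, d)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy "walking cycle": a 1-D latent phase mapped to a 3-D pose vector.
    Z_train = np.linspace(0.0, 2.0 * np.pi, 20)[:, None]
    Y_train = np.c_[np.sin(Z_train), np.cos(Z_train), 0.5 * np.sin(2.0 * Z_train)]
    prior = GPPosePrior(Z_train, Y_train)

    # Noisy pose measurements for a short window of 5 frames.
    Z_true = np.linspace(1.0, 2.0, 5)[:, None]
    obs = np.c_[np.sin(Z_true), np.cos(Z_true), 0.5 * np.sin(2.0 * Z_true)]
    obs = obs + 0.05 * rng.standard_normal(obs.shape)

    Z_map = map_track_window(prior, obs, z_init=Z_true + 0.3)
    print("MAP latent trajectory:", np.round(Z_map.ravel(), 3))

In this toy setup the optimizer pulls the perturbed initial latent trajectory back toward the phase values that explain the noisy measurements, which is the same division of labor the lecture describes: the learned prior keeps estimates near plausible poses and motions, while the (here deliberately simple) image likelihood anchors them to the observations.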
Keywords: Gaussian Process Latent Variable Model; density function; image measurements; tracking formulation
Source: VideoLectures.NET
Last reviewed: 2020-06-27 by yumf
Views: 51