

Limited-memory quasi-Newton and Hessian-free Newton methods for non-smooth optimization
Course URL: http://videolectures.net/nipsworkshops2010_schmidt_lmq/  
Lecturer: Mark Schmidt
Institution: University of British Columbia
Date: 2011-01-13
Language: English
Course description: Limited-memory quasi-Newton and Hessian-free Newton methods are two workhorses of unconstrained optimization of high-dimensional smooth objectives. However, in many cases we would like to optimize a high-dimensional unconstrained objective function that is non-smooth due to the presence of a 'simple' non-smooth regularization term. Motivated by problems arising in estimating sparse graphical models, in this talk we focus on strategies for extending limited-memory quasi-Newton and Hessian-free Newton methods for unconstrained optimization to this scenario. We first consider two-metric (sub-)gradient projection methods for problems where the regularizer is separable, and then consider proximal Newton-like methods for group-separable and non-separable regularizers. We will discuss several applications where sparsity-encouraging regularizers are used to estimate graphical model parameters and/or structure, including the estimation of sparse, blockwise-sparse, and structured-sparse models.
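The separable-regularizer setting the talk starts from is built on the proximal (soft-thresholding) step. As a minimal sketch of that building block only, and not of the speaker's L-BFGS-based or Hessian-free algorithms, here is plain proximal gradient descent for an L1-regularized least-squares problem; all function names and the toy data are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1: shrink each coordinate toward zero.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(A, b, lam, iters=500):
    # Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 by proximal gradient descent.
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)             # gradient of the smooth least-squares part
        x = soft_threshold(x - step * grad, step * lam)  # prox step handles the L1 part
    return x

# Toy problem: recover a sparse vector from noiseless linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
true_x = np.array([1.0, 0.0, 0.0, -2.0, 0.0])
b = A @ true_x
x = proximal_gradient(A, b, lam=0.5)
```

Because the L1 term is coordinate-separable, its proximal operator decomposes into the per-coordinate shrinkage above; the methods discussed in the talk combine this kind of step with quasi-Newton curvature information rather than a plain gradient step.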
Keywords: high-dimensional smooth objectives; unconstrained optimization; sparse graphical models
Source: VideoLectures.NET
Last reviewed: 2019-09-07: lxf
Views: 63