

Inexact Search Directions in Interior Point Methods for Large Scale Optimization
Course URL: http://videolectures.net/nipsworkshops2012_gondzio_optimization/
Lecturer: Jacek Gondzio
Institution: University of Edinburgh
Date: 2013-01-15
Language: English
Course description: Interior Point Methods (IPMs) for linear and quadratic optimization have been very successful, but they occasionally struggle with excessive memory requirements and high computational costs per iteration. Appropriately preconditioned iterative methods overcome these drawbacks. In the first part of the talk, I will address the theoretical issues of applying an *inexact* Newton method within an IPM, and a redesign of the method with two objectives in mind: (a) avoiding explicit access to the problem data, allowing only matrix-vector products with the Hessian and with the Jacobian and its transpose; and (b) allowing the method to work in a limited-memory regime. In the second part of the talk, I will comment on different preconditioners applicable in the IPM context and present numerical results demonstrating the practical performance of several such preconditioners on challenging applications: sparse approximation problems arising in compressed sensing, and PageRank (Google) problems. The computational experience reveals that an interior point method using inexact Newton directions outperforms first-order methods on many instances of these applications.
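The matrix-free requirement in objective (a) can be illustrated with a minimal sketch (not from the talk; the matrix, iterates, tolerance, and regularization below are all synthetic assumptions): an inexact Newton direction is obtained by solving the IPM normal equations A·Θ·Aᵀ·Δy = b with conjugate gradients to a loose relative tolerance, touching A only through products with A and Aᵀ.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 40, 120
# Toy sparse-ish constraint matrix; a real IPM would never form A @ diag(theta) @ A.T explicitly.
A = rng.standard_normal((m, n)) * (rng.random((m, n)) < 0.1)
x = rng.uniform(0.5, 2.0, n)   # current primal iterate (strictly positive)
s = rng.uniform(0.5, 2.0, n)   # current dual slacks (strictly positive)
theta = x / s                  # IPM scaling Theta = X S^{-1}

def normal_eq_matvec(v):
    """Apply A Theta A^T + delta I using only matrix-vector products with A and A^T.
    The small delta regularization (an assumption here) keeps the operator SPD."""
    return A @ (theta * (A.T @ v)) + 1e-4 * v

def cg(matvec, b, tol=1e-2, maxiter=200):
    """Plain conjugate gradients; the loose tol is what makes the direction *inexact*."""
    y = np.zeros_like(b)
    r = b.copy()           # residual b - M y  (y starts at zero)
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(maxiter):
        Mp = matvec(p)
        alpha = rs / (p @ Mp)
        y += alpha * p
        r -= alpha * Mp
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return y

b = rng.standard_normal(m)
dy = cg(normal_eq_matvec, b)
res = np.linalg.norm(normal_eq_matvec(dy) - b) / np.linalg.norm(b)
print(res)  # relative residual of the computed inexact direction
```

The talk's preconditioners would be applied inside such a CG loop; this sketch runs unpreconditioned only to keep the matrix-free structure visible.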
Keywords: interior point methods; high computational cost; iterative methods
Course source: VideoLectures.NET
Last reviewed: 2019-09-08:lxf
Views: 74