

Inexact Search Directions in Interior Point Methods for Large Scale Optimization
Course URL: http://videolectures.net/nipsworkshops2012_gondzio_optimization/
Lecturer: Jacek Gondzio
Institution: University of Edinburgh
Date: 2013-06-15
Language: English
Abstract: Interior Point Methods (IPMs) for linear and quadratic optimization have been very successful, but they occasionally struggle with excessive memory requirements and high computational costs per iteration. The use of appropriately preconditioned iterative methods overcomes these drawbacks. In the first part of the talk, I will address the theoretical issues of applying the *inexact* Newton method in an IPM, and a redesign of the method with two objectives in mind: (a) avoiding explicit access to the problem data, allowing only matrix-vector products with the Hessian and with the Jacobian and its transpose; and (b) allowing the method to work in a limited-memory regime. In the second part of the talk, I will comment on different preconditioners applicable in the IPM context and present numerical results demonstrating the practical performance of several such preconditioners on challenging applications: sparse approximation problems arising in compressed sensing, and PageRank (Google) problems. The computational experience reveals that the interior point method using inexact Newton directions outperforms first-order methods on many instances of these applications.
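The matrix-free, preconditioned solves described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the speaker's implementation: it solves an IPM-style normal-equations system A D Aᵀ Δy = r with SciPy's conjugate gradient, where the system matrix is never formed and only products with A and Aᵀ are used (objective (a) above), together with a simple diagonal preconditioner. The matrix A, the scaling d, and the right-hand side are random placeholder data for the demonstration.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
m, n = 20, 50
A = rng.standard_normal((m, n))      # constraint matrix (used only via products)
d = rng.uniform(0.5, 2.0, size=n)    # positive IPM scaling, stand-in for X S^{-1}

# Matrix-free operator for M = A D A^T: M itself is never assembled;
# each application costs one product with A^T, a scaling, and one with A.
def matvec(y):
    return A @ (d * (A.T @ y))

M_op = LinearOperator((m, m), matvec=matvec)

# A cheap diagonal (Jacobi) preconditioner from diag(A D A^T).
diag = np.einsum('ij,j,ij->i', A, d, A)
precond = LinearOperator((m, m), matvec=lambda y: y / diag)

rhs = rng.standard_normal(m)
# The iterative solver stops at a modest relative residual, so the
# resulting Newton direction is "inexact" in the sense of the talk.
dy, info = cg(M_op, rhs, M=precond)
```

In a real IPM the tolerance passed to the inner solver would be loosened in early iterations and tightened near convergence, and the preconditioner would exploit problem structure rather than a plain diagonal; the point here is only that the solve needs nothing beyond matrix-vector products.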
Keywords: linear optimization; interior point methods; iterative methods
Source: VideoLectures.NET
Data collected: 2020-12-07:yxd
Last reviewed: 2020-12-07:yxd
Views: 31