

Analyzing Non-Convex Optimization for Sparse Coding
Course URL: http://videolectures.net/colt2015_ma_sparse_coding/
Lecturer: Tengyu Ma
Institution: Princeton University
Date: 2015-08-20
Language: English
Description: Sparse coding is a basic task in many fields, including signal processing, neuroscience, and machine learning, where the goal is to learn a basis that enables a sparse representation of a given set of data, if one exists. Its standard formulation is a non-convex optimization problem, which is solved in practice by heuristics based on alternating minimization. Recent work has produced several algorithms for sparse coding with provable guarantees, but somewhat surprisingly these are outperformed by the simple alternating minimization heuristics. Here we give a general framework for understanding alternating minimization, which we leverage to analyze existing heuristics and to design new ones that also have provable guarantees. Some of these algorithms appear implementable on simple neural architectures, which was the original motivation of Olshausen and Field [OF] in introducing sparse coding. We also give the first efficient algorithm for sparse coding that works almost up to the information-theoretic limit for sparse recovery on incoherent dictionaries. All previous algorithms that approached or surpassed this limit run in time exponential in some natural parameter. Finally, our algorithms improve upon the sample complexity of existing approaches. We believe that our analysis framework will have applications in other settings where simple iterative algorithms are used.
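The abstract centers on the generic alternating-minimization heuristic for sparse coding: alternately decode sparse coefficients against the current dictionary, then refit the dictionary to the data. Below is a minimal sketch of that loop structure; the function name, the hard-thresholding decoding rule, and the least-squares dictionary update are illustrative assumptions, not the specific provable algorithm analyzed in the talk.

```python
import numpy as np

def sparse_coding_altmin(Y, m, k, n_iters=50, seed=0):
    """Minimal alternating-minimization sketch for sparse coding.

    Y : (d, n) data matrix whose columns we want to represent as
        Y ~= A @ X, with A a (d, m) dictionary and X (m, n) having
        k-sparse columns.
    Illustrative heuristic only, not the algorithm from the talk.
    """
    rng = np.random.default_rng(seed)
    d, n = Y.shape

    # Random initialization with unit-norm dictionary columns.
    A = rng.standard_normal((d, m))
    A /= np.linalg.norm(A, axis=0, keepdims=True)

    for _ in range(n_iters):
        # Decoding step: correlate data with the dictionary and keep
        # the k largest-magnitude coefficients per column
        # (hard thresholding, one common heuristic choice).
        C = A.T @ Y                                  # (m, n) correlations
        X = np.zeros_like(C)
        top = np.argsort(-np.abs(C), axis=0)[:k]     # top-k row indices
        np.put_along_axis(X, top,
                          np.take_along_axis(C, top, axis=0), axis=0)

        # Dictionary update step: least-squares fit to the current
        # sparse codes, then renormalize the columns.
        A = Y @ np.linalg.pinv(X)
        A /= np.linalg.norm(A, axis=0, keepdims=True) + 1e-12

    return A, X
```

The provable variants discussed in this line of work replace the decoding and update steps with carefully initialized, approximate versions of the same two moves; the sketch is only meant to convey the iterative structure that the framework analyzes.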
Keywords: sparse coding; heuristics; non-convex optimization
Source: 视频讲座网 (videolectures.net)
Data collected: 2022-12-06 (chenjy)
Last reviewed: 2022-12-06 (chenjy)
Views: 13