Sparse methods for machine learning: Theory and algorithms
Course URL: http://videolectures.net/ecmlpkdd2010_obozinski_smta/
Lecturer: Guillaume Obozinski
Institution: MINES ParisTech (École des Mines de Paris)
Date: 2010-11-16
Language: English
Course description: Sparse methods such as regularization by the L1-norm have attracted a lot of interest in recent years in statistics, machine learning and signal processing. In the context of least-squares linear regression, the problem is usually referred to as the Lasso or basis pursuit. The objective of the tutorial is to give a unified overview of the recent contributions of sparse convex methods to machine learning, both in terms of theory and algorithms. The course will be divided into three parts: in the first part, the focus will be on the regular L1-norm and variable selection, introducing key algorithms and key theoretical results. Then, several more structured machine learning problems will be discussed, on vectors (second part) and matrices (third part), such as multi-task learning, sparse principal component analysis, multiple kernel learning and sparse coding.
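For reference, a minimal sketch of the Lasso problem mentioned in the description, assuming a design matrix X (n samples by p features), a response vector y and a regularization weight lam; the function and variable names here are illustrative and not taken from the course:

import numpy as np

def lasso_objective(X, y, w, lam):
    # Illustrative sketch of L1-regularized least squares (the Lasso):
    # (1 / (2n)) * ||y - X w||_2^2 + lam * ||w||_1
    n = X.shape[0]
    residual = y - X @ w
    return 0.5 / n * residual @ residual + lam * np.sum(np.abs(w))

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1; the coordinate-wise shrinkage toward
    # zero is what produces exactly sparse solutions.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(X, y, lam, n_iter=500):
    # Plain proximal-gradient (ISTA) iteration for the Lasso, with the step
    # size set from the Lipschitz constant of the smooth least-squares term.
    n, p = X.shape
    w = np.zeros(p)
    step = n / np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n
        w = soft_threshold(w - step * grad, step * lam)
    return w

Setting lam = 0 recovers ordinary least squares, while larger values of lam drive more coefficients exactly to zero, which is the variable-selection behaviour discussed in the first part of the course.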
Keywords: Computer science; Machine learning; Learning methods
Source: VideoLectures.NET
Last reviewed: 2020-06-11 (dingaq)
Views: 60