


Learning without Forgetting
Course URL: http://videolectures.net/eccv2016_li_without_forgetting/
Lecturer: Zhizhong Li
Institution: University of Illinois
Lecture date: 2016-10-24
Language: English
Course description: When building a unified vision system or gradually adding new capabilities to a system, the usual assumption is that training data for all tasks is always available. However, as the number of tasks grows, storing and retraining on such data becomes infeasible. A new problem arises where we add new capabilities to a Convolutional Neural Network (CNN), but the training data for its existing capabilities are unavailable. We propose our Learning without Forgetting method, which uses only new task data to train the network while preserving the original capabilities. Our method performs favorably compared to commonly used feature extraction and fine-tuning adaptation techniques and performs similarly to multitask learning that uses original task data we assume unavailable. A more surprising observation is that Learning without Forgetting may be able to replace fine-tuning as standard practice for improved new task performance.
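The description above says the network is trained on new-task data only while its original capabilities are preserved. In the published method this is done by recording the original network's responses to the new-task images and using them as soft targets (a knowledge-distillation term) alongside the new-task loss. The sketch below is a minimal NumPy illustration of that combined objective, not the authors' implementation; the helper names (`lwf_loss`, `lambda_old`) and the tiny logit vectors are illustrative assumptions.

```python
# Minimal sketch of a Learning-without-Forgetting style objective:
# new-task cross-entropy plus a distillation term that keeps the
# old-task outputs close to the responses recorded before training.
# T (softmax temperature) and lambda_old (loss weight) are the usual
# distillation knobs; the logits below are made up for illustration.
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; T > 1 softens the distribution."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()                      # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def cross_entropy(p_target, p_pred, eps=1e-12):
    """Cross-entropy between a target and a predicted distribution."""
    return -np.sum(p_target * np.log(p_pred + eps))

def lwf_loss(new_logits, new_label, old_logits_now, old_logits_recorded,
             T=2.0, lambda_old=1.0):
    """New-task loss + distillation loss on the old task's outputs."""
    onehot = np.zeros_like(np.asarray(new_logits, dtype=float))
    onehot[new_label] = 1.0
    loss_new = cross_entropy(onehot, softmax(new_logits))
    # Penalize drift of the old-task outputs away from the recorded ones.
    loss_old = cross_entropy(softmax(old_logits_recorded, T),
                             softmax(old_logits_now, T))
    return loss_new + lambda_old * loss_old
```

Because the distillation term is smallest when the current old-task outputs match the recorded ones, minimizing this loss trains the new task while discouraging the network from forgetting its original behavior.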
Keywords: neural networks; task data; original capabilities; Learning without Forgetting
Source: VideoLectures.NET
Data collected: 2023-03-27: chenxin01
Last reviewed: 2023-05-22: chenxin01
Views: 17