Net2Net: Accelerating Learning via Knowledge Transfer
Course URL: http://videolectures.net/iclr2016_chen_net2net/
Lecturer: Tianqi Chen
Institution: University of Washington
Date: 2016-05-27
Language: English
Course description: We introduce techniques for rapidly transferring the information stored in one neural net into another neural net. The main purpose is to accelerate the training of a significantly larger neural net. During real-world workflows, one often trains very many different neural networks during the experimentation and design process. This is a wasteful process in which each new model is trained from scratch. Our Net2Net technique accelerates the experimentation process by instantaneously transferring the knowledge from a previous network to each new deeper or wider network. Our techniques are based on the concept of function-preserving transformations between neural network specifications. This differs from previous approaches to pre-training that altered the function represented by a neural net when adding layers to it. Using our knowledge transfer mechanism to add depth to Inception modules, we demonstrate a new state of the art accuracy rating on the ImageNet dataset.
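The function-preserving transformations described above can be illustrated with a minimal NumPy sketch for fully connected layers with ReLU activations. The helper names `net2deeper` and `net2wider` here loosely mirror the Net2DeeperNet and Net2WiderNet operations named in the talk, but this is an illustrative reconstruction of the idea, not the authors' implementation:

```python
import numpy as np

def net2deeper(weight):
    """Net2DeeperNet sketch: insert a new layer initialized to the
    identity matrix after an existing layer. With ReLU activations
    the network's function is preserved, since ReLU(I @ h) = h
    whenever h >= 0 (which holds after a ReLU)."""
    out_width = weight.shape[1]
    return np.eye(out_width)

def net2wider(w1, b1, w2, new_width):
    """Net2WiderNet sketch: widen a hidden layer by replicating
    randomly chosen units, then divide each outgoing weight by the
    unit's replication count so the composed function is unchanged."""
    old_width = w1.shape[1]
    assert new_width > old_width
    # First old_width units map to themselves; extras copy a random old unit.
    mapping = np.concatenate([
        np.arange(old_width),
        np.random.randint(0, old_width, new_width - old_width),
    ])
    counts = np.bincount(mapping, minlength=old_width)
    new_w1 = w1[:, mapping]                          # duplicate incoming weights
    new_b1 = b1[mapping]                             # duplicate biases
    new_w2 = w2[mapping, :] / counts[mapping][:, None]  # rescale outgoing weights
    return new_w1, new_b1, new_w2
```

Because the widened or deepened network computes exactly the same function as its parent, training can continue from that point instead of restarting from random initialization, which is the source of the claimed speedup.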
Keywords: neural networks; workflows; transfer mechanisms
Source: VideoLectures.NET
Data collected: 2023-04-16:chenxin01
Last reviewed: 2023-05-21:chenxin01
Views: 30