Smoothed Dilated Convolutions for Improved Dense Prediction
Course URL: http://videolectures.net/kdd2018_wang_improved_dense_prediction/
Lecturer: Zhengyang Wang
Institution: Washington State University
Date: 2018-11-23
Language: English
Course description: Dilated convolutions, also known as atrous convolutions, have been widely explored in deep convolutional neural networks (DCNNs) for various tasks such as semantic image segmentation, object detection, audio generation, video modeling, and machine translation. However, dilated convolutions suffer from gridding artifacts, which hamper the performance of DCNNs that use them. In this work, we propose two simple yet effective degridding methods by studying a decomposition of dilated convolutions. Unlike existing models, which seek solutions by focusing on a block of cascaded dilated convolutional layers, our methods address the gridding artifacts by smoothing the dilated convolution itself. By analyzing the methods in both the original-operation and decomposition views, we further show that the two degridding approaches are intrinsically related, and we define separable and shared (SS) operations, which generalize the proposed methods. We evaluate our methods thoroughly on two datasets and visualize the smoothing effect through effective receptive field analysis. Experimental results show that our methods yield significant and consistent performance improvements for DCNNs with dilated convolutions, while adding a negligible number of extra training parameters.
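
The "smoothing the dilated convolution itself" idea can be made concrete with a short sketch. Below is a minimal PyTorch illustration of the variant that smooths the input with a separable and shared (SS) filter before the dilated convolution; PyTorch, the module name `SmoothedDilatedConv2d`, the (2r−1) smoothing window, and the Dirac initialization are my assumptions for illustration, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SmoothedDilatedConv2d(nn.Module):
    """Hypothetical sketch: smooth the input with one kernel shared across
    channels and applied depthwise (a "separable and shared" operation),
    then apply a regular dilated convolution."""

    def __init__(self, in_channels, out_channels, kernel_size=3, dilation=2):
        super().__init__()
        smooth_k = 2 * dilation - 1  # assumed (2r-1) smoothing window
        # A single kernel shared by every channel: only smooth_k**2 extra
        # parameters per layer, hence the negligible parameter overhead.
        self.smooth_kernel = nn.Parameter(torch.empty(1, 1, smooth_k, smooth_k))
        nn.init.dirac_(self.smooth_kernel)  # start close to an identity map
        self.dilated = nn.Conv2d(
            in_channels, out_channels, kernel_size,
            padding=dilation * (kernel_size - 1) // 2, dilation=dilation)

    def forward(self, x):
        c = x.size(1)
        # Expand the shared kernel across channels and apply it depthwise,
        # so neighboring outputs no longer see fully disjoint input pixels.
        weight = self.smooth_kernel.expand(c, 1, -1, -1)
        x = F.conv2d(x, weight,
                     padding=self.smooth_kernel.size(-1) // 2, groups=c)
        return self.dilated(x)


# Example: spatial size is preserved.
x = torch.randn(1, 8, 32, 32)
layer = SmoothedDilatedConv2d(8, 16, kernel_size=3, dilation=2)
print(layer(x).shape)  # torch.Size([1, 16, 32, 32])
```

With the Dirac initialization assumed here, the module starts out equivalent to a plain dilated convolution and learns how much smoothing to apply, which is consistent with the abstract's claim that the extra training-parameter cost is negligible.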
Keywords: dilated convolutions; neural networks; training parameters
Source: VideoLectures.NET
Data collected: 2022-12-07: chenjy
Last reviewed: 2022-12-07: chenjy
Views: 47