Psychology of China (中国心理学前沿)

ISSN Print: 2664-1798
ISSN Online: 2664-1801

Channel Pruning of Convolutional Neural Networks Based on Sensitivity

Psychology of China / 2024, 6(11): 46-53 / Published 2024-11-11
  • Author: 杨晨彬 (Yang Chenbin)
  • Affiliation: Jiangsu Police Institute, Nanjing
  • Keywords: Convolutional neural networks; Channel pruning; Sensitivity
  • Abstract: Deep convolutional neural networks (CNNs) carry a large number of redundant parameters, from the convolutional layers through the fully connected layers. Model pruning is an optimization technique for reducing the memory consumption and floating-point operations (FLOPs) of deep CNN models: by removing “unimportant” weights, it greatly reduces the parameter count and computational cost while preserving accuracy as far as possible, seeking the best balance between model size and performance. However, at high pruning rates, current channel pruning methods, which commonly take a channel’s norm as the pruning criterion, lack theoretical support and struggle to achieve good results, causing a significant drop in the accuracy of the compressed model. To address this, this paper proposes a new channel pruning method based on a sensitivity measure, using second-order sensitivity as the criterion for channel importance. Through theoretical derivation, the traditional per-weight sensitivity computation is extended to entire channels: it is first shown that a channel’s sensitivity can be quantified as the sum of the sensitivities of all weights in that channel, and the least sensitive channels are then removed according to their relative sensitivity to complete the pruning (see the code sketch below). Experiments on a variety of network architectures show that, compared with traditional methods, the proposed approach significantly raises the pruning rate at the cost of a small loss in accuracy. For example, with a slight accuracy loss, FLOPs on the CIFAR-10 dataset are reduced by more than 60%; on the ImageNet dataset, pruning ResNet34 cuts FLOPs by 52.1% with only a 0.23% accuracy loss.
  • DOI: https://doi.org/10.35534/pc.0611006
  • Citation: 杨晨彬 (Yang Chenbin). Channel Pruning of Convolutional Neural Networks Based on Sensitivity [J]. Psychology of China, 2024, 6(11): 46-53.
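
The scoring rule described in the abstract — sum the per-weight sensitivities within each channel, then drop the channels with the lowest totals — can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the paper's implementation: the exact second-order sensitivity formula is not given on this page, so the sketch approximates the Hessian diagonal with squared gradients (an OBD-style, empirical-Fisher surrogate), and the names `channel_sensitivities`, `prune_mask`, and `prune_ratio` are hypothetical.

```python
# Minimal sketch: channel pruning by summed per-weight second-order sensitivity.
# Assumptions (not from the paper): the Hessian diagonal is approximated by
# squared gradients, giving an OBD-style per-weight saliency
# s_i ≈ 0.5 * g_i^2 * w_i^2; a channel's sensitivity is the sum of the
# saliencies of all weights in that output channel.
import torch
import torch.nn as nn

def channel_sensitivities(conv: nn.Conv2d, loss: torch.Tensor) -> torch.Tensor:
    """One sensitivity score per output channel of `conv`."""
    (grad,) = torch.autograd.grad(loss, conv.weight, retain_graph=True)
    saliency = 0.5 * grad.pow(2) * conv.weight.pow(2)  # per-weight saliency
    return saliency.sum(dim=(1, 2, 3))                 # sum over each channel

def prune_mask(scores: torch.Tensor, prune_ratio: float) -> torch.Tensor:
    """Boolean mask that keeps the most sensitive channels."""
    k = int(scores.numel() * prune_ratio)              # channels to remove
    if k == 0:
        return torch.ones_like(scores, dtype=torch.bool)
    threshold = scores.kthvalue(k).values              # k-th smallest score
    return scores > threshold

# Example: score and mask the first conv layer of a toy network.
model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                      nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10))
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
loss = nn.functional.cross_entropy(model(x), y)
keep = prune_mask(channel_sensitivities(model[0], loss), prune_ratio=0.5)
print(keep)  # True for channels to keep, False for channels to prune
```

Physically removing the masked channels (and shrinking the next layer's input channels to match) is omitted; the sketch only shows how per-weight saliencies aggregate into a per-channel score that can be ranked for pruning.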