Channel Pruning via Multi-Criteria based on Weight Dependency


Abstract

Channel pruning has demonstrated its effectiveness in compressing ConvNets. In many prior works, the importance of an output feature map is determined only by its associated filter. However, these methods ignore the small portion of weights in the next layer that disappears when the feature map is removed; that is, they overlook the phenomenon of weight dependency. Moreover, many pruning methods rely on a single criterion and search for a sweet spot between pruned structure and accuracy in a trial-and-error fashion, which can be time-consuming. In this paper, we propose CPMC, a channel pruning algorithm via multi-criteria based on weight dependency, which can compress a pre-trained model directly. CPMC defines channel importance in three aspects: the associated weight values, the computational cost, and the parameter quantity. Following the phenomenon of weight dependency, CPMC computes a channel's importance from its associated filter together with the corresponding partial weights in the next layer. CPMC then applies global normalization to enable cross-layer comparison and removes the less important channels by global ranking. CPMC can compress various CNN models, including VGGNet, ResNet, and DenseNet, on several image classification datasets. Extensive experiments show that CPMC significantly outperforms prior methods.
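The sketch below illustrates, in PyTorch, the kind of dependency-aware scoring the abstract describes: a channel's score combines the magnitude of its own filter with the corresponding input-channel slice of the next layer's weights, is discounted by per-channel FLOPs and parameter counts, and is normalized per layer before a global ranking. This is a minimal illustration, not the authors' implementation; the exact criteria fusion, normalization, and the `out_hw` FLOPs estimate are assumptions.

```python
# Hypothetical sketch of weight-dependency-aware channel scoring;
# not the CPMC reference code.
import torch
import torch.nn as nn


def channel_importance(conv: nn.Conv2d, next_conv: nn.Conv2d,
                       out_hw: tuple) -> torch.Tensor:
    """Score each output channel of `conv`, accounting for weight dependency:
    pruning channel c also removes next_conv.weight[:, c, :, :]."""
    # Criterion 1: weight magnitude of the filter itself plus its dependent
    # slice in the next layer (the "weight dependency" the paper highlights).
    w_self = conv.weight.detach().abs().sum(dim=(1, 2, 3))       # [C_out]
    w_next = next_conv.weight.detach().abs().sum(dim=(0, 2, 3))  # [C_out of conv]
    weight_score = w_self + w_next

    # Criterion 2: per-channel computational cost (illustrative formula).
    h, w = out_hw
    k = conv.kernel_size[0] * conv.kernel_size[1]
    flops = torch.full_like(weight_score, conv.in_channels * k * h * w)

    # Criterion 3: per-channel parameter count (illustrative formula).
    params = torch.full_like(weight_score, conv.in_channels * k)

    # Fuse the criteria; the paper's exact fusion may differ.
    score = weight_score / (flops.sqrt() * params.sqrt())

    # Per-layer normalization so scores are comparable across layers.
    return score / score.sum()


# Usage: score channels of one layer pair, then prune the lowest globally.
conv1, conv2 = nn.Conv2d(3, 16, 3), nn.Conv2d(16, 32, 3)
scores = channel_importance(conv1, conv2, out_hw=(32, 32))
prune_idx = scores.argsort()[: int(0.3 * len(scores))]  # drop 30% of channels
print(prune_idx)
```

In a full model, the per-layer scores would be concatenated and ranked jointly, so less important channels are removed wherever they occur rather than at a fixed rate per layer.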
