
Toward Compact ConvNets via Structure-Sparsity Regularized Filter Pruning.

Shaohui Lin, Rongrong Ji, Yuchao Li, Cheng Deng, Xuelong Li.   

Abstract

The success of convolutional neural networks (CNNs) in computer vision applications has been accompanied by a significant increase in computation and memory costs, which prohibits their use in resource-limited environments such as mobile systems or embedded devices. To this end, research on CNN compression has recently attracted growing attention. In this paper, we propose a novel filter pruning scheme, termed structured sparsity regularization (SSR), to simultaneously speed up computation and reduce the memory overhead of CNNs, which can be well supported by various off-the-shelf deep learning libraries. Concretely, the proposed scheme incorporates two different regularizers of structured sparsity into the original objective function of filter pruning, which fully coordinates the global output and local pruning operations to adaptively prune filters. We further propose an alternative updating with Lagrange multipliers (AULM) scheme to efficiently solve its optimization. AULM follows the principle of the alternating direction method of multipliers (ADMM) and alternates between promoting the structured sparsity of CNNs and optimizing the recognition loss, which leads to a very efficient solver (a 2.5× speedup over the most recent work that directly solves the group-sparsity-based regularization). Moreover, by imposing the structured sparsity, online inference is extremely memory-light, since the number of filters and the number of output feature maps are simultaneously reduced. The proposed scheme has been applied to a variety of state-of-the-art CNN structures, including LeNet, AlexNet, VGGNet, ResNet, and GoogLeNet, over different data sets. Quantitative results demonstrate that the proposed scheme achieves superior performance over the state-of-the-art methods. We further evaluate the proposed compression scheme on transfer-learning tasks, including domain adaptation and object detection, where it also shows clear performance gains over state-of-the-art filter pruning methods.
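The group-sparsity idea behind SSR can be sketched in a few lines. Below is a minimal, hypothetical Python/NumPy illustration, not the authors' implementation (which combines two structured-sparsity regularizers and optimizes them with AULM); it shows only a generic group-Lasso penalty over whole filters. The penalty is added to the training loss so that optimization drives entire filters toward zero, and near-zero filters can then be removed outright, shrinking both the weight tensor and its output feature maps. The names group_lasso_penalty and prune_filters are illustrative assumptions, not from the paper.

    # Hypothetical sketch of a group-Lasso (structured-sparsity) penalty over
    # conv filters; weights have shape (out_channels, in_channels, kH, kW).
    import numpy as np

    def group_lasso_penalty(weights):
        # One group per output filter: sum the L2 norms of whole filters,
        # so the penalty pushes entire filters (not single weights) to zero.
        flat = weights.reshape(weights.shape[0], -1)
        return float(np.linalg.norm(flat, axis=1).sum())

    def prune_filters(weights, threshold):
        # Zero out filters whose L2 norm fell below the threshold; removing
        # a filter also removes its output feature map at inference time.
        flat = weights.reshape(weights.shape[0], -1)
        keep = np.linalg.norm(flat, axis=1) >= threshold
        return weights * keep[:, None, None, None], keep

    # Usage: 64 filters of spatial size 3x3 over 3 input channels.
    w = np.random.randn(64, 3, 3, 3)
    reg_term = 0.01 * group_lasso_penalty(w)  # added to the recognition loss
    w_pruned, keep = prune_filters(w, threshold=1.0)
    print(reg_term, int(keep.sum()), "filters kept")

In the actual SSR objective, a structured-sparsity penalty of this kind is minimized jointly with the recognition loss, and AULM alternates, ADMM-style, between a sparsity-promoting step and a loss-minimizing step.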

Year:  2019        PMID: 30990448     DOI: 10.1109/TNNLS.2019.2906563

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw Learn Syst        ISSN: 2162-237X            Impact factor:   10.451


Related articles: 3 in total

1.  Number of necessary training examples for Neural Networks with different number of trainable parameters.

Authors:  Th I Götz; S Göb; S Sawant; X F Erick; T Wittenberg; C Schmidkonz; A M Tomé; E W Lang; A Ramming
Journal:  J Pathol Inform       Date:  2022-07-06

2.  IoTNet: An Efficient and Accurate Convolutional Neural Network for IoT Devices.

Authors:  Tom Lawrence; Li Zhang
Journal:  Sensors (Basel)       Date:  2019-12-14       Impact factor: 3.576

3.  Building a Compact Convolutional Neural Network for Embedded Intelligent Sensor Systems Using Group Sparsity and Knowledge Distillation.

Authors:  Jungchan Cho; Minsik Lee
Journal:  Sensors (Basel)       Date:  2019-10-04       Impact factor: 3.576

