AGT: Channel Pruning Using Adaptive Gradient Training for Accelerating Convolutional Neural Networks

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

Channel pruning is a widely used approach that can efficiently reduce inference time and memory footprint by removing unnecessary channels in convolutional neural networks. In previous studies, channel pruning based on sparsity training was performed by imposing ℓ1 regularization on the scaling factors in batch normalization and thereafter removing the channels/filters whose scaling factors fall below a predefined threshold. However, because such sparsity training imposes the ℓ1 penalty on all scaling factors and updates them with the resulting deformed gradient, an accuracy drop is inevitable. To address this problem, we propose a new sparsity training method referred to as adaptive gradient training (AGT). The proposed AGT can create a compact network without performance degradation by using the original gradient to the extent possible, without imposing the ℓ1 penalty. The proposed AGT reduces the FLOPs of MobileNetV1 by 71.7% on the CIFAR-10 dataset while achieving an accuracy improvement of 0.04%. Consequently, the proposed method outperformed existing channel pruning methods on all datasets and models evaluated.
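The abstract does not give AGT's update rule, but the baseline it improves on can be sketched: ℓ1-regularized sparsity training adds the subgradient λ·sign(γ) to each batch-norm scaling factor's task gradient (the "deformed gradient" the abstract refers to), and pruning then keeps only channels whose |γ| clears a threshold. A minimal illustrative sketch of that baseline, with hypothetical names (`sparsity_grad`, `prune_mask`, `lam` are assumptions, not the paper's notation):

```python
# Illustrative sketch of the BASELINE sparsity-training step the abstract
# describes (Network Slimming-style), NOT the proposed AGT method itself.

def sparsity_grad(gammas, task_grads, lam):
    """Deform each task gradient by adding the L1 subgradient lam*sign(gamma).

    This is the update the abstract identifies as the cause of the
    accuracy drop: every scaling factor is penalized, not just prunable ones.
    """
    sign = lambda g: (g > 0) - (g < 0)  # subgradient of |g| (0 at g == 0)
    return [tg + lam * sign(g) for g, tg in zip(gammas, task_grads)]

def prune_mask(gammas, threshold):
    """Keep channels whose batch-norm scaling factor magnitude clears the threshold."""
    return [abs(g) >= threshold for g in gammas]

# After sparsity training drives unimportant gammas toward zero,
# thresholding yields the channels to keep:
mask = prune_mask([0.8, 0.01, -0.5, 0.001], threshold=0.05)
# → [True, False, True, False]: channels 0 and 2 survive pruning.
```

AGT's stated improvement is to use the original (undeformed) task gradient wherever possible instead of applying the ℓ1 penalty uniformly; the exact adaptive rule is specified in the paper body, not the abstract.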

Original language: English
Title of host publication: 2023 International Conference on Electronics, Information, and Communication, ICEIC 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798350320213
DOIs
State: Published - 2023
Event: 2023 International Conference on Electronics, Information, and Communication, ICEIC 2023 - Singapore, Singapore
Duration: 5 Feb 2023 - 8 Feb 2023

Publication series

Name: 2023 International Conference on Electronics, Information, and Communication, ICEIC 2023

Conference

Conference: 2023 International Conference on Electronics, Information, and Communication, ICEIC 2023
Country/Territory: Singapore
City: Singapore
Period: 5/02/23 - 8/02/23

Keywords

  • Adaptive Gradient Training
  • Channel Pruning
  • Convolutional Neural Network
  • Pruning
