RepSGD: Channel Pruning Using Reparameterization for Accelerating Convolutional Neural Networks

Nam Joon Kim, Hyun Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

Channel pruning is a popular method for compressing convolutional neural networks (CNNs) while maintaining acceptable accuracy. Most existing channel pruning methods zero out unnecessary filters and then remove them. To address the limitations of this approach, methods that forcibly create filter redundancy and then remove the redundant filters have been proposed, without relying on heuristic knowledge. However, these methods use a deformed gradient to make filters identical, so performance degradation is inevitable because the parameters cannot be updated using the original gradients. To solve these problems, this study proposes RepSGD, which compresses CNNs simply and efficiently. RepSGD inserts a new point-wise convolution layer after each existing standard convolution layer. Only the new point-wise convolution layers are trained to produce filter redundancy (i.e., to make the filters identical), whereas the standard convolution layers are trained using the original gradient. After training, RepSGD merges each pair of consecutive convolution layers into a single convolution layer and then prunes the redundant filters in the merged layer. Because RepSGD does not change the original architecture of the CNN, no additional inference computation is required, and training from scratch is supported. In addition, using the original gradient allows RepSGD to better optimize the objective function of the CNN. Extensive experiments show that RepSGD outperforms existing pruning methods across various models and datasets.
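The merging step described in the abstract, collapsing a standard convolution followed by a point-wise (1×1) convolution into one equivalent convolution, can be sketched as below. This is a minimal numpy illustration, not the authors' implementation: the layer shapes, the naive `conv2d` helper, and the assumption that no nonlinearity sits between the two layers are all mine, inferred from the fact that the merge must be exact.

```python
import numpy as np

def conv2d(x, w, b):
    """Naive valid cross-correlation.
    x: (C_in, H, W), w: (C_out, C_in, k, k), b: (C_out,)"""
    c_out, c_in, k, _ = w.shape
    H = x.shape[1] - k + 1
    W = x.shape[2] - k + 1
    y = np.zeros((c_out, H, W))
    for o in range(c_out):
        for i in range(H):
            for j in range(W):
                y[o, i, j] = np.sum(x[:, i:i + k, j:j + k] * w[o]) + b[o]
    return y

def merge_conv_pointwise(w_std, b_std, w_pw, b_pw):
    """Fold a point-wise conv (C_out, C_mid, 1, 1) into the preceding
    standard conv (C_mid, C_in, k, k). Because both layers are linear,
    the point-wise layer is just a linear recombination of the standard
    layer's output channels, so the weights compose exactly."""
    p = w_pw[:, :, 0, 0]                              # (C_out, C_mid)
    w_merged = np.einsum('oc,cikl->oikl', p, w_std)   # (C_out, C_in, k, k)
    b_merged = p @ b_std + b_pw                       # (C_out,)
    return w_merged, b_merged
```

With this merge, applying the standard conv and then the point-wise conv gives the same output as a single conv with the merged weights, so redundant (identical) output channels of the merged layer can be pruned without changing the network architecture. The equivalence only holds because there is no activation function between the two layers.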

Original language: English
Title of host publication: ISCAS 2023 - 56th IEEE International Symposium on Circuits and Systems, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781665451093
DOIs
State: Published - 2023
Event: 56th IEEE International Symposium on Circuits and Systems, ISCAS 2023 - Monterey, United States
Duration: 21 May 2023 - 25 May 2023

Publication series

Name: Proceedings - IEEE International Symposium on Circuits and Systems
Volume: 2023-May
ISSN (Print): 0271-4310

Conference

Conference: 56th IEEE International Symposium on Circuits and Systems, ISCAS 2023
Country/Territory: United States
City: Monterey
Period: 21/05/23 - 25/05/23

Keywords

  • Channel Pruning
  • CNN
  • Network Compression
  • Reparameterization

