An Approximate Memory Architecture for a Reduction of Refresh Power Consumption in Deep Learning Applications

Duy Thanh Nguyen, Hyun Kim, Hyuk Jae Lee, Ik Joon Chang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

52 Scopus citations

Abstract

A DRAM device requires periodic refresh operations to preserve data integrity, which incurs significant power consumption. This paper proposes a new memory architecture that reduces refresh power consumption by slowing down the refresh rate. A slow refresh may cause a loss of data stored in a DRAM cell, which affects the correctness of computations that use the lost data. The proposed memory architecture avoids this problem by taking advantage of the error-tolerant property of deep learning applications, which can tolerate a small number of errors. For data storage in deep learning applications, the approximate DRAM architecture stores data in a transposed manner so that they are sorted according to their significance. The DRAM organization is modified to support control of the refresh period according to the significance of the stored data. Simulation results with GoogLeNet and VGG-16 show that the power consumption is reduced by 69.68% with a negligible drop of classification accuracy for both networks.
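The bit-transposed storage described in the abstract can be illustrated with a short sketch. This is a minimal illustration only, assuming 8-bit quantized weights held in NumPy arrays; the function names, the number of bit planes, and the mapping of high planes to normally refreshed rows versus low planes to slowly refreshed rows are assumptions made for illustration, not the paper's exact implementation.

import numpy as np

def transpose_to_bit_planes(weights_u8):
    """Split 8-bit quantized weights into 8 bit planes (plane 7 = MSBs).

    Storing each plane contiguously means a DRAM row holds bits of a
    single significance level, so its refresh period can be chosen to
    match that significance (normal refresh for high planes, slow
    refresh for low planes, following the abstract's idea).
    """
    w = np.asarray(weights_u8, dtype=np.uint8)
    return [((w >> b) & 1).astype(np.uint8) for b in range(8)]

def reassemble_from_bit_planes(planes):
    """Inverse transform: pack the bit planes back into 8-bit weights."""
    w = np.zeros_like(planes[0], dtype=np.uint16)
    for b, plane in enumerate(planes):
        w |= plane.astype(np.uint16) << b
    return w.astype(np.uint8)

# Example: a bit flip in the LSB plane (plane 0) perturbs a weight by at
# most 1/256 of full scale, while the MSB plane remains fully refreshed.
weights = np.array([0x3A, 0xFF, 0x01, 0x80], dtype=np.uint8)
planes = transpose_to_bit_planes(weights)
assert np.array_equal(reassemble_from_bit_planes(planes), weights)

Under this layout, a slow-refresh error only ever corrupts low-order bits, which is why the accuracy drop reported for GoogLeNet and VGG-16 stays negligible.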

Original language: English
Title of host publication: 2018 IEEE International Symposium on Circuits and Systems, ISCAS 2018 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781538648810
DOIs
State: Published - 26 Apr 2018
Event: 2018 IEEE International Symposium on Circuits and Systems, ISCAS 2018 - Florence, Italy
Duration: 27 May 2018 → 30 May 2018

Publication series

Name: Proceedings - IEEE International Symposium on Circuits and Systems
Volume: 2018-May
ISSN (Print): 0271-4310

Conference

Conference: 2018 IEEE International Symposium on Circuits and Systems, ISCAS 2018
Country/Territory: Italy
City: Florence
Period: 27/05/18 → 30/05/18

Keywords

  • approximate computing
  • approximate memory
  • deep learning
  • low-power DRAM
  • row-level refresh
