Performance Analysis of a Phase-Change Memory System on Various CNN Inference Workloads

Jihoon Jang, Hyun Kim, Hyokeun Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

In this paper, we analyze the suitability of convolutional neural network (CNN) inference workloads for a phase-change memory (PCM) platform. CNN inference issues an average of 14× more read requests than write requests (i.e., it is read dominant) and exhibits a significantly low last-level cache misses per kilo instructions (LLC MPKI) of 2 on average (i.e., it is computation intensive). In addition, to compare the latency and energy of PCM and DRAM systems, we evaluate CNN inference workloads on both memory systems using a memory simulator. As a result, compared to DRAM, PCM saves 54% of total energy on average, but the instructions per cycle (IPC) of PCM is reduced by an average of 28%. In conclusion, CNN inference is a workload suitable for PCM in terms of energy efficiency, but a scheme to improve IPC is required for practical use.
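The abstract's comparison rests on two standard metrics, LLC MPKI and IPC, plus relative energy between PCM and DRAM. As a reading aid only, the following is a minimal Python sketch (not from the paper) of how these metrics are conventionally derived from memory-simulator counters; all variable names and numbers are hypothetical placeholders.

```python
# Minimal sketch: conventional definitions of LLC MPKI and IPC.
# The statistics below are hypothetical, not results from the paper.

def llc_mpki(llc_misses: int, instructions: int) -> float:
    """Last-level cache misses per kilo instructions."""
    return llc_misses / (instructions / 1_000)

def ipc(instructions: int, cycles: int) -> float:
    """Instructions per cycle."""
    return instructions / cycles

# Hypothetical counters from one simulated CNN inference workload.
stats = {
    "instructions": 2_000_000_000,
    "cycles": 2_500_000_000,
    "llc_misses": 4_000_000,
}

print(f"LLC MPKI: {llc_mpki(stats['llc_misses'], stats['instructions']):.2f}")  # 2.00
print(f"IPC:      {ipc(stats['instructions'], stats['cycles']):.2f}")           # 0.80

# The relative figures quoted in the abstract correspond to:
#   energy saving  = 1 - E_PCM / E_DRAM      (about 0.54 on average)
#   IPC reduction  = 1 - IPC_PCM / IPC_DRAM  (about 0.28 on average)
```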

Original language: English
Title of host publication: Proceedings - International SoC Design Conference 2022, ISOCC 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 133-134
Number of pages: 2
ISBN (Electronic): 9781665459716
DOIs
State: Published - 2022
Event: 19th International System-on-Chip Design Conference, ISOCC 2022 - Gangneung-si, Korea, Republic of
Duration: 19 Oct 2022 - 22 Oct 2022

Publication series

Name: Proceedings - International SoC Design Conference 2022, ISOCC 2022

Conference

Conference: 19th International System-on-Chip Design Conference, ISOCC 2022
Country/Territory: Korea, Republic of
City: Gangneung-si
Period: 19/10/22 - 22/10/22

Keywords

  • convolutional neural networks
  • memory simulation
  • non-volatile memory
  • phase-change memory
