TY - GEN
T1 - Characterizing Memory Access Patterns of Various Convolutional Neural Networks for Utilizing Processing-in-Memory
AU - Jang, Jihoon
AU - Kim, Hyun
AU - Lee, Hyokeun
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Convolutional neural network (CNN) models require deeper networks and more training data to achieve better performance, which in turn increases their computational and memory requirements. In this paper, we analyze the memory access patterns that occur in main memory during the training of various CNN models. Each iteration of CNN training consists of a forward pass (FP) followed by a backward pass (BP). Our analysis shows that BP accounts for 83.4% of total main memory accesses on average. Therefore, CNN training, which includes both FP and BP, is far more memory-intensive than CNN inference, which uses only FP. This demonstrates that CNN training is a suitable application for near-data processing to reduce memory bottlenecks and conserve computational resources.
AB - Convolutional neural network (CNN) models require deeper networks and more training data to achieve better performance, which in turn increases their computational and memory requirements. In this paper, we analyze the memory access patterns that occur in main memory during the training of various CNN models. Each iteration of CNN training consists of a forward pass (FP) followed by a backward pass (BP). Our analysis shows that BP accounts for 83.4% of total main memory accesses on average. Therefore, CNN training, which includes both FP and BP, is far more memory-intensive than CNN inference, which uses only FP. This demonstrates that CNN training is a suitable application for near-data processing to reduce memory bottlenecks and conserve computational resources.
KW - Convolutional Neural Networks
KW - Memory access pattern
KW - Near data processing
KW - Network training
KW - Processing-in-Memory
UR - http://www.scopus.com/inward/record.url?scp=85150473541&partnerID=8YFLogxK
U2 - 10.1109/ICEIC57457.2023.10049894
DO - 10.1109/ICEIC57457.2023.10049894
M3 - Conference contribution
AN - SCOPUS:85150473541
T3 - 2023 International Conference on Electronics, Information, and Communication, ICEIC 2023
BT - 2023 International Conference on Electronics, Information, and Communication, ICEIC 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2023 International Conference on Electronics, Information, and Communication, ICEIC 2023
Y2 - 5 February 2023 through 8 February 2023
ER -