Intrinsic variation effect in memristive neural network with weight quantization

Jinwoo Park, Min Suk Song, Sangwook Youn, Tae Hyeon Kim, Sungjoon Kim, Kyungho Hong, Hyungjin Kim

Research output: Contribution to journal › Article › peer-review


Abstract

To analyze the effect of the intrinsic variations of memristor devices on a neuromorphic system, we fabricated a 32 × 32 Al2O3/TiOx-based memristor crossbar array and implemented 3-bit multilevel conductance as weight quantization, utilizing the switching characteristics to minimize the performance degradation of the neural network. The tuning operation for 8 weight levels was confirmed with a tolerance of ±4 μA (±40 μS). The endurance and retention characteristics were also verified, and the random telegraph noise (RTN) characteristics were measured as a function of the weight range to evaluate the internal stochastic variation effect. Subsequently, a memristive neural network was constructed by off-chip training with differential memristor pairs for the Modified National Institute of Standards and Technology (MNIST) handwritten digit dataset. The pre-trained weights were quantized, and the classification accuracy was evaluated by applying the intrinsic variations to each quantized weight. The intrinsic variations were applied using the measured weight inaccuracy given by the tuning tolerance, the RTN characteristics, and the faulty device yield. We believe these results should be considered when pre-trained weights are transferred to a memristive neural network by off-chip training.
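
As a rough illustration of this evaluation flow (a sketch under assumed parameters, not the authors' code), the snippet below maps pre-trained weights onto differential conductance pairs, quantizes each conductance to 8 levels, and perturbs it with a tuning-tolerance error, an RTN-like fluctuation, and a faulty-device mask before converting back to effective weights. The conductance window, noise magnitudes (SIGMA_TUNE, SIGMA_RTN), and fault rate are illustrative placeholders, not the measured values reported in the paper.

```python
import numpy as np

# Illustrative parameters (assumed for this sketch, not measured values from the paper)
G_MIN, G_MAX = 40e-6, 320e-6   # programmable conductance window, in siemens
N_LEVELS = 8                   # 3-bit weight quantization -> 8 conductance levels
SIGMA_TUNE = 4e-6              # programming (tuning) tolerance, S
SIGMA_RTN = 2e-6               # RTN-like read fluctuation, S
FAULT_RATE = 0.01              # fraction of devices stuck at the low-conductance state

LEVELS = np.linspace(G_MIN, G_MAX, N_LEVELS)

def to_differential_pair(w):
    """Map normalized weights w in [-1, 1] onto a differential pair (g_plus, g_minus)."""
    g_plus = G_MIN + np.clip(w, 0, 1) * (G_MAX - G_MIN)
    g_minus = G_MIN + np.clip(-w, 0, 1) * (G_MAX - G_MIN)
    return g_plus, g_minus

def quantize(g):
    """Snap each conductance to the nearest of the 8 programmable levels."""
    idx = np.argmin(np.abs(g[..., None] - LEVELS), axis=-1)
    return LEVELS[idx]

def apply_variation(g, rng):
    """Add tuning error, RTN fluctuation, and stuck faulty devices."""
    g = g + rng.normal(0.0, SIGMA_TUNE, g.shape)   # weight inaccuracy from tuning tolerance
    g = g + rng.normal(0.0, SIGMA_RTN, g.shape)    # RTN fluctuation at read time
    stuck = rng.random(g.shape) < FAULT_RATE
    g = np.where(stuck, G_MIN, g)                  # faulty devices pinned to G_MIN
    return np.clip(g, G_MIN, G_MAX)

def effective_weights(w, rng):
    """Return the weight matrix actually realized on the crossbar."""
    g_plus, g_minus = to_differential_pair(w)
    g_plus = apply_variation(quantize(g_plus), rng)
    g_minus = apply_variation(quantize(g_minus), rng)
    return (g_plus - g_minus) / (G_MAX - G_MIN)    # back to normalized weight units

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.uniform(-1, 1, size=(32, 32))          # stand-in for pre-trained, normalized weights
    w_hw = effective_weights(w, rng)
    print("mean absolute weight error:", np.mean(np.abs(w_hw - w)))
```

Running MNIST inference with the perturbed weight matrix in place of the ideal one would then indicate how much classification accuracy each variation source costs after off-chip training.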

Original language: English
Article number: 375203
Journal: Nanotechnology
Volume: 33
Issue number: 37
DOIs
State: Published - 10 Sep 2022

Keywords

  • intrinsic variation
  • memristive neural network
  • memristor crossbar array
  • neuromorphic system
  • off-chip training
  • weight quantization
