MATE: Memory- and Retraining-Free Error Correction for Convolutional Neural Network Weights

Myeungjae Jang, Jeongkyu Hong

Research output: Contribution to journal › Article › peer-review

Abstract

Convolutional neural networks (CNNs) are among the most widely used artificial intelligence techniques. Among CNN-based applications, small, timing-sensitive applications have emerged that must be reliable to prevent severe accidents. However, because such small, timing-sensitive systems lack sufficient system resources, they do not employ proper error protection schemes. In this paper, we propose MATE, a low-cost error correction technique for CNN weights. Based on the observation that not all mantissa bits are closely related to accuracy, MATE replaces some mantissa bits in each weight with error correction codes. MATE therefore provides strong data protection without requiring additional memory space or modifications to the memory architecture. The experimental results demonstrate that MATE retains nearly the same accuracy as the ideal error-free case on erroneous DRAM and achieves approximately 60% accuracy even at extremely high bit error rates.
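The abstract states only the core idea: because the low-order mantissa bits of FP32 weights barely affect accuracy, they can be repurposed to hold error correction codes for the bits that do matter. The exact bit layout and code are not given here, so the following Python sketch is only an illustration of that idea under assumed parameters: it protects the upper 16 bits of each float32 weight (sign, exponent, and the 7 high mantissa bits) with a Hamming(21,16) single-error-correcting code and stores the 5 parity bits in low mantissa bits that would otherwise be truncated. The function names mate_style_encode/mate_style_decode and the 16+5 bit split are illustrative assumptions, not the paper's actual scheme.

```python
import struct

# Assumed layout (illustration only): protect the upper 16 bits of each
# IEEE-754 float32 weight (sign + exponent + 7 high mantissa bits) with a
# Hamming(21,16) single-error-correcting code, and store the 5 parity bits
# in low mantissa bits that would otherwise be truncated.
DATA_BITS = 16
PARITY_BITS = 5
PARITY_POS = [1, 2, 4, 8, 16]          # power-of-two positions in the codeword


def float_to_u32(x: float) -> int:
    return struct.unpack(">I", struct.pack(">f", x))[0]


def u32_to_float(w: int) -> float:
    return struct.unpack(">f", struct.pack(">I", w))[0]


def hamming_encode(data: int) -> int:
    """Return the 5 parity bits of a 16-bit word under a Hamming(21,16) layout."""
    parity = 0
    for i, p in enumerate(PARITY_POS):
        bit, d = 0, 0
        for pos in range(1, DATA_BITS + PARITY_BITS + 1):
            if pos in PARITY_POS:
                continue
            if pos & p:
                bit ^= (data >> d) & 1
            d += 1
        parity |= bit << i
    return parity


def hamming_correct(data: int, parity: int) -> int:
    """Correct at most one flipped bit in the 16 protected data bits."""
    diff = parity ^ hamming_encode(data)
    syndrome = 0
    for i, p in enumerate(PARITY_POS):
        if (diff >> i) & 1:
            syndrome |= p
    if syndrome == 0 or syndrome in PARITY_POS:
        return data                     # no error, or the flip hit a parity bit
    d = 0
    for pos in range(1, DATA_BITS + PARITY_BITS + 1):
        if pos in PARITY_POS:
            continue
        if pos == syndrome:
            return data ^ (1 << d)      # flip the erroneous data bit back
        d += 1
    return data                         # syndrome out of range (multi-bit error)


def mate_style_encode(weight: float) -> int:
    """Pack a weight: keep its 16 MSBs, embed parity in the freed mantissa bits."""
    protected = float_to_u32(weight) >> 16
    return (protected << 16) | hamming_encode(protected)


def mate_style_decode(stored: int) -> float:
    """Unpack a stored word, correcting a single bit flip in the protected field."""
    protected = hamming_correct(stored >> 16, stored & 0x1F)
    return u32_to_float(protected << 16)   # truncated mantissa bits read back as zero


if __name__ == "__main__":
    w = 0.7231911
    stored = mate_style_encode(w)
    corrupted = stored ^ (1 << 30)      # simulate a DRAM flip in the exponent field
    print(mate_style_decode(stored))    # close to w (low mantissa bits truncated)
    print(mate_style_decode(corrupted)) # same value: the single flip is corrected
```

Because the parity bits live inside the 32-bit word itself, the protected weight occupies exactly the same storage as the original, which is the property the abstract highlights: no additional memory space and no changes to the memory architecture.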

Original language: English
Pages (from-to): 22-28
Number of pages: 7
Journal: Journal of Information and Communication Convergence Engineering
Volume: 19
Issue number: 1
DOIs
State: Published - Mar 2021

Keywords

  • Convolutional neural network
  • Error correction codes
  • Main memory
  • Reliability
  • Weight data
