TY - JOUR
T1 - Revisiting Stochastic Computing in the Era of Nanoscale Nonvolatile Technologies
AU - Agrawal, Amogh
AU - Chakraborty, Indranil
AU - Roy, Deboleena
AU - Saxena, Utkarsh
AU - Sharmin, Saima
AU - Koo, Minsuk
AU - Shim, Yong
AU - Srinivasan, Gopalakrishnan
AU - Liyanagedera, Chamika
AU - Sengupta, Abhronil
AU - Roy, Kaushik
N1 - Publisher Copyright:
© 1993-2012 IEEE.
PY - 2020/12
Y1 - 2020/12
N2 - In this era of nanoscale technologies, the inherent characteristics of some nonvolatile devices, such as resistive random access memory (ReRAM), phase-change material (PCM), and spintronics, can emulate stochastic functionalities. Traditionally, these devices have been engineered to suppress the stochastic switching behavior, as it poses reliability concerns for memory storage and logic applications. However, leveraging the stochasticity in such devices has led to renewed interest in hardware-software codesign of stochastic algorithms, since CMOS-based implementations of stochastic algorithms involve cumbersome circuitry to generate 'stochastic bits.' In this article, we consider two classes of problems: deep neural networks (DNNs) and combinatorial optimization. The rapidly growing demands of artificial intelligence (AI) have sparked an interest in energy-efficient implementations of large DNNs, with binary representations of synaptic weights and neuronal activities. Stochasticity plays an important role in leveraging the benefits of these binary representations, leading to model compression and optimization during training. In combinatorial optimization problems, such as graph coloring or the traveling salesman problem, stochastic algorithms such as the Ising computing model have been shown to be effective. These problems require exhaustive computational procedures, and the Ising model uses a natural annealing agent to achieve near-optimal solutions in a reasonable timescale without getting stuck in 'local minima.' In this article, we present a broad review of stochastic computing that utilizes the stochastic switching characteristics of devices based on nanoscale nonvolatile technologies. We show how codesign of the devices and algorithms can enable optimal solutions for both combinatorial problems and binary neural networks for local learning and inference. Directly mapping the nonvolatile device characteristics to the stochastic algorithms, without the need to store stochastic bits in a separate memory, leads to efficient use of hardware.
AB - In this era of nanoscale technologies, the inherent characteristics of some nonvolatile devices, such as resistive random access memory (ReRAM), phase-change material (PCM), and spintronics, can emulate stochastic functionalities. Traditionally, these devices have been engineered to suppress the stochastic switching behavior, as it poses reliability concerns for memory storage and logic applications. However, leveraging the stochasticity in such devices has led to renewed interest in hardware-software codesign of stochastic algorithms, since CMOS-based implementations of stochastic algorithms involve cumbersome circuitry to generate 'stochastic bits.' In this article, we consider two classes of problems: deep neural networks (DNNs) and combinatorial optimization. The rapidly growing demands of artificial intelligence (AI) have sparked an interest in energy-efficient implementations of large DNNs, with binary representations of synaptic weights and neuronal activities. Stochasticity plays an important role in leveraging the benefits of these binary representations, leading to model compression and optimization during training. In combinatorial optimization problems, such as graph coloring or the traveling salesman problem, stochastic algorithms such as the Ising computing model have been shown to be effective. These problems require exhaustive computational procedures, and the Ising model uses a natural annealing agent to achieve near-optimal solutions in a reasonable timescale without getting stuck in 'local minima.' In this article, we present a broad review of stochastic computing that utilizes the stochastic switching characteristics of devices based on nanoscale nonvolatile technologies. We show how codesign of the devices and algorithms can enable optimal solutions for both combinatorial problems and binary neural networks for local learning and inference. Directly mapping the nonvolatile device characteristics to the stochastic algorithms, without the need to store stochastic bits in a separate memory, leads to efficient use of hardware.
KW - Combinatorial optimization
KW - Ising spin model
KW - neuromorphic computing
KW - nonvolatile memory (NVM)
KW - stochastic computing
KW - stochastic neural networks
UR - http://www.scopus.com/inward/record.url?scp=85097345804&partnerID=8YFLogxK
U2 - 10.1109/TVLSI.2020.2991679
DO - 10.1109/TVLSI.2020.2991679
M3 - Article
AN - SCOPUS:85097345804
SN - 1063-8210
VL - 28
SP - 2481
EP - 2494
JO - IEEE Transactions on Very Large Scale Integration (VLSI) Systems
JF - IEEE Transactions on Very Large Scale Integration (VLSI) Systems
IS - 12
M1 - 9096622
ER -