TraNNsformer: Neural network transformation for memristive crossbar based neuromorphic system design
August 26, 2017 · Declared Dead · 2017 IEEE/ACM International Conference on Computer-Aided Design (ICCAD)
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Aayush Ankit, Abhronil Sengupta, Kaushik Roy
arXiv ID
1708.07949
Category
cs.ET: Emerging Technologies
Cross-listed
cs.NE
Citations
38
Venue
2017 IEEE/ACM International Conference on Computer-Aided Design (ICCAD)
Last Checked
2 months ago
Abstract
Implementation of neuromorphic systems using post-Complementary Metal-Oxide-Semiconductor (CMOS) Memristive Crossbar Arrays (MCA) has emerged as a promising solution for low-power acceleration of neural networks. However, the recent trend of designing Deep Neural Networks (DNNs) for human-like cognitive abilities poses significant challenges for the scalable design of neuromorphic systems, owing to the increased computation and storage demands. Network pruning [7] is a powerful technique for removing redundant connections and designing optimally connected (maximally sparse) DNNs. However, such pruning techniques induce irregular connections that do not conform to the crossbar structure, ultimately yielding DNNs with highly inefficient hardware realizations in terms of area and energy. In this work, we propose TraNNsformer - an integrated training framework that transforms DNNs to enable their efficient realization on MCA-based systems. TraNNsformer first prunes the connectivity matrix while forming clusters from the remaining connections. Subsequently, it retrains the network to fine-tune the connections and reinforce the clusters. This is done iteratively to transform the original connectivity into an optimally pruned and maximally clustered mapping. Without accuracy loss, TraNNsformer reduces area (energy) consumption by 28%-55% (49%-67%) with respect to the original network. Compared to network pruning alone, TraNNsformer achieves 28%-49% (15%-29%) area (energy) savings. Furthermore, TraNNsformer is a technology-aware framework that can map a given DNN to any MCA size the memristive technology permits for reliable operation.
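For readers who want the mechanics, here is a minimal sketch of the prune-cluster-retrain loop the abstract describes, written with NumPy. Since no code was found for this paper, everything below is an illustrative assumption: a simple block-magnitude heuristic stands in for the paper's clustering step, and all names and parameters (cluster_aware_prune, trannsform, crossbar_size, keep_ratio, retrain_fn) are hypothetical.

# Minimal sketch of a TraNNsformer-style prune-cluster-retrain loop.
# Illustrative only: a block-magnitude heuristic stands in for the
# paper's clustering step; all names/parameters are assumptions.
import numpy as np

def cluster_aware_prune(weights, crossbar_size=64, keep_ratio=0.5):
    """Keep whole crossbar-sized blocks of the connectivity matrix,
    highest total magnitude first, so the surviving connections cluster
    onto as few memristive crossbar arrays (MCAs) as possible."""
    rows, cols = weights.shape
    mask = np.zeros_like(weights, dtype=bool)
    blocks = []
    for r in range(0, rows, crossbar_size):
        for c in range(0, cols, crossbar_size):
            blk = (slice(r, min(r + crossbar_size, rows)),
                   slice(c, min(c + crossbar_size, cols)))
            blocks.append((np.abs(weights[blk]).sum(), blk))
    blocks.sort(key=lambda t: t[0], reverse=True)  # densest blocks first
    budget = int(keep_ratio * weights.size)        # connections to keep
    kept = 0
    for _, blk in blocks:
        if kept >= budget:
            break
        mask[blk] = True
        kept += weights[blk].size
    return weights * mask, mask

def trannsform(weights, retrain_fn, iterations=5, crossbar_size=64):
    """Iterate: prune into crossbar-aligned clusters, then fine-tune.
    retrain_fn stands in for gradient-based retraining that updates only
    the unmasked weights, reinforcing the clusters between iterations."""
    mask = np.ones_like(weights, dtype=bool)
    for _ in range(iterations):
        weights, mask = cluster_aware_prune(weights, crossbar_size)
        weights = retrain_fn(weights, mask)
    return weights, mask

The crossbar_size argument mirrors the technology-aware claim at the end of the abstract: blocks are sized to whatever MCA dimension the memristive technology can operate reliably, so the same DNN could be re-mapped by changing a single parameter.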
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Emerging Technologies
R.I.P. 👻 Ghosted · In-memory hyperdimensional computing
R.I.P. 👻 Ghosted · Magnetic skyrmion-based synaptic devices
R.I.P. 👻 Ghosted · Memristors -- from In-memory computing, Deep Learning Acceleration, Spiking Neural Networks, to the Future of Neuromorphic and Bio-inspired Computing
R.I.P. 👻 Ghosted · DNA-Based Storage: Trends and Methods
R.I.P. 👻 Ghosted · Neuro-memristive Circuits for Edge Computing: A review
Died the same way – 👻 Ghosted
R.I.P. 👻 Ghosted · Language Models are Few-Shot Learners
R.I.P. 👻 Ghosted · PyTorch: An Imperative Style, High-Performance Deep Learning Library
R.I.P. 👻 Ghosted · XGBoost: A Scalable Tree Boosting System