Computation reuse-aware accelerator for neural networks
2020 (English). In: Hardware Architectures for Deep Learning, Institution of Engineering and Technology, 2020, p. 147-158. Chapter in book (Other academic)
Abstract [en]
Power consumption has long been a significant concern in neural networks. In particular, large neural networks that implement novel machine learning techniques require far more computation, and hence power, than ever before. In this chapter, we showed that computation reuse can exploit the inherent redundancy in the arithmetic operations of a neural network to save power. Experimental results showed that computation reuse, when coupled with the approximation property of neural networks, can eliminate up to 90% of multiplications, reducing power consumption by 61% on average in the presented architecture. The proposed computation reuse-aware design can be extended in several ways. First, it can be integrated into state-of-the-art customized architectures for LSTM, spiking, and convolutional neural network models to further reduce power consumption. Second, computation reuse can be coupled with existing mapping and scheduling algorithms to develop reuse-aware scheduling and mapping methods for neural networks. Computation reuse can also boost the performance of methods that eliminate ineffectual computations in deep learning neural networks. Evaluating the impact of CORN on reliability and customizing the CORN architecture for FPGA-based neural network implementations are further directions for future work along this line.
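The idea the abstract describes — exploiting redundancy in arithmetic operations by reusing previously computed products, with input quantization (the "approximation property") increasing how often operand pairs repeat — can be illustrated with a minimal software sketch. This is not the CORN architecture itself; the `quantize` function, cache layout, and layer sizes below are illustrative assumptions chosen to show the mechanism and to count how many multiplier invocations a reuse cache avoids.

```python
# Hypothetical sketch of computation reuse in one fully connected layer.
# Inputs are quantized to a few discrete levels (approximation), so many
# (weight, input) operand pairs repeat; each distinct product is computed
# once and then served from a cache instead of invoking the multiplier.
import random

def quantize(x, levels=16):
    """Map x in [0, 1] to one of a small set of discrete values (assumed scheme)."""
    return round(x * levels) / levels

def layer_with_reuse(weights, inputs):
    cache = {}           # (weight, quantized input) -> cached product
    multiplications = 0  # actual multiplier invocations performed
    outputs = []
    for w_row in weights:
        acc = 0.0
        for w, x in zip(w_row, inputs):
            key = (w, quantize(x))
            if key not in cache:
                cache[key] = w * quantize(x)  # only path that multiplies
                multiplications += 1
            acc += cache[key]                 # reuse: addition only
        outputs.append(acc)
    return outputs, multiplications

random.seed(0)
weights = [[quantize(random.random()) for _ in range(64)] for _ in range(32)]
inputs = [random.random() for _ in range(64)]
outs, mults = layer_with_reuse(weights, inputs)
total = 32 * 64
print(f"performed {mults} of {total} multiplications "
      f"({100 * (1 - mults / total):.0f}% eliminated)")
```

With 16-level quantization there are at most 17 × 17 distinct operand pairs, so the cache bounds the multiplier invocations well below the 2048 nominal products; a hardware realization would replace the dictionary with a small lookup structure near the multiplier array.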
Place, publisher, year, edition, pages
Institution of Engineering and Technology, 2020, p. 147-158
Keywords [en]
Arithmetic operations, Computation reuse-aware accelerator, Convolutional neural nets, Convolutional neural network, Learning (artificial intelligence), LSTM, Machine learning, Neural networks, Power aware computing, Power consumption, Spiking neural network
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:mdh:diva-62529
DOI: 10.1049/PBCS055E_ch7
Scopus ID: 2-s2.0-85106133376
ISBN: 9781785617683 (print)
OAI: oai:DiVA.org:mdh-62529
DiVA id: diva2:1761051
2023-05-31. Bibliographically approved