Hardware architectures for deep learning
2020 (English) Collection (editor) (Other academic)
Abstract [en]
This book presents and discusses innovative ideas in the design, modelling, implementation, and optimization of hardware platforms for neural networks. The rapid growth of server, desktop, and embedded applications based on deep learning has renewed interest in neural networks, with applications including image and speech processing, data analytics, robotics, healthcare monitoring, and IoT solutions. Efficient implementation of neural networks to support complex deep learning-based applications is a major challenge for embedded and mobile computing platforms with limited computational and storage resources and a tight power budget. Even for cloud-scale systems, it is critical to select the right hardware configuration, based on neural network complexity and system constraints, to increase power and performance efficiency. Hardware Architectures for Deep Learning provides an overview of this new field, from principles to applications, for researchers, postgraduate students, and engineers who work on learning-based services and hardware platforms.
Place, publisher, year, edition, pages
Institution of Engineering and Technology, 2020. p. 1-306
Keywords [en]
Analog accelerators, Binary data representations, Convolutional neural networks, Deep learning hardware, Embedded systems, Error-tolerance, Feedforward models, Feedforward neural nets, Hardware accelerators, Hardware architectures, Inverter-based memristive neuromorphic circuit, Learning (artificial intelligence), Low-precision data representation, Model sparsity, Neural chips, Neural net architecture, Neuromorphic engineering, Recurrent neural nets, Recurrent neural network, RNN, Stochastic data representations, Ultra-low-power IoT smart applications
National Category
Computer Systems
Identifiers
URN: urn:nbn:se:mdh:diva-62375
DOI: 10.1049/PBCS055E
Scopus ID: 2-s2.0-85153666198
ISBN: 9781785617683 (print)
OAI: oai:DiVA.org:mdh-62375
DiVA, id: diva2:1754344
Available from: 2023-05-03 Created: 2023-05-03 Last updated: 2023-05-24 Bibliographically approved