Publications
A study of complex deep learning networks on high-performance, neuromorphic, and quantum computers
Abstract
Current deep learning approaches have been very successful using convolutional neural networks trained on large graphical-processing-unit-based computers. Three limitations of this approach are that (1) the networks are based on a simple layered topology, i.e., highly connected layers without intra-layer connections; (2) the networks are manually configured to achieve optimal results; and (3) the implementation of the network model is expensive in both cost and power. In this article, we evaluate deep learning models using three different computing architectures to address these problems: quantum computing to train complex topologies, high-performance computing to automatically determine network topology, and neuromorphic computing for a low-power hardware implementation. We use the MNIST dataset for our experiment, due to the input size limitations of current quantum computers. Our results show the …
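For context, the conventional baseline the abstract refers to is a fixed, hand-configured layered CNN trained on MNIST. The sketch below is an illustration only, not the paper's implementation; it assumes TensorFlow/Keras, and the layer sizes and hyperparameters are arbitrary choices made for the example.

```python
# Minimal sketch (not from the paper): a conventional layered CNN on MNIST,
# the kind of fixed, manually configured topology the article contrasts with
# quantum-, HPC-, and neuromorphic-based approaches.
# Assumes TensorFlow/Keras is installed; architecture and settings are illustrative.
import tensorflow as tf

# Load MNIST (28x28 grayscale digits) and scale pixels to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0

# A simple feed-forward stack of layers: no intra-layer connections,
# with the topology fixed by hand rather than discovered automatically.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, validation_data=(x_test, y_test))
```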
- Date: July 11, 2018
- Authors: Thomas E. Potok, Catherine Schuman, Steven Young, Robert Patton, Federico Spedalieri, Jeremy Liu, Ke-Thia Yao, Garrett Rose, Gangotree Chakma
- Journal: ACM Journal on Emerging Technologies in Computing Systems (JETC)
- Volume: 14
- Issue: 2
- Pages: 1-21
- Publisher: ACM