Quantum-limited stochastic optical neural networks operating at a few quanta per activation
July 28, 2023 · Declared Dead · Nature Communications
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors: Shi-Yuan Ma, Tianyu Wang, Jérémie Laydevant, Logan G. Wright, Peter L. McMahon
arXiv ID: 2307.15712
Category: physics.optics
Cross-listed: cs.ET, cs.LG, cs.NE, quant-ph
Citations: 25
Venue: Nature Communications
Last Checked: 2 months ago
Abstract
Energy efficiency in computation is ultimately limited by noise, with quantum limits setting the fundamental noise floor. Analog physical neural networks hold promise for improved energy efficiency compared to digital electronic neural networks. However, they are typically operated in a relatively high-power regime so that the signal-to-noise ratio (SNR) is large, and the noise can be treated as a perturbation. We study optical neural networks where all layers except the last are operated in the limit that each neuron can be activated by just a single photon, and as a result the noise on neuron activations is no longer merely perturbative. We show that by using a physics-based probabilistic model of the neuron activations in training, it is possible to perform accurate machine-learning inference in spite of the extremely high shot noise (SNR ~ 1). We experimentally demonstrated MNIST handwritten-digit classification with a test accuracy of 98% using an optical neural network with a hidden layer operating in the single-photon regime; the optical energy used to perform the classification corresponds to just 0.038 photons per multiply-accumulate (MAC) operation. Our physics-aware stochastic training approach might also prove useful with non-optical ultra-low-power hardware.
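The single-photon regime the abstract describes can be illustrated numerically: each neuron's optical intensity sets the mean of a Poisson-distributed photon count, so at roughly one photon per activation the shot-noise SNR (mean/std) is about 1, matching the "SNR ~ 1" claim. Below is a minimal stdlib-only sketch of such a photon-limited activation; the function names are illustrative, not the authors' code.

```python
import math
import random
import statistics

def sample_poisson(mean, rng):
    """Draw a photon count from Poisson(mean) via Knuth's method
    (adequate for the few-photon regime considered here)."""
    threshold = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def stochastic_photon_activation(intensity, rng):
    """A photon-limited neuron: the (non-negative) optical intensity sets
    the mean photon count; the detected count carries full shot noise."""
    return sample_poisson(max(intensity, 0.0), rng)

rng = random.Random(0)
# Drive 100k neurons at ~1 photon each and measure the shot-noise SNR.
counts = [stochastic_photon_activation(1.0, rng) for _ in range(100_000)]
mean = statistics.fmean(counts)
snr = mean / statistics.pstdev(counts)  # Poisson: SNR = sqrt(mean), so ~1 here
print(round(mean, 2), round(snr, 2))
```

Training through such a stochastic layer is what the paper's "physics-aware stochastic training" addresses: the noise is part of the forward model rather than a perturbation.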
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers

In the same crypt · physics.optics
- Training of photonic neural networks through in situ backpropagation · 👻 Ghosted
- Experimental robustness of Fourier Ptychography phase retrieval algorithms · 👻 Ghosted
- The physics of optical computing · 👻 Ghosted
- Freeform Diffractive Metagrating Design Based on Generative Adversarial Networks · 👻 Ghosted
- Scalable Optical Learning Operator · 👻 Ghosted

Died the same way · 👻 Ghosted
- Language Models are Few-Shot Learners · 👻 Ghosted
- PyTorch: An Imperative Style, High-Performance Deep Learning Library · 👻 Ghosted
- XGBoost: A Scalable Tree Boosting System · 👻 Ghosted