Learning to Program Variational Quantum Circuits with Fast Weights
February 27, 2024 · Declared Dead · IEEE International Joint Conference on Neural Networks
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Samuel Yen-Chi Chen
arXiv ID
2402.17760
Category
quant-ph: Quantum Computing
Cross-listed
cs.AI,
cs.ET,
cs.LG,
cs.NE
Citations
22
Venue
IEEE International Joint Conference on Neural Networks
Last Checked
2 months ago
Abstract
Quantum Machine Learning (QML) has surfaced as a pioneering framework addressing sequential control tasks and time-series modeling. It has demonstrated empirical quantum advantages notably within domains such as Reinforcement Learning (RL) and time-series prediction. A significant advancement lies in Quantum Recurrent Neural Networks (QRNNs), specifically tailored for memory-intensive tasks encompassing partially observable environments and non-linear time-series prediction. Nevertheless, QRNN-based models encounter challenges, notably prolonged training duration stemming from the necessity to compute quantum gradients using backpropagation-through-time (BPTT). This predicament exacerbates when executing the complete model on quantum devices, primarily due to the substantial demand for circuit evaluation arising from the parameter-shift rule. This paper introduces the Quantum Fast Weight Programmers (QFWP) as a solution to the temporal or sequential learning challenge. The QFWP leverages a classical neural network (referred to as the 'slow programmer') functioning as a quantum programmer to swiftly modify the parameters of a variational quantum circuit (termed the 'fast programmer'). Instead of completely overwriting the fast programmer at each time-step, the slow programmer generates parameter changes or updates for the quantum circuit parameters. This approach enables the fast programmer to incorporate past observations or information. Notably, the proposed QFWP model achieves learning of temporal dependencies without necessitating the use of quantum recurrent neural networks. Numerical simulations conducted in this study showcase the efficacy of the proposed QFWP model in both time-series prediction and RL tasks. The model exhibits performance levels either comparable to or surpassing those achieved by QLSTM-based models.
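The abstract describes the core QFWP mechanism: a classical "slow programmer" emits parameter *updates* (rather than replacements) for the variational quantum circuit at each time step, so the fast parameters accumulate past information. A minimal sketch of that update scheme is below. This is an illustration assuming a toy setup, not the paper's implementation: the slow programmer is a single linear layer, and the VQC is stood in for by a classically simulated single-qubit rotation circuit.

```python
import numpy as np

def vqc_output(theta):
    # Stand-in for the "fast programmer" VQC: a single qubit with two RY
    # rotations, whose <Z> expectation is cos(theta_0 + theta_1).
    return np.cos(theta.sum())

class SlowProgrammer:
    """Hypothetical classical slow programmer: a single linear layer that
    emits bounded parameter updates for the quantum circuit."""
    def __init__(self, input_dim, n_params, seed=0):
        rng = np.random.default_rng(seed)
        self.W = 0.1 * rng.standard_normal((n_params, input_dim))

    def delta(self, x):
        # tanh keeps each per-step update bounded
        return np.tanh(self.W @ x)

theta = np.zeros(2)  # fast (quantum circuit) parameters
slow = SlowProgrammer(input_dim=3, n_params=2)

sequence = [np.ones(3), -np.ones(3), np.ones(3)]
outputs = []
for x in sequence:
    # Additive update: theta is modified, not overwritten, so earlier
    # observations persist in the circuit parameters.
    theta = theta + slow.delta(x)
    outputs.append(vqc_output(theta))
```

The additive update is what lets the model capture temporal dependencies without a recurrent quantum circuit: overwriting `theta` at each step would discard the history that the accumulated updates encode.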
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Quantum Computing
The power of quantum neural networks
R.I.P.
👻
Ghosted
Power of data in quantum machine learning
R.I.P.
👻
Ghosted
Quantum machine learning: a classical perspective
R.I.P.
👻
Ghosted
Noise-Adaptive Compiler Mappings for Noisy Intermediate-Scale Quantum Computers
R.I.P.
👻
Ghosted
ProjectQ: An Open Source Software Framework for Quantum Computing
Died the same way · 👻 Ghosted
R.I.P.
👻
Ghosted
Language Models are Few-Shot Learners
R.I.P.
👻
Ghosted
PyTorch: An Imperative Style, High-Performance Deep Learning Library
R.I.P.
👻
Ghosted
XGBoost: A Scalable Tree Boosting System
R.I.P.
👻
Ghosted