Automated Metaheuristic Algorithm Design with Autoregressive Learning
May 06, 2024 · Declared Dead · 🏛 IEEE Transactions on Evolutionary Computation
"Paper promises code 'coming soon'"
Evidence collected by the PWNC Scanner
Authors
Qi Zhao, Tengfei Liu, Bai Yan, Qiqi Duan, Jian Yang, Yuhui Shi
arXiv ID
2405.03419
Category
cs.NE: Neural and Evolutionary Computing
Cross-listed
cs.LG
Citations
14
Venue
IEEE Transactions on Evolutionary Computation
Last Checked
2 months ago
Abstract
Automated design of metaheuristic algorithms offers an attractive avenue to reduce human effort and gain performance beyond human intuition. Current automated methods design algorithms within a fixed structure and operate from scratch. This leaves a clear gap: such methods can neither fully explore the potential of the metaheuristic family nor build on prior design experience. To bridge the gap, this paper proposes an autoregressive learning-based designer for automated design of metaheuristic algorithms. The designer formulates metaheuristic algorithm design as a sequence generation task and harnesses an autoregressive generative network to handle it. This offers two advances. First, through autoregressive inference, the designer generates algorithms with diverse lengths and structures, enabling full exploration of the metaheuristic family. Second, prior design knowledge learned and accumulated in the designer's neurons can be retrieved when designing algorithms for future problems, paving the way to continual algorithm design for open-ended problem-solving. Extensive experiments on numerical benchmarks and real-world problems show that the proposed designer generates algorithms that outperform all human-created baselines on 24 out of 25 test problems. The generated algorithms display varied structures and behaviors, fitting different problem-solving contexts well. Code will be released after paper publication.
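The design-as-sequence-generation idea from the abstract can be sketched with a toy autoregressive sampler. The operator vocabulary and the hand-set transition table below are illustrative assumptions standing in for the paper's learned network, not its actual model; the point is only that token-by-token sampling with a stop token yields algorithms of varying length and structure.

```python
import random

# Hypothetical operator vocabulary; the paper's real token set is not public.
VOCAB = ["crossover", "mutation", "local_search", "selection", "<end>"]

def autoregressive_design(transition, start="selection", max_len=8, rng=None):
    """Generate a variable-length operator sequence token by token.

    `transition` maps the previous token to a weight list over VOCAB,
    standing in for the autoregressive network's conditional distribution.
    Sampling stops at "<end>" or at max_len, so generated algorithms
    differ in both length and structure.
    """
    rng = rng or random.Random(0)
    seq = [start]
    while len(seq) < max_len:
        token = rng.choices(VOCAB, weights=transition[seq[-1]], k=1)[0]
        if token == "<end>":
            break
        seq.append(token)
    return seq

# Toy conditional distribution (an assumption, not learned weights).
TRANSITION = {
    "selection":    [0.4, 0.3, 0.2, 0.0, 0.1],
    "crossover":    [0.0, 0.6, 0.2, 0.1, 0.1],
    "mutation":     [0.1, 0.1, 0.3, 0.3, 0.2],
    "local_search": [0.1, 0.2, 0.0, 0.3, 0.4],
}

algorithm = autoregressive_design(TRANSITION)
print(algorithm)  # a list of operator names; length varies per sample
```

In the paper, the transition table's role is played by a trained generative network conditioned on the problem and the partial sequence; the loop structure above is the same.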
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
📜 Similar Papers
In the same crypt — Neural & Evolutionary
👻 Ghosted · Progressive Growing of GANs for Improved Quality, Stability, and Variation
👻 Ghosted · Learning both Weights and Connections for Efficient Neural Networks
👻 Ghosted · LSTM: A Search Space Odyssey
👻 Ghosted · A Baseline for Detecting Misclassified and Out-of-Distribution Examples in Neural Networks
👻 Ghosted · An Introduction to Convolutional Neural Networks
Died the same way — ⏳ Coming Soon™
⏳ Coming Soon™ · Exploring Simple Siamese Representation Learning
⏳ Coming Soon™ · An Analysis of Scale Invariance in Object Detection - SNIP
⏳ Coming Soon™ · Class-balanced Grouping and Sampling for Point Cloud 3D Object Detection