R.I.P.
👻
Ghosted
Evolutionary Preference Sampling for Pareto Set Learning
April 12, 2024 · Entered Twilight · Annual Conference on Genetic and Evolutionary Computation
Repo contents: README.md, eps.png, functions_evaluation.py, functions_hv_python3.py, model.py, problem.py, requirement.txt, run.py, utils.py
Authors
Rongguang Ye, Longcan Chen, Jinyuan Zhang, Hisao Ishibuchi
arXiv ID
2404.08414
Category
cs.NE: Neural & Evolutionary
Cross-listed
cs.AI
Citations
6
Venue
Annual Conference on Genetic and Evolutionary Computation
Repository
https://github.com/rG223/EPS
โญ 7
Last Checked
2 months ago
Abstract
Recently, Pareto Set Learning (PSL) has been proposed for learning the entire Pareto set using a neural network. PSL employs preference vectors to scalarize multiple objectives, facilitating the learning of mappings from preference vectors to specific Pareto optimal solutions. Previous PSL methods have shown their effectiveness in solving artificial multi-objective optimization problems (MOPs) with uniform preference vector sampling. The quality of the learned Pareto set depends on the preference vector sampling strategy, which should be chosen based on the shape of the Pareto front. However, a fixed preference sampling strategy cannot simultaneously adapt to the Pareto fronts of multiple MOPs. To address this limitation, this paper proposes an Evolutionary Preference Sampling (EPS) strategy to efficiently sample preference vectors. Inspired by evolutionary algorithms, we treat preference sampling as an evolutionary process that generates preference vectors for neural network training. We integrate the EPS strategy into five advanced PSL methods. Extensive experiments demonstrate that our proposed method converges faster than the baseline algorithms on seven test problems. Our implementation is available at https://github.com/rG223/EPS.
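The abstract describes two ingredients: scalarizing multiple objectives with a preference vector, and treating preference sampling as an evolutionary process. The sketch below illustrates both ideas in a minimal, self-contained way; the function names, the weighted-sum scalarization, and the Gaussian-mutation step are illustrative assumptions, not the paper's actual EPS operator (see the linked repository for the real implementation).

```python
import numpy as np

def weighted_sum_scalarize(objectives, preference):
    # Collapse an m-dimensional objective vector into a scalar using an
    # m-dimensional preference vector (non-negative, summing to 1).
    # Weighted sum is the simplest scalarization; PSL methods may use
    # others (e.g. Tchebycheff).
    return float(np.dot(preference, objectives))

def evolve_preferences(population, mutation_scale=0.1, rng=None):
    # Illustrative "evolutionary" preference-sampling step: perturb each
    # preference vector with Gaussian noise, clip to non-negative values,
    # and renormalize back onto the probability simplex. This conveys the
    # idea of evolving preference vectors between training iterations; it
    # is not the paper's exact EPS strategy.
    rng = np.random.default_rng() if rng is None else rng
    mutated = population + rng.normal(0.0, mutation_scale, population.shape)
    mutated = np.clip(mutated, 1e-8, None)
    return mutated / mutated.sum(axis=1, keepdims=True)

# Usage: sample 4 preference vectors for a 3-objective problem,
# then evolve them for the next round of network training.
rng = np.random.default_rng(0)
prefs = rng.dirichlet(np.ones(3), size=4)   # uniform sampling on the simplex
prefs = evolve_preferences(prefs, rng=rng)  # evolved preference vectors
```

Each evolved row remains a valid preference vector (non-negative entries summing to 1), so it can be fed directly to a scalarization such as `weighted_sum_scalarize` during training.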
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Neural & Evolutionary
Progressive Growing of GANs for Improved Quality, Stability, and Variation
Learning both Weights and Connections for Efficient Neural Networks
LSTM: A Search Space Odyssey
A Baseline for Detecting Misclassified and Out-of-Distribution Examples in Neural Networks