R.I.P.
👻
Ghosted
Coulomb GANs: Provably Optimal Nash Equilibria via Potential Fields
August 29, 2017 · Entered Twilight · International Conference on Learning Representations
"Last commit was 7.0 years ago (β₯5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: README.md, pytorch, tensorflow
Authors
Thomas Unterthiner, Bernhard Nessler, Calvin Seward, GΓΌnter Klambauer, Martin Heusel, Hubert Ramsauer, Sepp Hochreiter
arXiv ID
1708.08819
Category
cs.LG: Machine Learning
Cross-listed
cs.GT, stat.ML
Citations
76
Venue
International Conference on Learning Representations
Repository
https://github.com/bioinf-jku/coulomb_gan
⭐ 63
Last Checked
1 month ago
Abstract
Generative adversarial networks (GANs) evolved into one of the most successful unsupervised techniques for generating realistic images. Even though it has recently been shown that GAN training converges, GAN models often end up in local Nash equilibria that are associated with mode collapse or otherwise fail to model the target distribution. We introduce Coulomb GANs, which pose the GAN learning problem as a potential field of charged particles, where generated samples are attracted to training set samples but repel each other. The discriminator learns a potential field while the generator decreases the energy by moving its samples along the vector (force) field determined by the gradient of the potential field. Through decreasing the energy, the GAN model learns to generate samples according to the whole target distribution and does not only cover some of its modes. We prove that Coulomb GANs possess only one Nash equilibrium which is optimal in the sense that the model distribution equals the target distribution. We show the efficacy of Coulomb GANs on a variety of image datasets. On LSUN and celebA, Coulomb GANs set a new state of the art and produce a previously unseen variety of different samples.
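The abstract's core idea can be sketched numerically: real samples act as attracting charges and generated samples as repelling charges, and the potential at any point is the difference of the two averaged kernel sums. The snippet below is a minimal illustration only, assuming a Plummer-style kernel of the form 1/(‖a−b‖² + ε²)^((d−2)/2); the paper defines the exact kernel, dimension handling, and how the discriminator approximates this field during training.

```python
import numpy as np

def plummer_kernel(a, b, d=3, eps=1.0):
    # Pairwise Plummer-style kernel between point sets a (N, dim) and b (M, dim).
    # Illustrative form only: k(a, b) = 1 / (||a - b||^2 + eps^2)^((d - 2) / 2);
    # the exponent and smoothing follow the paper's definition in the real model.
    diff = a[:, None, :] - b[None, :, :]          # (N, M, dim) pairwise differences
    sq = np.sum(diff ** 2, axis=-1)               # squared distances, shape (N, M)
    return 1.0 / (sq + eps ** 2) ** ((d - 2) / 2)

def potential(x, real, fake, d=3, eps=1.0):
    # Potential at query points x: real samples attract (positive charge),
    # generated samples repel (negative charge); each set's charge is
    # averaged over its samples so the two distributions are comparable.
    attract = plummer_kernel(x, real, d, eps).mean(axis=1)
    repel = plummer_kernel(x, fake, d, eps).mean(axis=1)
    return attract - repel
```

In this toy picture, the generator would nudge its samples along the numerical gradient of `potential`, which pulls them toward under-covered regions of the target distribution while the mutual repulsion term keeps them spread out — the mechanism the abstract credits for avoiding mode collapse.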
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Machine Learning
XGBoost: A Scalable Tree Boosting System
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Semi-Supervised Classification with Graph Convolutional Networks
Proximal Policy Optimization Algorithms