Towards a Better Global Loss Landscape of GANs

November 10, 2020 · Entered Twilight · 🏛 Neural Information Processing Systems

🌅 TWILIGHT: Old Age
Predates the code-sharing era: a pioneer of its time

"Last commit was 5.0 years ago (≥5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: 5gaussian.py, GAN2_2Cluster.py, LICENSE, README.md, datasets.py, disc_resblocks.py, eval.py, fid_tf.py, gen_resblocks.py, inception_score_tf.py, networks.py, prd_score.py, pretrained-model, rsgan.py, scripts.txt, utils.py, vanillaGAN.py

Authors: Ruoyu Sun, Tiantian Fang, Alex Schwing
arXiv ID: 2011.04926
Category: cs.LG (Machine Learning)
Cross-listed: cs.AI, cs.CV, cs.IT, math.OC
Citations: 35
Venue: Neural Information Processing Systems
Repository: https://github.com/AilsaF/RS-GAN ⭐ 32
Last Checked: 2 months ago
Abstract
Understanding of GAN training is still very limited. One major challenge is its non-convex-non-concave min-max objective, which may lead to sub-optimal local minima. In this work, we perform a global landscape analysis of the empirical loss of GANs. We prove that a class of separable GANs, including the original JS-GAN, has exponentially many bad basins, which are perceived as mode collapse. We also study the relativistic pairing GAN (RpGAN) loss, which couples the generated samples and the true samples, and prove that RpGAN has no bad basins. Experiments on synthetic data show that the predicted bad basins can indeed appear in training. We also perform experiments to support our theory that RpGAN has a better landscape than separable GANs; for instance, we empirically show that RpGAN performs better than separable GANs with relatively narrow neural nets. The code is available at https://github.com/AilsaF/RS-GAN.
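The contrast the abstract draws is between a separable loss, where real and generated samples contribute independent terms, and the RpGAN loss, which scores each real sample relative to a paired generated one. A minimal NumPy sketch of the two discriminator losses, assuming logit-valued discriminator outputs; the function names are illustrative and not the repo's API (see rsgan.py and vanillaGAN.py for the authors' implementations):

```python
import numpy as np

def softplus(x):
    # Numerically stable log(1 + e^x).
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

def separable_gan_d_loss(d_real, d_fake):
    # JS-GAN style discriminator loss: real and fake logits enter
    # through two independent (separable) expectations.
    return np.mean(softplus(-d_real)) + np.mean(softplus(d_fake))

def rpgan_d_loss(d_real, d_fake):
    # RpGAN discriminator loss: each real sample is paired with a
    # generated sample, and only the logit difference matters.
    return np.mean(softplus(-(d_real - d_fake)))
```

Because RpGAN depends only on the difference of paired logits, the discriminator cannot drive its loss down by rating all samples (real and fake alike) highly, which is one intuition behind the paper's claim that the pairing removes the bad basins of the separable loss.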
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt: Machine Learning