Sylvester Normalizing Flows for Variational Inference
March 15, 2018 · Entered Twilight · Conference on Uncertainty in Artificial Intelligence
"Last commit was 6.0 years ago (≥5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: .gitignore, LICENSE, README.md, data, main_experiment.py, models, optimization, utils
Authors
Rianne van den Berg, Leonard Hasenclever, Jakub M. Tomczak, Max Welling
arXiv ID
1803.05649
Category
stat.ML: Machine Learning (Stat)
Cross-listed
cs.AI, cs.LG, stat.ME
Citations
257
Venue
Conference on Uncertainty in Artificial Intelligence
Repository
https://github.com/riannevdberg/sylvester-flows
★ 181
Last Checked
1 month ago
Abstract
Variational inference relies on flexible approximate posterior distributions. Normalizing flows provide a general recipe to construct flexible variational posteriors. We introduce Sylvester normalizing flows, which can be seen as a generalization of planar flows. Sylvester normalizing flows remove the well-known single-unit bottleneck from planar flows, making a single transformation much more flexible. We compare the performance of Sylvester normalizing flows against planar flows and inverse autoregressive flows and demonstrate that they compare favorably on several datasets.
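To make the abstract's claim concrete: a Sylvester flow transforms a latent vector as z' = z + A·h(Bz + b), where A is D×M and B is M×D. With M > 1 this lifts the single-unit bottleneck of a planar flow (which is the M = 1 case), and the log-determinant of the D×D Jacobian can be computed on a cheap M×M matrix via Sylvester's determinant identity, det(I_D + A·diag(h')·B) = det(I_M + diag(h')·B·A). The sketch below uses unconstrained A and B with a tanh nonlinearity for illustration; the paper additionally parameterizes these matrices (e.g. via QR factors) to guarantee invertibility, and the function name `sylvester_flow` is our own, not from the authors' repo.

```python
import numpy as np

def sylvester_flow(z, A, B, b):
    """One Sylvester-flow step: z' = z + A @ tanh(B @ z + b).

    z: (D,), A: (D, M), B: (M, D), b: (M,). Returns the transformed
    vector and log|det J|, where the determinant is evaluated on the
    small M x M matrix I_M + diag(h') @ B @ A (Sylvester's identity)
    instead of the full D x D Jacobian I_D + A @ diag(h') @ B.
    """
    pre = B @ z + b                       # (M,) pre-activations
    z_new = z + A @ np.tanh(pre)          # (D,) transformed sample
    h_prime = 1.0 - np.tanh(pre) ** 2     # derivative of tanh
    small = np.eye(len(b)) + (h_prime[:, None] * B) @ A  # (M, M)
    _, logdet = np.linalg.slogdet(small)
    return z_new, logdet

# Toy usage: D = 4 dimensional latent, M = 2 hidden units
rng = np.random.default_rng(0)
z = rng.standard_normal(4)
A = 0.1 * rng.standard_normal((4, 2))
B = 0.1 * rng.standard_normal((2, 4))
b = np.zeros(2)
z_new, logdet = sylvester_flow(z, A, B, b)
```

With small weights the transformation stays close to the identity, so the M×M log-determinant agrees with the one computed from the full D×D Jacobian while costing O(M³) instead of O(D³).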
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Machine Learning (Stat)
Distilling the Knowledge in a Neural Network (R.I.P. 👻 Ghosted)
Layer Normalization (R.I.P. 👻 Ghosted)
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning (R.I.P. 👻 Ghosted)
Domain-Adversarial Training of Neural Networks (R.I.P. 👻 Ghosted)