Sylvester Normalizing Flows for Variational Inference

March 15, 2018 · Entered Twilight · 🏛 Conference on Uncertainty in Artificial Intelligence

🌅 TWILIGHT: Old Age
Predates the code-sharing era β€” a pioneer of its time

"Last commit was 6.0 years ago (β‰₯5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .gitignore, LICENSE, README.md, data, main_experiment.py, models, optimization, utils

Authors: Rianne van den Berg, Leonard Hasenclever, Jakub M. Tomczak, Max Welling
arXiv ID: 1803.05649
Category: stat.ML (Machine Learning, Stat)
Cross-listed: cs.AI, cs.LG, stat.ME
Citations: 257
Venue: Conference on Uncertainty in Artificial Intelligence
Repository: https://github.com/riannevdberg/sylvester-flows ⭐ 181
Last Checked: 1 month ago
Abstract
Variational inference relies on flexible approximate posterior distributions. Normalizing flows provide a general recipe to construct flexible variational posteriors. We introduce Sylvester normalizing flows, which can be seen as a generalization of planar flows. Sylvester normalizing flows remove the well-known single-unit bottleneck from planar flows, making a single transformation much more flexible. We compare the performance of Sylvester normalizing flows against planar flows and inverse autoregressive flows and demonstrate that they compare favorably on several datasets.
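The core move the abstract describes can be sketched briefly. A planar flow transforms z' = z + u h(wᵀz + b), which squeezes information through a single unit; a Sylvester flow replaces the vectors u and w with matrices A (D×M) and B (M×D), and Sylvester's determinant identity reduces the Jacobian log-determinant to an M×M computation. This is a minimal NumPy sketch under those assumptions, not the paper's actual implementation (which further parameterizes the matrices via orthogonal and triangular factors to keep the determinant cheap); all names and shapes here are illustrative.

```python
import numpy as np

def sylvester_flow(z, A, B, b):
    """One Sylvester-style flow step: z' = z + A * tanh(B z + b).

    A: (D, M), B: (M, D), b: (M,). With M > 1 this removes the
    single-unit bottleneck of a planar flow (the M = 1 special case).
    By Sylvester's determinant identity,
        det(I_D + A diag(h') B) = det(I_M + diag(h') B A),
    so only an M x M determinant is needed for the log-det Jacobian.
    """
    pre = B @ z + b                          # (M,) pre-activations
    h = np.tanh(pre)
    h_prime = 1.0 - h ** 2                   # derivative of tanh
    z_new = z + A @ h                        # transformed sample
    m = B.shape[0]
    # diag(h') @ (B A), built by row-scaling B A with h'
    small = np.eye(m) + h_prime[:, None] * (B @ A)
    _, logdet = np.linalg.slogdet(small)     # log |det J|, M x M cost
    return z_new, logdet

# Tiny usage example with random (illustrative) parameters.
rng = np.random.default_rng(0)
D, M = 6, 3
A = 0.1 * rng.standard_normal((D, M))
B = 0.1 * rng.standard_normal((M, D))
b = rng.standard_normal(M)
z = rng.standard_normal(D)
z_new, logdet = sylvester_flow(z, A, B, b)
```

The M×M determinant matches the full D×D Jacobian determinant exactly, which is the point of the identity: flexibility scales with M while the cost stays small.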

📜 Similar Papers

In the same crypt β€” Machine Learning (Stat)

R.I.P. 👻 Ghosted

Graph Attention Networks

Petar Veličković, Guillem Cucurull, ... (+4 more)

stat.ML πŸ› ICLR πŸ“š 24.7K cites 8 years ago
R.I.P. πŸ‘» Ghosted

Layer Normalization

Jimmy Lei Ba, Jamie Ryan Kiros, Geoffrey E. Hinton

stat.ML πŸ› arXiv πŸ“š 12.0K cites 9 years ago