Few-shot Autoregressive Density Estimation: Towards Learning to Learn Distributions

October 27, 2017 · Declared Dead · 🏛 International Conference on Learning Representations

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Scott Reed, Yutian Chen, Thomas Paine, Aäron van den Oord, S. M. Ali Eslami, Danilo Rezende, Oriol Vinyals, Nando de Freitas
arXiv ID: 1710.10304
Category: cs.NE (Neural & Evolutionary)
Cross-listed: cs.CV
Citations: 90
Venue: International Conference on Learning Representations
Last Checked: 2 months ago
Abstract
Deep autoregressive models have shown state-of-the-art performance in density estimation for natural images on large-scale datasets such as ImageNet. However, such models require many thousands of gradient-based weight updates and unique image examples for training. Ideally, the models would rapidly learn visual concepts from only a handful of examples, similar to the manner in which humans learn across many vision tasks. In this paper, we show how 1) neural attention and 2) meta learning techniques can be used in combination with autoregressive models to enable effective few-shot density estimation. Our proposed modifications to PixelCNN result in state-of-the-art few-shot density estimation on the Omniglot dataset. Furthermore, we visualize the learned attention policy and find that it learns intuitive algorithms for simple tasks such as image mirroring on ImageNet and handwriting on Omniglot without supervision. Finally, we extend the model to natural images and demonstrate few-shot image generation on the Stanford Online Products dataset.
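The attention idea in the abstract can be illustrated with a toy sketch (a minimal illustration, not the paper's implementation; all function names, shapes, and the dot-product attention form are assumptions): a feature vector for the pixel currently being generated acts as a query over encoded support-set examples, and the attention-weighted context conditions the next-pixel prediction.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_context(query, support_keys, support_values):
    """Toy scaled dot-product attention over a few-shot support set.

    query:          (d,)    feature for the pixel being generated (hypothetical)
    support_keys:   (n, d)  one key per encoded support example
    support_values: (n, dv) one value per encoded support example
    Returns the conditioning context (dv,) and the attention weights (n,).
    """
    scores = support_keys @ query / np.sqrt(query.shape[0])
    weights = softmax(scores)
    return weights @ support_values, weights

# Illustrative few-shot setting: n=5 support examples, d=8 key dims, dv=4 value dims.
rng = np.random.default_rng(0)
d, dv, n = 8, 4, 5
q = rng.normal(size=d)
K = rng.normal(size=(n, d))
V = rng.normal(size=(n, dv))
ctx, w = attention_context(q, K, V)
```

In the paper's setting the context vector would then feed into the conditional PixelCNN's next-pixel distribution; here it is just a weighted average of support-set value vectors, with the weights summing to one.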
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Neural & Evolutionary

R.I.P. 👻 Ghosted

LSTM: A Search Space Odyssey

Klaus Greff, Rupesh Kumar Srivastava, ... (+3 more)

cs.NE πŸ› IEEE TNNLS πŸ“š 6.0K cites 11 years ago

Died the same way — 👻 Ghosted