An Efficient Minibatch Acceptance Test for Metropolis-Hastings

October 19, 2016 · Declared Dead · 🏛 Conference on Uncertainty in Artificial Intelligence

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Daniel Seita, Xinlei Pan, Haoyu Chen, John Canny
arXiv ID: 1610.06848
Category: cs.LG (Machine Learning)
Cross-listed: stat.ML
Citations: 46
Venue: Conference on Uncertainty in Artificial Intelligence
Last Checked: 2 months ago
Abstract
We present a novel Metropolis-Hastings method for large datasets that uses small expected-size minibatches of data. Previous work on reducing the cost of Metropolis-Hastings tests yields a variable amount of data consumed per sample, with only constant-factor reductions versus using the full dataset for each sample. Here we present a method that can be tuned to provide arbitrarily small batch sizes, by adjusting either proposal step size or temperature. Our test uses the noise-tolerant Barker acceptance test with a novel additive correction variable. The resulting test has similar cost to a normal SGD update. Our experiments demonstrate several order-of-magnitude speedups over previous work.
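As a rough illustration of the idea in the abstract, the Barker test accepts a proposal with probability σ(Δ), where Δ is the log posterior ratio and σ is the logistic sigmoid; equivalently, one accepts when Δ plus standard logistic noise is positive, which is what makes the test tolerant to noisy minibatch estimates of Δ. The sketch below (not the authors' implementation) shows this acceptance rule plus a naive N/n-scaled minibatch estimator of Δ; the paper's additive correction variable, which reshapes the estimator's noise into the required logistic distribution, is deliberately omitted here. The function names and the `log_lik` signature are illustrative assumptions.

```python
import math
import random

def barker_accept(log_ratio, rng):
    """Barker acceptance test: accept with probability sigma(log_ratio).

    Implemented by drawing standard logistic noise and accepting when
    log_ratio + noise > 0, which is equivalent to U < sigma(log_ratio).
    """
    u = rng.random()
    logistic_noise = math.log(u / (1.0 - u))  # inverse-CDF sample, Logistic(0, 1)
    return log_ratio + logistic_noise > 0

def minibatch_log_ratio(theta_new, theta_old, data, batch, log_lik):
    """Unbiased minibatch estimate of the log likelihood ratio.

    Sums per-example log-likelihood differences over a subsample `batch`
    (indices into `data`) and rescales by N/n. In the paper's method a
    further additive correction variable (omitted here) makes the combined
    estimator-plus-noise distribution logistic so the Barker test is exact.
    """
    n, N = len(batch), len(data)
    s = sum(log_lik(theta_new, data[i]) - log_lik(theta_old, data[i])
            for i in batch)
    return (N / n) * s
```

With a very large positive (or negative) log ratio the test accepts (or rejects) essentially always, and at Δ = 0 it accepts about half the time, matching σ(0) = 0.5.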
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt · Machine Learning

Died the same way · 👻 Ghosted