Maximum entropy methods for texture synthesis: theory and practice

December 03, 2019 · Declared Dead · 🏛 SIAM Journal on Mathematics of Data Science

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Valentin De Bortoli, Agnes Desolneux, Alain Durmus, Bruno Galerne, Arthur Leclaire
arXiv ID: 1912.01691
Category: math.ST
Cross-listed: cs.CV, math.PR, stat.CO
Citations: 6
Venue: SIAM Journal on Mathematics of Data Science
Last Checked: 2 months ago
Abstract
Recent years have seen the rise of convolutional neural network techniques in exemplar-based image synthesis. These methods often rely on the minimization of some variational formulation on the image space for which the minimizers are assumed to be the solutions of the synthesis problem. In this paper we investigate, both theoretically and experimentally, another framework to deal with this problem using an alternate sampling/minimization scheme. First, we use results from information geometry to establish that our method yields a probability measure which has maximum entropy under some constraints in expectation. Then, we turn to the analysis of our method and we show, using recent results from the Markov chain literature, that its error can be explicitly bounded with constants which depend polynomially on the dimension, even in the non-convex setting. This includes the case where the constraints are defined via a differentiable neural network. Finally, we present an extensive experimental study of the model, including a comparison with state-of-the-art methods and an extension to style transfer.
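The alternate sampling/minimization scheme the abstract describes can be illustrated on a toy problem. The sketch below is a hypothetical 1D reduction, not the authors' implementation: the paper constrains neural-network features of images, whereas here the features are simply f(x) = (x, x²) with target statistics (0, 1), so the maximum-entropy Gibbs measure exp(-θ·f(x)) should converge to the standard Gaussian, i.e. θ → (0, 0.5). The scheme alternates a few unadjusted Langevin steps (sampling) with a stochastic update of the dual variables θ (minimization).

```python
import numpy as np

# Toy sketch of an alternate sampling/minimization (SOUL-type) scheme.
# All names and step sizes here are illustrative assumptions.

rng = np.random.default_rng(0)

# Features f(x) = (x, x^2). The maximum-entropy density with E[x] = 0 and
# E[x^2] = 1 is N(0, 1), reached at theta = (0, 0.5) for the Gibbs density
# p_theta(x) proportional to exp(-theta . f(x)).
f0 = np.array([0.0, 1.0])      # target statistics (the "exemplar" constraints)
theta = np.array([0.0, 0.1])   # dual variables defining the potential
x = 0.0                        # current state of the Langevin chain
gamma = 0.01                   # Langevin step size
delta = 0.05                   # dual (minimization) step size
K = 5                          # inner Langevin steps per dual update

theta_avg = np.zeros(2)        # Polyak-style average of the late dual iterates
count = 0
for n in range(20000):
    # Sampling step: unadjusted Langevin dynamics targeting exp(-theta . f(x)).
    feats = np.zeros(2)
    for _ in range(K):
        grad_U = theta[0] + 2.0 * theta[1] * x           # d/dx of theta . f(x)
        x = x - gamma * grad_U + np.sqrt(2.0 * gamma) * rng.standard_normal()
        feats += np.array([x, x * x])
    feats /= K
    # Minimization step: move theta so that E_theta[f] drifts toward f0.
    theta = theta + delta * (feats - f0)
    if n >= 15000:
        theta_avg += theta
        count += 1
theta_avg /= count
# theta_avg ends up near (0, 0.5): the chain effectively samples N(0, 1).
```

In the paper's setting, x is an image, f is (for example) a differentiable neural-network statistic, and the Langevin updates require only gradients of θ·f(x), which is why the error analysis can cover the non-convex, neural-network-constrained case.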
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — math.ST

Died the same way — 👻 Ghosted