Controllable Style Transfer via Test-time Training of Implicit Neural Representation

October 14, 2022 · Entered Twilight · 🏛 Pattern Recognition

💤 TWILIGHT: Eternal Rest
Repo abandoned since publication

"No code URL or promise found in abstract"
"Derived repo from GitHub Pages (backfill)"

Evidence collected by the PWNC Scanner

Repo contents: LICENSE, README.md, images, requirements.txt, run_gradation.sh, run_interpolation.sh, run_mask.sh, run_size_control.sh, run_train.sh, samples, scripts

Authors: Sunwoo Kim, Youngjo Min, Younghun Jung, Seungryong Kim
arXiv ID: 2210.07762
Category: cs.CV (Computer Vision)
Citations: 13
Venue: Pattern Recognition
Repository: https://github.com/ku-cvlab/INR-st ⭐ 37
Last Checked: 1 month ago
Abstract
We propose a controllable style transfer framework based on an Implicit Neural Representation (INR) that controls the stylized output pixel-wise via test-time training. Unlike traditional image-optimization methods, which often suffer from unstable convergence, and learning-based methods, which require intensive training and have limited generalization ability, we present a model-optimization framework that optimizes the neural network at test time with explicit style-transfer loss functions. After being test-time trained once, thanks to the flexibility of the INR-based model, our framework can precisely control the stylized image pixel-wise and freely adjust its resolution without further optimization or training. We demonstrate several applications.
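The resolution-free property the abstract claims comes from the INR itself: a coordinate network maps each pixel location to a color, so the same trained weights can be queried on any sampling grid. A minimal sketch of that idea, using a tiny numpy MLP with random weights (a stand-in only; the paper's actual architecture and test-time style losses are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny coordinate MLP: maps (x, y) in [0, 1]^2 to an RGB value.
# Weights are random for illustration; in the paper's framework they
# would be optimized at test time against explicit style-transfer losses.
W1 = rng.normal(size=(2, 32)); b1 = np.zeros(32)
W2 = rng.normal(size=(32, 3)); b2 = np.zeros(3)

def inr(coords):
    """Evaluate the coordinate network on an (N, 2) array of (x, y) points."""
    h = np.tanh(coords @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # RGB in (0, 1)

def render(h, w):
    """Sample the same network on an h x w pixel grid of any resolution."""
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([xs / (w - 1), ys / (h - 1)], axis=-1).reshape(-1, 2)
    return inr(coords).reshape(h, w, 3)

small = render(32, 32)
large = render(256, 256)  # same weights, higher resolution, no retraining
```

Because `render` only changes the query grid, adjusting output resolution (or restricting queries to a pixel mask for localized control) needs no further optimization, which is the flexibility the abstract attributes to the INR-based model.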
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Computer Vision