Controllable Style Transfer via Test-time Training of Implicit Neural Representation
October 14, 2022 · Entered Twilight · Pattern Recognition
Evidence collected by the PWNC Scanner:
"No code URL or promise found in abstract"
"Derived repo from GitHub Pages (backfill)"
Repo contents: LICENSE, README.md, images, requirements.txt, run_gradation.sh, run_interpolation.sh, run_mask.sh, run_size_control.sh, run_train.sh, samples, scripts
Authors
Sunwoo Kim, Youngjo Min, Younghun Jung, Seungryong Kim
arXiv ID
2210.07762
Category
cs.CV: Computer Vision
Citations
13
Venue
Pattern Recognition
Repository
https://github.com/ku-cvlab/INR-st
⭐ 37
Last Checked
1 month ago
Abstract
We propose a controllable style transfer framework based on an Implicit Neural Representation (INR) that controls the stylized output pixel-wise via test-time training. Unlike traditional image-optimization methods, which often suffer from unstable convergence, and learning-based methods, which require intensive training and have limited generalization ability, we present a model-optimization framework that optimizes the neural network at test time with explicit style-transfer loss functions. Once test-time trained, thanks to the flexibility of the INR-based model, our framework can precisely control the stylized image in a pixel-wise manner and freely adjust the output resolution without further optimization or training. We demonstrate several applications.
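The resolution-free property described in the abstract follows from the INR itself: the network maps pixel coordinates to RGB values, so once its weights have been optimized at test time, it can be sampled on a grid of any size. A minimal sketch of that idea, assuming an illustrative toy MLP (random weights, not the authors' architecture or training code; the style/content losses used during test-time training are omitted):

```python
import numpy as np

# Hypothetical tiny coordinate MLP (2 -> 64 -> 64 -> 3), standing in for the
# test-time-trained INR. Weights here are random for illustration only.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 1.0, (2, 64)), np.zeros(64)
W2, b2 = rng.normal(0, 0.2, (64, 64)), np.zeros(64)
W3, b3 = rng.normal(0, 0.2, (64, 3)), np.zeros(3)

def inr(coords):
    """coords: (N, 2) array of (x, y) in [-1, 1]; returns (N, 3) RGB in [0, 1]."""
    h = np.tanh(coords @ W1 + b1)            # tanh stands in for whatever
    h = np.tanh(h @ W2 + b2)                 # activation a real INR would use
    return 1.0 / (1.0 + np.exp(-(h @ W3 + b3)))  # sigmoid -> valid RGB range

def render(height, width):
    """Sample the INR on a regular grid -> (H, W, 3) image at any resolution."""
    ys, xs = np.meshgrid(np.linspace(-1, 1, height),
                         np.linspace(-1, 1, width), indexing="ij")
    coords = np.stack([xs.ravel(), ys.ravel()], axis=1)
    return inr(coords).reshape(height, width, 3)

low = render(32, 32)      # stylized output at 32x32 ...
high = render(256, 256)   # ... and, with no retraining, at 256x256
print(low.shape, high.shape)
```

Because both renders query the same continuous function, the high-resolution sample is a finer view of the same stylized image rather than an upscaled copy; pixel-wise control works the same way, by querying or re-optimizing only the coordinates of interest.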
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt: Computer Vision
Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks · R.I.P. 👻 Ghosted
You Only Look Once: Unified, Real-Time Object Detection · Old Age
SSD: Single Shot MultiBox Detector · Old Age
Squeeze-and-Excitation Networks · R.I.P. 👻 Ghosted