ReGO: Reference-Guided Outpainting for Scenery Image

IEEE Transactions on Image Processing (2024)

Abstract
We present ReGO (Reference-Guided Outpainting), a new method for scenery image outpainting. Despite significant progress in producing semantically coherent content, existing outpainting methods often fail to deliver visually appealing results due to blurry textures and generative artifacts. To address these issues, ReGO leverages neighboring reference images and synthesizes texture-rich results by transferring pixels from them. Specifically, an Adaptive Content Selection (ACS) module is incorporated into ReGO to guide pixel transfer and compensate for missing texture in the target image. In addition, a style ranking loss is introduced to keep the generated region consistent in style with the target image while preventing it from being dominated by the reference images. ReGO is a model-agnostic learning paradigm for outpainting tasks. In our experiments, we integrate ReGO with three state-of-the-art outpainting models to evaluate its effectiveness. Results on three scenery benchmarks, i.e., NS6K, NS8K, and SUN Attribute, demonstrate that ReGO outperforms prior art in terms of texture richness and authenticity.
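The abstract does not spell out the form of the style ranking loss. A minimal sketch of one plausible formulation, assuming Gram-matrix style features from a fixed encoder and a margin ranking objective (all function and parameter names below are hypothetical, not taken from the paper), might look like:

```python
import torch
import torch.nn.functional as F

def gram_matrix(feat):
    # feat: (B, C, H, W) feature map from a fixed encoder (e.g., a VGG layer)
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def style_ranking_loss(gen_feat, target_feat, ref_feat, margin=0.1):
    """Encourage the generated region's style to be closer to the known
    target image than to the reference images (a hedged sketch, not the
    paper's exact loss). All inputs are (B, C, H, W) encoder features."""
    g_gen, g_tgt, g_ref = map(gram_matrix, (gen_feat, target_feat, ref_feat))
    d_pos = F.mse_loss(g_gen, g_tgt)   # style distance to the target image
    d_neg = F.mse_loss(g_gen, g_ref)   # style distance to the reference image
    # hinge: target style should be closer than reference style by a margin
    return F.relu(d_pos - d_neg + margin)
```

Under this reading, the ranking term lets the network copy texture detail from the references (via the ACS-style pixel transfer) while penalizing outputs whose overall style drifts toward the references rather than the original image.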
Keywords
Image outpainting, GAN, generative model, adversarial learning