Daily Paper Cast
Generative Refocusing: Flexible Defocus Control from a Single Image

Update: 2025-12-20
Description

🤗 Upvotes: 26 | cs.CV



Authors:

Chun-Wei Tuan Mu, Jia-Bin Huang, Yu-Lun Liu



Title:

Generative Refocusing: Flexible Defocus Control from a Single Image



arXiv:

http://arxiv.org/abs/2512.16923v1



Abstract:

Depth-of-field control is essential in photography, but achieving perfect focus often takes several attempts or specialized equipment. Single-image refocusing remains difficult: it requires recovering sharp content and synthesizing realistic bokeh. Current methods have significant drawbacks: they need all-in-focus inputs, depend on synthetic data from simulators, and offer limited control over aperture. We introduce Generative Refocusing, a two-step process that uses DeblurNet to recover all-in-focus images from various inputs and BokehNet to create controllable bokeh. Our main innovation is semi-supervised training, which combines synthetic paired data with unpaired real bokeh images, using EXIF metadata to capture real optical characteristics beyond what simulators can provide. Our experiments show top performance on defocus-deblurring, bokeh-synthesis, and refocusing benchmarks. Additionally, Generative Refocusing allows text-guided adjustments and custom aperture shapes.
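The two-stage pipeline described above can be illustrated with a minimal toy sketch. Note this is NOT the paper's implementation: `deblur_net` is a placeholder for the learned DeblurNet (here just the identity), and `bokeh_net` stands in for BokehNet with a simple thin-lens-style defocus model, where each pixel is blurred by a disk kernel whose radius grows with its depth offset from the chosen focal plane. All function names and parameters are assumptions for illustration.

```python
import numpy as np

def deblur_net(image):
    # Placeholder for DeblurNet: the real model is a learned network that
    # recovers an all-in-focus image from a defocused input. Identity here.
    return image

def disk_kernel(radius):
    # Circular aperture kernel, a rough stand-in for a lens bokeh shape.
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    k = (xx**2 + yy**2 <= radius**2).astype(np.float64)
    return k / k.sum()

def convolve2d(img, k):
    # Naive same-size 2-D convolution with edge padding (for clarity, not speed).
    r = k.shape[0] // 2
    padded = np.pad(img, r, mode="edge")
    out = np.zeros_like(img, dtype=np.float64)
    for dy in range(k.shape[0]):
        for dx in range(k.shape[1]):
            out += k[dy, dx] * padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def bokeh_net(all_in_focus, depth, focus_depth, aperture):
    # Placeholder for BokehNet: blur radius grows with distance from the
    # focal plane; pixels are grouped by radius and blended from pre-blurred
    # layers (a toy defocus model, not the paper's generative network).
    radii = np.round(aperture * np.abs(depth - focus_depth)).astype(int)
    out = np.zeros_like(all_in_focus, dtype=np.float64)
    for r in np.unique(radii):
        layer = all_in_focus if r == 0 else convolve2d(all_in_focus, disk_kernel(r))
        out[radii == r] = layer[radii == r]
    return out

def generative_refocus(image, depth, focus_depth, aperture):
    # Stage 1: recover all-in-focus content; Stage 2: re-render the defocus.
    return bokeh_net(deblur_net(image), depth, focus_depth, aperture)
```

Separating the two stages is what lets focus depth and aperture be chosen freely after capture: once an all-in-focus image exists, any defocus pattern can be re-rendered from it.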
