Self-Supervised Adaptive Illumination Estimation for Low-Light Image Enhancement

IEEE Transactions on Emerging Topics in Computational Intelligence (2024)

Abstract
In low-light image enhancement, global structure and local texture details affect illumination estimation differently, yet most existing works fail to effectively exploit the intrinsic association between them. To balance structure preservation and texture smoothing in illumination maps, this paper introduces a new illumination smoothing loss and proposes a self-supervised adaptive illumination estimation network (AIE-Net). The illumination smoothing loss achieves this balance mainly through an L2 norm, a truncated Huber penalty, and a Gaussian kernel function with color affinity. To construct AIE-Net, we introduce a local-global adaptive modulation (LGAM) module in deep feature extraction; the module fuses local and global features adaptively in a spatially varying manner by predicting scaling and adding factors. Finally, we separately estimate illumination maps for the input image and its inverted image, then achieve exposure correction via multi-exposure fusion. Extensive experiments show that the proposed method improves image quality under diverse lighting conditions and achieves better performance and generalization than competing methods on several datasets.
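To make the smoothing loss concrete, here is a minimal NumPy sketch of how a truncated Huber penalty on illumination gradients can be weighted by Gaussian color affinity, as the abstract describes. The exact formulation, parameter values (`delta`, `tau`, `sigma`), and the omitted L2 fidelity term are assumptions for illustration, not the paper's definitions.

```python
import numpy as np

def huber(x, delta=0.05):
    """Standard Huber penalty: quadratic near zero, linear beyond delta."""
    a = np.abs(x)
    return np.where(a <= delta, 0.5 * a**2, delta * (a - 0.5 * delta))

def illumination_smoothing_loss(illum, image, delta=0.05, tau=0.01, sigma=0.1):
    """Hypothetical sketch of a color-affinity-weighted smoothing loss.

    illum: (H, W) illumination map; image: (H, W, 3) input image.
    Truncating the Huber penalty at tau stops large gradients (structure
    edges) from being over-penalised; Gaussian color-affinity weights
    encourage smoothing only where neighbouring pixels share similar color.
    """
    # horizontal / vertical illumination gradients
    dx = illum[:, 1:] - illum[:, :-1]
    dy = illum[1:, :] - illum[:-1, :]
    # truncated Huber: cap the per-pixel penalty so edges are preserved
    px = np.minimum(huber(dx, delta), tau)
    py = np.minimum(huber(dy, delta), tau)
    # Gaussian color-affinity weights from the input image
    gx = np.exp(-np.sum((image[:, 1:] - image[:, :-1]) ** 2, axis=-1)
                / (2 * sigma**2))
    gy = np.exp(-np.sum((image[1:, :] - image[:-1, :]) ** 2, axis=-1)
                / (2 * sigma**2))
    return float(np.mean(gx * px) + np.mean(gy * py))
```

A flat illumination map incurs zero penalty, while an illumination edge inside a uniformly colored region is penalised (up to the truncation cap), which is the structure-preserving versus texture-smoothing trade-off the loss targets.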
Keywords
Adaptive fusion,illumination smoothing loss,low-light image enhancement,vision transformer