Please use this identifier to cite or link to this item:
https://repository.cihe.edu.hk/jspui/handle/cihe/4672
Title: RainyScape: Unsupervised rainy scene reconstruction using decoupled neural rendering
Author(s): Liu, Hui; Lyu, X.; Hou, J.
Issue Date: 2024
Publisher: Association for Computing Machinery
Related Publication(s): Proceedings of the 32nd ACM International Conference on Multimedia (MM '24)
Start page: 10920
End page: 10929
Abstract: We propose RainyScape, an unsupervised framework to reconstruct pristine scenes from a collection of multi-view rainy images. RainyScape consists of two main modules: a neural rendering module and a rain-prediction module that incorporates a predictor network and a learnable latent embedding that captures the rain characteristics of the scene. Specifically, leveraging the spectral bias property of neural networks, we first optimize the neural rendering pipeline to obtain a low-frequency scene representation. Subsequently, we jointly optimize the two modules, driven by the proposed adaptive direction-sensitive gradient-based reconstruction loss, which encourages the network to distinguish between scene details and rain streaks, facilitating the propagation of gradients to the relevant components. Extensive experiments on both the classic neural radiance field and the recently proposed 3D Gaussian splatting demonstrate the superiority of our method in effectively eliminating rain streaks and rendering clean images, achieving state-of-the-art performance. The constructed high-quality dataset, source code, and supplementary material are publicly available at https://github.com/lyuxianqiang/RainyScape.
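The two-stage idea summarized in the abstract — first fit a low-frequency scene representation (exploiting spectral bias), then attribute the high-frequency residual to the rain layer — can be sketched with a toy 1-D example. This is not the authors' pipeline: the Fourier low-pass stand-in for spectral bias and all variable names here are illustrative assumptions.

```python
import numpy as np

# Toy sketch of the decoupling principle: a rainy observation is modeled as
# a smooth scene plus sparse high-frequency "rain streaks".
rng = np.random.default_rng(0)

x = np.linspace(0, 2 * np.pi, 256)
scene = np.sin(x)                      # low-frequency scene content
rain = (rng.random(256) > 0.95) * 0.8  # sparse spikes standing in for streaks
rainy = scene + rain

# Stage 1: emulate spectral bias by recovering only low frequencies.
# (RainyScape instead optimizes a neural renderer, which learns low
# frequencies first; a truncated Fourier reconstruction mimics that effect.)
F = np.fft.rfft(rainy)
F[8:] = 0                              # keep a handful of low-frequency bins
scene_est = np.fft.irfft(F, n=256)

# Stage 2: the high-frequency residual is attributed to the rain layer.
rain_est = rainy - scene_est

print(float(np.mean((scene_est - scene) ** 2)))  # scene reconstruction error
```

Because the streaks are sparse and broadband, little of their energy survives the low-pass step, so the scene estimate stays close to the clean signal; the actual method replaces the fixed cutoff with joint optimization under the adaptive direction-sensitive gradient loss.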
URI: https://repository.cihe.edu.hk/jspui/handle/cihe/4672
DOI: 10.1145/3664647.3681290
CIHE Affiliated Publication: Yes
Appears in Collections: CIS Publication
Files in This Item:
File | Description | Size | Format
---|---|---|---
View Online | | 126 B | HTML

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.