Please use this identifier to cite or link to this item:
https://repository.cihe.edu.hk/jspui/handle/cihe/4448
Title: Content-aware warping for view synthesis
Author(s): Liu, Hui; Guo, M.; Hou, J.; Jin, J.; Zeng, H.; Lu, J.
Issue Date: 2023
Publisher: IEEE
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 45
Issue: 8
Start page: 9486
End page: 9503
Abstract: Existing image-based rendering methods usually adopt a depth-based image warping operation to synthesize novel views. In this paper, we identify the essential limitations of the traditional warping operation as its limited neighborhood and purely distance-based interpolation weights. To address this, we propose content-aware warping, which adaptively learns the interpolation weights for pixels of a relatively large neighborhood from their contextual information via a lightweight neural network. Based on this learnable warping module, we propose a new end-to-end learning-based framework for novel view synthesis from a set of input source views, in which two additional modules, namely confidence-based blending and feature-assistant spatial refinement, are naturally introduced to handle the occlusion issue and to capture the spatial correlation among pixels of the synthesized view, respectively. In addition, we propose a weight-smoothness loss term to regularize the network. Experimental results on light field datasets with wide baselines and on multi-view datasets show that the proposed method significantly outperforms state-of-the-art methods both quantitatively and visually.
URI: https://repository.cihe.edu.hk/jspui/handle/cihe/4448
DOI: 10.1109/TPAMI.2023.3242709
CIHE Affiliated Publication: Yes
Appears in Collections: CIS Publication
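Illustrative note: the content-aware warping described in the abstract above replaces fixed, distance-based bilinear weights with interpolation weights predicted from local content over a larger neighborhood. The following is a minimal sketch of that idea only, assuming PyTorch, a k x k neighborhood, and a small two-layer scoring MLP; the class name, shapes, and network design are hypothetical and are not the authors' implementation.

# Hypothetical sketch of a content-aware warping layer (not the authors' code).
# Assumes PyTorch; names, shapes, and the tiny MLP are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContentAwareWarp(nn.Module):
    """Warp source-view features to a target view using interpolation weights
    predicted from local content, instead of fixed bilinear (distance-based)
    weights over a 2x2 neighborhood."""

    def __init__(self, feat_dim: int, k: int = 4):
        super().__init__()
        self.k = k  # neighborhood size: k x k source pixels per target pixel
        # Lightweight network: per-neighbor features + relative offset -> score.
        self.score_net = nn.Sequential(
            nn.Linear(feat_dim + 2, 32), nn.ReLU(inplace=True),
            nn.Linear(32, 1),
        )

    def forward(self, src_feat, coords):
        # src_feat: (B, C, H, W) source-view features.
        # coords:   (B, H, W, 2) continuous (x, y) sampling positions in the
        #           source view for every target pixel (e.g. from depth-based
        #           reprojection).
        # Returns:  (B, C, H, W) warped target-view features.
        B, C, H, W = src_feat.shape
        k = self.k
        # Integer top-left corner of the k x k neighborhood around each sample.
        base = coords.floor() - (k // 2 - 1)
        offs = torch.stack(torch.meshgrid(
            torch.arange(k, device=src_feat.device),
            torch.arange(k, device=src_feat.device),
            indexing="xy"), dim=-1).view(1, 1, 1, k * k, 2)
        neigh = base.unsqueeze(3) + offs                       # (B, H, W, k*k, 2)
        rel = coords.unsqueeze(3) - neigh                      # offsets to sample point
        # Normalize neighbor positions to [-1, 1] and gather their features.
        norm = neigh / torch.tensor([W - 1, H - 1], device=src_feat.device,
                                    dtype=src_feat.dtype) * 2 - 1
        feats = F.grid_sample(src_feat, norm.view(B, H, W * k * k, 2),
                              mode="nearest", align_corners=True)
        feats = feats.view(B, C, H, W, k * k).permute(0, 2, 3, 4, 1)  # (B,H,W,k*k,C)
        # Predict a content-aware weight per neighbor; softmax replaces the
        # fixed distance-based weights of bilinear interpolation.
        scores = self.score_net(torch.cat([feats, rel], dim=-1)).squeeze(-1)
        w = F.softmax(scores, dim=-1)                          # (B, H, W, k*k)
        out = (w.unsqueeze(-1) * feats).sum(dim=3)             # weighted blend
        return out.permute(0, 3, 1, 2)                         # back to (B, C, H, W)

Because the blending weights are produced by a learned network from neighbor features and relative offsets, the layer remains differentiable end to end; the confidence-based blending, spatial refinement, and weight-smoothness loss mentioned in the abstract are separate components not shown in this sketch.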
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.