Abstract

We introduce 4D model flow: the precomputed alignment of dynamic surface appearance across 4D video sequences of different motions reconstructed from multi-view video. Precomputed 4D model flow enables a compact parametrization of surface appearance from the captured videos, which supports real-time rendering of interpolated 4D video sequences that accurately reproduces visual dynamics, even with a coarse underlying geometry. We estimate the 4D model flow using an image-based approach guided by the available geometry proxies, and propose a novel representation in surface texture space for efficient storage and online parametric interpolation of dynamic appearance. By precomputing the appearance alignment, 4D model flow removes the need for the computationally expensive online optical flow previously required for data-driven alignment of dynamic surface appearance. The result is an efficient rendering technique that interpolates between 4D videos in real time, from arbitrary viewpoints, with visual quality comparable to the state of the art.
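To illustrate the core idea of flow-based appearance interpolation, the sketch below blends two texture-space frames using a precomputed dense flow field, warping each frame toward the intermediate position before cross-dissolving. This is a minimal, hypothetical NumPy illustration of the general technique (here for single-channel textures, with a simple backward bilinear warp), not the authors' implementation; the function names and the forward/backward warping scheme are assumptions for the example.

```python
import numpy as np

def bilinear_sample(img, x, y):
    """Sample a 2D (single-channel) image at continuous (x, y) coordinates
    using bilinear filtering, clamping at the border."""
    h, w = img.shape
    x0 = np.clip(np.floor(x).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(y).astype(int), 0, h - 2)
    fx, fy = x - x0, y - y0
    top = img[y0, x0] * (1 - fx) + img[y0, x0 + 1] * fx
    bot = img[y0 + 1, x0] * (1 - fx) + img[y0 + 1, x0 + 1] * fx
    return top * (1 - fy) + bot * fy

def interpolate_textures(tex_a, tex_b, flow_ab, alpha):
    """Interpolate between two texture-space frames with a precomputed
    flow field (illustrative only).

    flow_ab[..., 0:2] gives per-pixel (dx, dy) displacements mapping
    pixels of tex_a to corresponding pixels of tex_b.
    alpha in [0, 1] selects the interpolation position.
    """
    h, w = tex_a.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    # Warp A partway toward B, and B partway back toward A,
    # so both warped frames approximate the intermediate configuration.
    warped_a = bilinear_sample(tex_a,
                               xs + alpha * flow_ab[..., 0],
                               ys + alpha * flow_ab[..., 1])
    warped_b = bilinear_sample(tex_b,
                               xs - (1 - alpha) * flow_ab[..., 0],
                               ys - (1 - alpha) * flow_ab[..., 1])
    # Cross-dissolve the two aligned frames.
    return (1 - alpha) * warped_a + alpha * warped_b
```

Because the flow field is precomputed, the per-frame work at render time reduces to two texture warps and a blend, which is what makes real-time interpolation feasible.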

Paper

4D Model Flow: Precomputed Appearance Alignment for Real-time 4D Video Interpolation
Dan Casas, Christian Richardt, John Collomosse, Christian Theobalt, and Adrian Hilton
Computer Graphics Forum (Proc. Pacific Graphics 2015)

Citation

    @article{Casas:PG:2015,
      author    = {Dan Casas and Christian Richardt and John Collomosse and Christian Theobalt and Adrian Hilton},
      title     = {{4D} Model Flow: Precomputed Appearance Alignment for Real-time {4D} Video Interpolation},
      journal   = {Computer Graphics Forum (Proceedings of Pacific Graphics)},
      year      = {2015},
      month     = {October},
      volume    = {34},
      number    = {7},
      pages     = {173--182},
      doi       = {10.1111/cgf.12756},
      url       = {http://cvssp.org/projects/4d/4dmodelflow/},
    }

Data

Data used in this work can be found in the CVSSP Data Repository.

Acknowledgments

This research was funded by the ERC Starting Grant project CapReal (335545) and the InnovateUK project REFRAME.