Motion Compensated Virtual View Synthesis Using Novel Particle Cell


Owing to wide interest in advanced multimedia experiences, free-viewpoint communication has developed rapidly in recent years. In free-viewpoint communication, viewers can perceive a view from any angle and any position in a scene. Even when the preferred views are not captured, they can be generated through virtual view synthesis, which synthesizes an arbitrary view from captured reference view(s). In everyday settings, only one or a few cameras at baseline distance are available to capture the scene, which makes virtual view synthesis challenging. The task is even more difficult when the camera is continuously moving. In this paper, we propose a particle cell that models a reference view sequence as a set of moving particles for virtual view synthesis. Using our novel hybrid motion estimation scheme, the projected coordinates of the particles in each frame are obtained even when they are occluded. The particles are warped to a virtual view and synthesized into a virtual view sequence. Our method is applicable to both dynamic and static camera settings. The experimental results show that our method outperforms state-of-the-art algorithms on dynamic camera datasets and generally improves upon them on static camera datasets.

IEEE Transactions on Multimedia