Extracting Animated Meshes with Adaptive Motion Estimation

We present an approach for extracting coherently sampled animated meshes from input sequences of incoherently sampled meshes representing a continuously evolving shape. Our approach is based on a multiscale adaptive motion estimation procedure followed by propagation of a template mesh through time. Adaptive signed distance volumes are used as the principal shape representation, and a Bayesian optical flow algorithm is adapted to the surface setting with a modification that diminishes interference between unrelated surface regions. Additionally, a parametric smoothing step is employed to improve the sampling coherence of the model. The result of the proposed procedure is a single animated mesh. We apply our approach to human motion data.
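In a signed distance volume, each grid sample stores the signed distance to the surface, with the surface itself recovered as the zero level set. The sketch below illustrates this representation on a uniform grid with an analytic sphere as the shape; it is a minimal stand-in for intuition only (the paper uses adaptive volumes built from arbitrary input meshes), and the function name is hypothetical:

```python
import numpy as np

def signed_distance_volume(n, radius=0.3):
    """Sample the signed distance to a sphere of the given radius
    on an n x n x n grid spanning [-0.5, 0.5]^3.

    Values are negative inside the surface and positive outside;
    the surface is the zero level set. (Illustrative uniform grid;
    the paper's representation is adaptive.)
    """
    coords = np.linspace(-0.5, 0.5, n)
    x, y, z = np.meshgrid(coords, coords, coords, indexing="ij")
    return np.sqrt(x**2 + y**2 + z**2) - radius

vol = signed_distance_volume(32)
# Voxels with negative values lie inside the sphere.
inside_fraction = (vol < 0).mean()
```

A mesh can then be extracted from such a volume with any isosurfacing method (e.g. marching cubes) applied at the zero level.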
Nizam Anuar, Igor Guskov
Added 31 Oct 2010
Updated 31 Oct 2010
Type Conference
Year 2004
Where VMV