Live Speech Driven Head-and-Eye Motion Generators

This paper describes a fully automated framework that simultaneously generates realistic head motion, eye gaze, and eyelid motion from live (or recorded) speech input. Its central idea is to learn separate yet inter-related statistical models for each component (head motion, gaze, or eyelid motion) from a pre-recorded facial motion dataset: i) Gaussian Mixture Models and a gradient-descent optimization algorithm are employed to generate head motion from speech features; ii) a Nonlinear Dynamic Canonical Correlation Analysis model is used to synthesize eye gaze from head motion and speech features; and iii) nonnegative linear regression is used to model voluntary eyelid motion, and a log-normal distribution is used to describe involuntary eye blinks. Several user studies are conducted to evaluate the effectiveness of the proposed speech-driven head and eye motion generator using the well-established paired-comparison methodology. Our evaluation results clearly show that this approach can ...
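
The abstract names the statistical machinery but not an implementation. Below is a minimal Python sketch, under assumed toy feature representations, of two of the ingredients it mentions: predicting head motion from speech features with a joint Gaussian Mixture Model (the paper couples the GMM with a gradient-descent optimization; the closed-form conditional-mean regression used here is a simplified stand-in), and generating involuntary blinks by sampling inter-blink intervals from a log-normal distribution. The feature dimensions, GMM size, and log-normal parameters are placeholders, not values from the paper.

# Minimal sketch (not the authors' implementation) of two pipeline pieces:
#   1) GMM fit on joint [speech_feature, head_motion] vectors, queried via the
#      conditional mean E[head | speech] (simplified stand-in for the paper's
#      GMM + gradient-descent formulation).
#   2) Involuntary eye blinks sampled from a log-normal inter-blink-interval
#      distribution (parameters are placeholders).
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_joint_gmm(speech_feats, head_motion, n_components=8, seed=0):
    """Fit a GMM on concatenated [speech, head] vectors from a motion dataset."""
    joint = np.hstack([speech_feats, head_motion])
    gmm = GaussianMixture(n_components=n_components, covariance_type="full",
                          random_state=seed)
    gmm.fit(joint)
    return gmm

def predict_head_motion(gmm, speech_feat, dim_speech):
    """Conditional mean of head motion given a speech feature (GMM regression)."""
    x = np.asarray(speech_feat)
    means, covs, weights = gmm.means_, gmm.covariances_, gmm.weights_
    preds, resp = [], []
    for k in range(gmm.n_components):
        mu_s, mu_h = means[k, :dim_speech], means[k, dim_speech:]
        S_ss = covs[k][:dim_speech, :dim_speech]
        S_hs = covs[k][dim_speech:, :dim_speech]
        # Per-component conditional mean of head motion given the speech feature.
        preds.append(mu_h + S_hs @ np.linalg.solve(S_ss, x - mu_s))
        # Responsibility of component k for this speech feature (unnormalized;
        # shared constant factors cancel after normalization below).
        diff = x - mu_s
        expo = -0.5 * diff @ np.linalg.solve(S_ss, diff)
        resp.append(weights[k] * np.exp(expo) / np.sqrt(np.linalg.det(S_ss)))
    resp = np.array(resp)
    resp /= resp.sum()
    return (resp[:, None] * np.array(preds)).sum(axis=0)

def sample_blink_times(duration_s, mu=1.0, sigma=0.5, seed=0):
    """Sample blink onset times with log-normal inter-blink intervals (seconds)."""
    rng = np.random.default_rng(seed)
    times, t = [], 0.0
    while t < duration_s:
        t += rng.lognormal(mean=mu, sigma=sigma)
        if t < duration_s:
            times.append(t)
    return times

if __name__ == "__main__":
    # Toy data standing in for speech features (e.g., pitch/energy descriptors)
    # and head rotation parameters from a pre-recorded facial motion dataset.
    rng = np.random.default_rng(1)
    speech = rng.normal(size=(500, 4))
    head = speech @ rng.normal(size=(4, 3)) + 0.1 * rng.normal(size=(500, 3))
    gmm = fit_joint_gmm(speech, head)
    print("predicted head motion:", predict_head_motion(gmm, speech[0], dim_speech=4))
    print("blink onsets (s):", np.round(sample_blink_times(10.0), 2))
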
Added 28 Sep 2012
Updated 28 Sep 2012
Type Journal
Year 2012
Where TVCG
Authors Binh Huy Le, Xiaohan Ma, Zhigang Deng