
Modeling short-term dynamics and variability for realistic interactive facial animation.

Nicolas Stoiber, Gaspard Breton, Renaud Seguier.

Abstract

Modern modeling and rendering techniques have produced nearly photorealistic face models, but truly expressive digital faces also require natural-looking movement. Virtual characters in today's applications often display unrealistic facial expressions: facial animation with traditional schemes such as keyframing and motion capture demands expertise, and those schemes are ill-suited to interactive applications that require the real-time generation of context-dependent movements. A new animation system produces realistic, expressive facial motion at interactive speed. The system relies on a set of motion models that control facial-expression dynamics. The models are fitted to captured motion data and therefore retain the dynamic signature of human facial expressions. They also contain a nondeterministic component that ensures variety in the long-term visual behavior. The system can efficiently animate any synthetic face. The accompanying video illustrates interactive use of the system to generate facial-animation sequences.
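The abstract's combination of data-driven dynamics with a nondeterministic component can be sketched as a damped second-order motion model driven toward a target expression value, with small random perturbations supplying long-term variability. This is a minimal illustrative sketch, not the paper's actual method; the function name and all parameter values (stiffness, damping, noise scale) are assumptions chosen for the example.

```python
import math
import random

def expression_trajectory(target, steps=300, dt=0.033,
                          stiffness=40.0, noise_scale=0.02, seed=0):
    """Illustrative sketch (not the paper's model): a critically damped
    spring pulls an expression parameter toward `target`, while Gaussian
    noise on the driving force keeps repeated runs from looking identical."""
    damping = 2.0 * math.sqrt(stiffness)   # critical damping
    rng = random.Random(seed)
    x, v = 0.0, 0.0                        # expression value and its velocity
    trajectory = []
    for _ in range(steps):
        # Deterministic spring-damper term plus a stochastic perturbation.
        accel = stiffness * (target - x) - damping * v \
                + rng.gauss(0.0, noise_scale) / dt
        v += accel * dt                    # semi-implicit Euler integration
        x += v * dt
        trajectory.append(x)
    return trajectory
```

Running the model with different seeds yields trajectories that share the same dynamic signature (rise time, damping) but differ in their fine-grained motion, which mirrors the variety the abstract attributes to the nondeterministic component.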


Year:  2010        PMID: 20650728     DOI: 10.1109/MCG.2010.40

Source DB:  PubMed          Journal:  IEEE Comput Graph Appl        ISSN: 0272-1716            Impact factor:   2.088


  1 in total

1.  A wireless accelerometer-based body posture stability detection system and its application for meditation practitioners.

Authors:  Kang-Ming Chang; Sih-Huei Chen; Hsin-Yi Lee; Congo Tak-Shing Ching; Chun-Lung Huang
Journal:  Sensors (Basel)       Date:  2012-12-18       Impact factor: 3.576

