Facial Animation Retargeting and Control based on a Human Appearance Space

Abstract: Facial animations are essential to enhance the realism and credibility of virtual characters. Parameter-based animation methods offer precise control over facial configurations, while performance-based animation benefits from the naturalness of captured human motion. In this article, we propose an animation system that combines the advantages of both approaches. By analyzing a database of facial motion, we create a human appearance space. The appearance space provides a coherent and continuous parameterization of human facial movements, while encapsulating the coherence of real facial deformations. We present a method to construct an analogous appearance space for a synthetic character. The link between the two appearance spaces makes it possible to retarget facial animation from a video onto a synthetic face. Moreover, the characteristics of the appearance space allow us to detect the principal variation patterns of a face and automatically reorganize them on a low-dimensional control space. The control space acts as an interactive user interface to manipulate the facial expressions of any synthetic face. This interface makes it simple and intuitive to generate still facial configurations for keyframe animation, as well as complete sequences of facial movements. The resulting animations combine the flexibility of a parameter-based system with the realism of real human motion.
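To illustrate the kind of construction the abstract describes, the sketch below builds a low-dimensional space from facial-motion data using PCA. This is only a plausible, minimal realization for illustration; the paper's actual appearance-space construction may differ, and the function name and synthetic data here are assumptions.

```python
import numpy as np

def build_appearance_space(frames, n_dims=2):
    """Project facial-motion frames onto a low-dimensional space via PCA.

    Illustrative sketch only: one plausible way to extract the
    "principal variation patterns" mentioned in the abstract.

    frames: (n_frames, n_features) array, e.g. stacked landmark
            coordinates captured from video.
    Returns (coords, mean, basis) so that new frames can be projected
    and low-dimensional control points mapped back to faces.
    """
    frames = np.asarray(frames, dtype=float)
    mean = frames.mean(axis=0)
    # SVD of the centered data yields the principal variation patterns.
    _, _, vt = np.linalg.svd(frames - mean, full_matrices=False)
    basis = vt[:n_dims]                  # (n_dims, n_features)
    coords = (frames - mean) @ basis.T   # (n_frames, n_dims)
    return coords, mean, basis

# Toy usage: 100 synthetic "frames" of 6 landmark coordinates,
# generated from a hidden 2D control signal plus noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))
mixing = rng.normal(size=(2, 6))
frames = latent @ mixing + 0.01 * rng.normal(size=(100, 6))
coords, mean, basis = build_appearance_space(frames, n_dims=2)
print(coords.shape)  # (100, 2)
```

A point in `coords` can be edited interactively and mapped back through `mean + point @ basis`, which is the kind of round trip a control-space interface relies on.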

https://hal-supelec.archives-ouvertes.fr/hal-00492536
Contributor: Myriam Andrieux
Submitted on: Wednesday, June 16, 2010 - 11:34:55 AM
Last modification on: Friday, November 16, 2018 - 1:30:36 AM

Identifiers

  • HAL Id: hal-00492536, version 1

Citation

Nicolas Stoiber, Renaud Séguier, Gaspard Breton. Facial Animation Retargeting and Control based on a Human Appearance Space. Computer Animation and Virtual Worlds, Wiley, 2010, 21 (1), pp.39-54. ⟨hal-00492536⟩
