Data-Driven Variation for Virtual Facial Expressions

Published Date

2017-03-16

Type

Report

Abstract

Animating digital characters plays an important role in computer-assisted experiences, from video games to movies to interactive robotics. A critical component of digital character interaction is the animation of the human face. Here we explore a data-driven method to produce variation in animated smiles. We define a low-dimensional parameter space for learning based on key feature points of the face, which generalizes to arbitrary digital models. We perform a large-scale user study to annotate a systematic sweep of faces, and train a non-parametric classifier to predict the level of perceived happiness. This model is tuned to balance precision against variation in its predictions. New happy faces are then sampled from this model, resulting in a variety of generated faces that display a targeted level of happiness. This diversity allows rich interactions with digital characters to be built automatically, without the need for hand-crafted expressions.
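
To make the pipeline concrete, here is a minimal sketch in Python, assuming a k-nearest-neighbors model as the non-parametric classifier and simple rejection sampling over the feature-point parameter space; the parameterization, training data, and labels below are illustrative placeholders, not the report's actual data or implementation.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Placeholder stand-in for the annotated sweep: each row is a
# low-dimensional face parameter vector (e.g., offsets of mouth and
# eyelid feature points); each label is a perceived-happiness
# level from 0 (neutral) to 4 (very happy).
rng = np.random.default_rng(0)
X_train = rng.uniform(-1.0, 1.0, size=(500, 6))
y_train = np.clip(
    np.rint(2.0 + 2.0 * (X_train[:, 0] + X_train[:, 1])), 0, 4
).astype(int)

# Non-parametric classifier predicting perceived happiness.
clf = KNeighborsClassifier(n_neighbors=15)
clf.fit(X_train, y_train)

def sample_faces(target_level, n_faces, max_tries=10_000):
    """Rejection-sample parameter vectors that the classifier
    scores at the target happiness level."""
    faces = []
    for _ in range(max_tries):
        if len(faces) == n_faces:
            break
        candidate = rng.uniform(-1.0, 1.0, size=(1, X_train.shape[1]))
        if clf.predict(candidate)[0] == target_level:
            faces.append(candidate[0])
    return np.array(faces)

# Generate ten distinct parameter vectors that all read as "happy".
smiles = sample_faces(target_level=4, n_faces=10)
print(smiles.shape)
```

Because each accepted sample is a different point in parameter space, the returned vectors drive visibly different expressions that the model nonetheless rates at the same happiness level, which is the source of the variation the abstract describes.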

Suggested citation

Sohre, Nick; Adeagbo, Moses; Guy, Stephen. (2017). Data-Driven Variation for Virtual Facial Expressions. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/216004.
