Spacetime expression cloning for blendshapes

@article{Seol2012SpacetimeEC,
  title={Spacetime expression cloning for blendshapes},
  author={Yeongho Seol and J. P. Lewis and Jaewoo Seo and Byungkuk Choi and Ken-ichi Anjyo and Jun-yong Noh},
  journal={ACM Trans. Graph.},
  year={2012},
  volume={31},
  pages={14:1--14:12}
}
The goal of a practical facial animation retargeting system is to reproduce the character of a source animation on a target face while providing room for additional creative control by the animator. This article presents a novel spacetime facial animation retargeting method for blendshape face models. Our approach starts from the basic principle that the source and target movements should be similar. By interpreting movement as the derivative of position with time, and adding suitable boundary… 
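The velocity-matching principle described in the abstract can be sketched numerically. The following is a minimal illustration only, with hypothetical function names and a naive per-frame least-squares solve; the actual method formulates a spacetime (whole-sequence) problem with boundary conditions, which this sketch only approximates by anchoring the first frame:

```python
import numpy as np

# Linear blendshape model: vertices(w) = neutral + B @ w, where column i
# of B holds the per-vertex displacement of blendshape target i.
def evaluate(neutral, B, w):
    return neutral + B @ w

# Velocity matching (sketch): rather than matching positions frame by
# frame, reproduce the source motion's time derivative. With finite
# differences v[t] = x[t+1] - x[t], solve for target weight increments
# whose induced vertex velocities match the source velocities in a
# least-squares sense, anchoring the first frame as a boundary condition.
def retarget_by_velocity(B_tgt, src_velocities, w0):
    """B_tgt: (3n, m) target blendshape basis.
    src_velocities: (T-1, 3n) source vertex velocities per frame.
    w0: (m,) target weights at the first frame (boundary condition)."""
    T = src_velocities.shape[0] + 1
    w = np.empty((T, B_tgt.shape[1]))
    w[0] = w0
    for t in range(T - 1):
        # Least-squares increment: B_tgt @ dw ~ source velocity at t.
        dw, *_ = np.linalg.lstsq(B_tgt, src_velocities[t], rcond=None)
        w[t + 1] = w[t] + dw
    return w
```

When source and target share the same full-column-rank basis, this recovers the source weight curve exactly; the general retargeting case additionally requires the similarity and boundary terms the abstract alludes to.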
Facial retargeting with automatic range of motion alignment
This work formulates the problem of transferring the blendshapes of a facial rig to an actor as a special case of manifold alignment, exploring the similarities of the motion spaces defined by the blendshapes and by an expressive training sequence of the actor.
Interactive facial expression editing based on spatio-temporal coherency
We present a novel approach for interactively and intuitively editing 3D facial animation. It determines a new expression by combining the user-specified constraints with the priors…
BlendForces: A Dynamic Framework for Facial Animation
A new paradigm for the generation and retargeting of facial animation is presented that naturally combines the blendshapes paradigm with physics‐based techniques for the simulation of deforming meshes and has a wider expressive range than previous blendshape‐based methods.
Artist friendly facial animation retargeting
This paper presents a novel facial animation retargeting system that is carefully designed to support the animator's workflow and automatically creates GUI controllers to help artists perform key-framing and editing very efficiently.
Stabilized blendshape editing using localized Jacobian transpose descent
An approach to reducing unexpected movements during blendshape weight editing based on using the transpose of the Jacobian matrix rather than the pseudo-inverse, which results in smoother and more reliable facial expressions during the weight editing process.
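The Jacobian-transpose idea named in this title is a standard trick from inverse kinematics; below is a minimal, hypothetical sketch of how it differs from a pseudo-inverse solve in direct-manipulation weight editing (illustrative names only, not the paper's actual solver, which also localizes the update):

```python
import numpy as np

def transpose_step(J, residual, step=0.1):
    # Gradient-style update dw = step * J^T r. Unlike the pseudo-inverse
    # update dw = pinv(J) @ r, it never divides by small singular values,
    # so ill-conditioned directions are damped instead of amplified --
    # the stability property the title alludes to.
    return step * (J.T @ residual)

def pose_vertex(J, w):
    # For a linear blendshape model the controlled vertex moves as J @ w.
    return J @ w

# Usage: drag a controlled vertex toward a target position.
J = np.array([[1.0, 0.0],
              [0.0, 2.0]])   # vertex-position Jacobian w.r.t. weights
target = np.array([1.0, 1.0])
w = np.zeros(2)
for _ in range(200):
    w += transpose_step(J, target - pose_vertex(J, w))
```

The trade-off is that the transpose step converges more slowly than a pseudo-inverse solve, but each iteration is cheap and never produces the large weight jumps that small singular values can cause.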
Intuitive Facial Animation Editing Based On A Generative RNN Framework
This paper designs a generative recurrent neural network that generates realistic motion in designated segments of an existing facial animation, optionally following user-provided guiding constraints, and demonstrates the usability of the system on several animation editing use cases.
Expression transfer: A system to build 3D blend shapes for facial animation
This paper presents a complete pipeline to create a set of blend shapes in different expressions for a face mesh having only a neutral expression, and solves an optimization problem to consistently map the deformation of the source blend shapes to the target face model.
Generate Individually Optimized Blendshapes
This paper presents a novel approach to automatically generate individually optimized blendshapes from real-time captured facial expressions with two methods: linear regression and an autoencoder.
A Robust Interactive Facial Animation Editing System
This work proposes a new learning-based approach to easily edit a facial animation from a set of intuitive control parameters that uses a resolution-preserving fully convolutional neural network to map control parameters to sequences of blendshape coefficients.
Sketching Manipulators for Localized Blendshape Editing
It is shown that localized blendshape direct manipulation can reduce the time-consuming blendshape editing process to an easy freehand stroke drawing, and that a wide range of facial poses on various models can be created rapidly using the method.

References

Showing 1–10 of 61 references
Example-based facial rigging
We introduce a method for generating facial blendshape rigs from a set of example poses of a CG character. Our system transfers controller semantics and expression dynamics from a generic template to…
Performance Driven Facial Animation using Blendshape Interpolation
This paper describes a method of creating facial animation using a combination of motion capture data and blendshape interpolation, which is effective even when the motion capture actor and the target model have quite different shapes.
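The motion-capture-to-blendshape mapping described here can be sketched as a per-frame least-squares fit of weights to marker displacements. A toy illustration under assumed names (real systems add regularization, non-negativity, and temporal smoothing):

```python
import numpy as np

def fit_weights(M, neutral_markers, captured_markers):
    """M: (3k, m) marker displacements produced by each blendshape target.
    Solve M @ w ~ captured - neutral for one frame, then clamp weights to
    the conventional [0, 1] blendshape range (a common heuristic)."""
    delta = captured_markers - neutral_markers
    w, *_ = np.linalg.lstsq(M, delta, rcond=None)
    return np.clip(w, 0.0, 1.0)
```

Because the fit is driven only by marker displacements, the same solve works even when the capture subject and the target model have quite different shapes, which is the point the summary above makes.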
Learning controls for blend shape based realistic facial animation
This paper proposes an automatic, physically-motivated segmentation that learns the controls and parameters directly from the set of blend shapes and provides a rendering algorithm to enhance the visual realism of a blend shape model.
Expression cloning
This work presents a novel approach to producing facial expression animations for new models that takes advantage of existing animation data in the form of vertex motion vectors and provides a new alternative for creating facial animations for character models.
Performance-driven muscle-based facial animation
A system to synthesize facial expressions by editing captured performances is described; it uses the actuation of expression muscles to control facial expressions, and applies the original performance data to different facial models with equivalent muscle structures to produce similar expressions.
Spacetime faces: high resolution capture for modeling and animation
An end-to-end system that goes from video sequences to high-resolution, editable, dynamically controllable face models is described, along with new tools that model the dynamics in the input sequence to enable new animations created via key-framing or texture-synthesis techniques.
Transferring the Rig and Animations from a Character to Different Face Models
A facial deformation system that allows artists to define and customize a facial rig and later apply the same rig to different face models and obtain unique expressions is introduced.
Performance-driven facial animation
A means of acquiring the expressions of real faces, and applying them to computer-generated faces is described as an "electronic mask" that offers a means for the traditional talents of actors to be flexibly incorporated in digital animations.
Face poser: Interactive modeling of 3D facial expressions using facial priors
An intuitive and easy-to-use system for interactively posing 3D facial expressions is presented, combining the user's input with priors embedded in a large set of facial expression data; maximizing the posterior generates an optimal and natural facial expression that achieves the goal specified by the user.