The process for creating believable facial animation is time-consuming and tedious. We have developed a tool set that dramatically reduces the time spent generating a high-quality facial animation setup with natural, non-linear skin deformations. The innovation of our approach lies in the unconventional use of facial motion capture data and its adaptation. The Adaptable Facial Setup tool is an evolution of the 2004 SIGGRAPH Sketch “Adaptable Setup For Performance Driven Facial Animation”.
As the rigging artist/TD you will be provided with a set of facial Feature Point Locators that you have to position on the face geometry you intend to rig. Each Feature Point Locator has a point- and orient-constrained joint belonging to it. After binding all facial joints to the geometry you need to paint skin weights for the influences. Alternatively, you can use the Geometry Matcher / Skin Weights Cloner (included in the Facial Animation Toolset) to clone skin weights from another geometry. Most of the facial movements you can control and animate follow the Facial Action Coding System (FACS) developed by Paul Ekman. For the best results it is recommended to fully understand this system; one useful resource is the Facial Expression Repertoire. The next step is to load and connect the Motion Data. You can immediately see the results when moving sliders in the AFS Animation Control layout. The Adaptable Facial Setup GUI provides powerful functions to adapt the facial movements as desired. The AFS is geometry independent and works for all humanoid characters.
The installer will automatically create a Maya shelf named “Facial_Animation_Toolset” containing:
Click on the Maya shelf button labeled “AFS” to start the tool. The AFS can be used to either create an entirely new setup or to adapt facial movements in an existing setup.
The AFS tool GUI has a menu bar (1-4) and three different main frames. They are named:
Following is a short introduction of the GUI functions.
(1) the Data menu holds functions to load and connect the Motion Data
(2) set the global Maya key interpolation to Linear, Spline, Clamped or Flat
(3) various functions to enhance the animation setup
(4) web based help on Animation Controls and general information
(5) the Feature Point Positioning functions are only needed at the beginning of a new setup creation, so this frame is closed by default.
(6) enter/leave the Global Positioning mode
(7) enter/leave the Individual Positioning mode
(8) mirror the selected Feature Point locators
(9) the Data Adaptation functions
(10) the name of the Animation Control currently loaded
(11) load/unload an Animation Control
(12) select the current Animation Control
(13) enable/disable default key creation when loading the Animation Control. Unload (11) will remove the keys.
(14) Reset removes all Adaptation and Rotation nodes and sets the scale back to 1.0
(15) Mirror looks for the related asymmetrical Animation Control and uses its adaptation information.
(16) list of involved Feature Point Locators and their corresponding Motion Data Nodes (X/Y/Z)
(17) Scale applied to Motion Data Node (16)
(18) Adaptation values for Motion Data (16)
(19) Rotation values for Motion Data (16)
(20) Motion Data X,Y,Z selection filter
(21) Adaptation Data X,Y,Z selection filter
(22) Rotation Data X,Y,Z selection filter
(23) Query the current Feature Point selection corresponding to the selection filters (20,21,22)
(24) (re-) select the current node selection from lists (16,18,19)
(25) Scale value (17) for the selected Motion Data Node(s) (16), causes update in list (17)
(26) Set Adaptation Key for selected Feature Point Locator(s), causes an update in list (18)
(27) Set Rotation Key for selected Feature Point locator(s), can cause an update in list (19)
(28) Status messages on the performed operations
Note that functions that do not make sense in the current state of the process will not be available; their corresponding controls in the GUI are disabled (greyed out).
Each Animation Control represents a slider with a 0-100 intensity range. The Animation Controls are needed during the setup creation process, and the same objects are later used by the animator to keyframe the facial animation. You can think of these objects as sliders in the blendshape animation window.
Red Animation Controls indicate symmetrical facial movements.
Blue Animation Controls indicate asymmetrical facial movements.
The layout is organised in the following sections (top down from left to right): eyes and lids, upper face, lower face, phonemes, extra actions and emotional states. Facial actions can be a combination of various Animation Control intensities, but be careful not to use too many at the same time. You can move the Animation Control root object named “Slider” to any desired position in your scene.
In this section you will learn how to use the AFS functions to create a facial animation rig.
Load the first tutorial scene by clicking the “Tut1” shelf button. Have a look at the objects in the scene:
If you move any of the Animation Control sliders there will be no effect. Once you load and connect Motion Data you can see the first results. We will do this in a few moments.
Click on the AFS shelf button to start the tool. You can close the Data Adaptation frame and open the Feature Point Positioning frame; this saves screen space.
Enter the Global Positioning mode by clicking its button, which will turn orange. You must now globally position and scale the Feature Point Locator cloud to the “Hank” geometry. The “BaseMesh” object displays the Feature Points in their appropriate positions (Guides). It is impossible to fully match the Feature Points at this stage; use position and scale for the best result possible. Click Global Positioning again to leave this mode; the button will turn grey again.
To fully align the Feature Point Locators, enter the Individual Positioning mode by clicking its button (it will turn orange). To make it easier to identify the corresponding Guide Locator(s) on the reference object, they are displayed in red. Use the move tool in combination with snap to points and snap to curves to align the Feature Point Locators to their appropriate positions. Optionally, duplicate (without input graph) and “make live” a copy of the geometry you intend to rig (“Hank”). After positioning all Feature Points the copy is no longer required and can be deleted.
The Mirror Selection button allows you to mirror the position of the current selection to the other side.
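Conceptually, mirroring reflects a locator's position across the character's sagittal (YZ) plane. A minimal plain-Python sketch of the idea, assuming the face is modeled symmetrically around X = 0 (the function and values are illustrative, not the tool's actual implementation):

```python
# Illustrative sketch: mirror locator positions across the YZ plane.
# Assumes the face is centered on X = 0; names and values are hypothetical.

def mirror_position(position):
    """Reflect an (x, y, z) position to the other side of the face."""
    x, y, z = position
    return (-x, y, z)

# Example: a locator on the left cheek maps to the right cheek.
left_cheek = (3.2, 5.0, 1.4)
print(mirror_position(left_cheek))  # (-3.2, 5.0, 1.4)
```

If the geometry is not perfectly symmetrical, a mirrored position will only approximate the surface on the other side and may need a small manual touch-up.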
When satisfied leave the Individual Positioning mode by clicking its button one more time (will turn grey again).
You can take a look at tutorial scene 2 (“Tut2”), where the Feature Points are already aligned appropriately.
If you work on your own scene you have to bind the facial joints to your geometry and paint skin weights. Well-painted weights are crucial for a good result. If you decide to work on the provided scene, just click the tutorial scene 2 (“Tut2”) shelf button to load the second tutorial scene. This scene does not contain the “BaseMesh” and “Guides” objects; they are no longer needed.
In the “Data” menu select “Load Motion Data”. The path in the popup window is predefined and should point to a folder containing the Motion Data. If not, you can specify it manually. Click “OK” to load the Motion Data. The status frame (28) will report the progress.
Next in the “Data” menu select “Build Connections”. The status frame (28) will update the progress. You have now loaded and connected the basic Motion Data.
Select any of the Animation Controls and examine the result of a value change. What you see is unadapted Motion Data driving the Feature Point Locators. As you examine the effect of the Animation Controls you will see that some deformations work very well, others don’t. In chapter 4.3 you will learn which functions you can use to adapt the data to improve the deformations.
Data adaptation allows you to adjust the facial movements caused by the Motion Data by scaling and setting Adaptation and/or Rotation keys. It must be mentioned again that this can only be successful if the influence weights are painted precisely.
Load the prebuilt facial rig (“Hank”) for the next exercises. The Motion Data in this scene has already been adapted. If you have fully understood the functions of the AFS you can use your scene from 3.2 to adapt the facial movements from scratch.
A good example for Motion Data scaling is the Animation Control named “NosW_CAU9”; select it and click Load (11) in the AFS tool. Once the Animation Control data is loaded, button (11) is labeled Unload. The lists (16-19) are now dynamically filled with values and enabled or disabled X/Y/Z buttons. If you scrub through the timeline you can see the movement of the currently loaded Animation Control (10). The reason for the timeline feedback is the enabled Default Key checkbox (13).
The Motion Data scale list (17) shows a value of 0.7 for most nodes (16). This means that the Motion Data has been scaled down by 30%. Set the display layer “Feature Points” to visible and examine the movement of the Feature Point Locators while scrubbing in the timeline. Select any Feature Point Locator(s) and click on the Query button (23). The corresponding node buttons in the list (16) will be selected. Open the graph editor to see the data curves for the current selection.
Go to frame 100, then use the scale function (25) to see the results of scaling the selected Motion Data in the viewport, the graph editor and the list (17). You can always undo the scaling by clicking the Reset button for the current selection. When scaling Motion Data values make sure that no other nodes (Adaptation or Rotation) are selected.
Now what if you only want to scale the Y axis data for the five chin Feature Point Locators? Select those objects and use the selection filter (20) to disable the X and Z axis, next click on Query (23). Only the Y axis Motion Data nodes of the selected objects are selected. The applied scale will only affect the current node selection. You can also manually select data nodes in the list (16).
Think of the Motion Data scale function as the first tool to use when the pure Motion Data looks too weak or too strong. Be careful not to scale the movement down too much or you may lose the non-linearity of the data.
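A back-of-the-envelope way to picture Motion Data scaling: every keyframe value of a data curve is multiplied by one factor, so the amplitude changes while the curve's non-linear shape survives. The following plain-Python sketch is purely illustrative (the curve samples, the 0.7 factor and the function name are invented; this is not the tool's implementation):

```python
# Illustrative sketch: uniform scaling of a motion data curve.
# The (intensity, value) pairs and the 0.7 factor are hypothetical examples.

def scale_curve(curve, factor):
    """Scale every keyframe value; the curve's non-linear shape is preserved."""
    return [(intensity, value * factor) for intensity, value in curve]

# A non-linear translation curve sampled at intensities 0, 50 and 100.
motion_curve = [(0, 0.0), (50, 0.8), (100, 1.0)]
scaled = scale_curve(motion_curve, 0.7)
print(scaled)
```

Because every key is scaled by the same factor the relative shape of the movement is preserved; scaling very close to 0, however, flattens the amplitude until the non-linearity is no longer visible.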
Examine the Motion Data scale of the various Animation Controls in the prebuilt “Hank” scene.
To further adapt the movements of each Animation Control you can set so-called Adaptation Keys. An Adaptation Key can be thought of as a controlled scale of the original Motion Data, affecting its value at a specific intensity.
In the prebuilt “Hank” scene, load the Animation Control “L_Dimpl_LAU14” by selecting it and pressing the Load button (11). Click button (14) to reset all values. This sets the scale of all Motion Data nodes back to 1.0; all Adaptation and Rotation nodes are removed. The Reset function is useful if you want to start adapting a specific Animation Control from scratch.
If you scrub the timeline (after Reset) you will notice that the lips part. For this Animation Control, however, we want the lips to stay closed. Furthermore, the left mouth corner's X movement appears somewhat too strong. Go to frame 100, select the Feature Point Locator named “Marker_RLLP” and position it so that the lower lip does not part from the upper lip. Then press button (26) to set an Adaptation Key.
The GUI list (18) will update and allow you to query (23) or select (directly) the new nodes. Set Adaptation Keys for “Marker_LLLP” and “Marker_LBLP”.
Note that the adapted Feature Point Locators will keep their motion characteristic adapted to the new positions.
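One way to picture what an Adaptation Key does: the locator's data curve is rescaled so that it reaches the newly keyed position at the keyed intensity, while the intermediate values keep the original motion characteristic. A simplified sketch under the assumption that the adaptation acts as a single ratio along the whole curve (names and numbers are hypothetical, not the tool's internals):

```python
# Illustrative sketch: an Adaptation Key as a shape-preserving rescale.
# The curve values and the locator scenario are invented for illustration.

def adapt_curve(curve, key_intensity, new_value):
    """Rescale a motion curve so it reaches new_value at key_intensity,
    preserving the relative (non-linear) shape of the movement."""
    original = dict(curve)[key_intensity]
    ratio = new_value / original
    return [(intensity, value * ratio) for intensity, value in curve]

# The original lower-lip Y curve parts the lips too far at intensity 100.
lip_curve = [(0, 0.0), (50, 1.6), (100, 2.0)]
# Adaptation Key at intensity 100: the lip should only move 1.0 units.
print(adapt_curve(lip_curve, 100, 1.0))  # [(0, 0.0), (50, 0.8), (100, 1.0)]
```

Note how the intermediate key at intensity 50 still reaches 80% of the end value, just as in the unadapted curve; only the amplitude has changed.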
You can set Adaptation Keys for any intensity except 0. If you work with the Default Key function (13), the current frame in the timeline represents the Animation Control intensity. It is recommended to always set a key at intensity 100 first. Never delete keys manually in the graph editor; instead, select the keys and set their values to 0. To get a better understanding of the desired result of an Animation Control you can use the Action Unit Explanation and the Action Unit Demo Video functions found in the Help menu (4). These functions open a web browser window to the Facial Expression Repertoire hosted on our website. To access this site you first need to create a user account (registration is free). You might also have noticed that the Mirror button (15) is enabled. This is because the Animation Control “L_Dimpl_LAU14” is asymmetrical. After setting your first Adaptation Keys, click the Mirror button (15). This function first resets the current Animation Control and then applies all mirrored scale, adaptation and rotation values from its related asymmetrical Animation Control, in this case “R_Dimpl_RAU14”.
Rotation Keys work pretty much the same way Adaptation Keys do, except that there is no rotation Motion Data. By changing the rotation value of a Feature Point Locator and pressing button (27) you will set a Rotation Key updating list (19).
Rotation Keys are used to further enhance the facial movements. Examine the prebuilt “Hank” scene to learn more about the use of Rotation Keys.
If you start rigging your own characters you may want to make use of the “base” Maya shelf button. Clicking this button imports all necessary objects to start creating an AFS setup on a new geometry. Notice that you need to detach the skinCluster deformer from the reference geometry.
This video shows how a character is prepared for rigging with the AFS tool:
The next step would be to smooth bind the geometry to the joints.
Optionally you can also use the Geometry Matcher and Skin Weights Cloning (video).
tune AFS skin weights
This video shows how to tune skin weight contributions for use with the AFS tool.
The basic weight distribution has been cloned using the Geometry Matcher tool.
See also this video explaining the Skin Weights Cloning workflow.
Notice how deformations are continuously tested by scrubbing in the Time Slider (with an AFS Animation Control loaded through the AFS tool GUI). This is also useful when moving joints in 3D space to quickly evaluate the weight contributions: just step one frame forward or backward and the joints will be realigned to the positions driven by the Motion Data.
Skin deformation in this example is limited to 4 influence objects (optimized for real-time export).
Weights are tuned for one side only and then copied using 'Mirror Skin Weights' (problems in the lip area might occur).
Use this function to delete all Motion Data, Adaptation and Rotation data nodes.
The Motion Data is very dense and simplification can decrease your file size.
See Chapter 5: Updates to Version 1.1 for more details.
This function adds a Blender attribute to the currently loaded Animation Control. The Blender attribute allows you to choose between the non-linear adapted movements (Blender = 1), completely linear movements (Blender = 0), or a mixture of both (e.g. Blender = 0.5).
But why would you want to use linear movements?
There could be stylistic reasons, or the simple fact that an Animation Control like the Phoneme P cannot return from 100 to 0 (zero) intensity without pressing the lips together. With the Blender attribute set to 1 you would animate the Phoneme P from 0 (zero) to 100 intensity, then set the Blender to 0 and return to 0 intensity.
While a Blender attribute exists you must not scale or adapt any data. If you have to, simply remove the Blender, perform the desired changes, and recreate the Blender.
Notice that the Blender can only be created if all adaptation curves have a key at 100% intensity.
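Conceptually, the Blender attribute is a linear interpolation between the adapted non-linear curve and a straight line from 0 to the same end value. A hypothetical sketch of that mix (the curve values are invented for illustration; the real attribute operates on the connected Motion Data, not on plain numbers):

```python
# Illustrative sketch: mixing non-linear and linear movement via a blender value.
# Function name and sample values are hypothetical.

def blend_value(intensity, nonlinear_value, end_value, blender):
    """Mix the adapted non-linear value with a purely linear ramp.
    blender = 1.0 -> fully non-linear, 0.0 -> fully linear."""
    linear_value = (intensity / 100.0) * end_value
    return blender * nonlinear_value + (1.0 - blender) * linear_value

# At intensity 50 the non-linear curve already reaches 0.8 of end value 1.0.
print(blend_value(50, 0.8, 1.0, 1.0))  # 0.8  (non-linear)
print(blend_value(50, 0.8, 1.0, 0.0))  # 0.5  (linear)
print(blend_value(50, 0.8, 1.0, 0.5))  # 0.65 (mixture)
```

This also explains the Phoneme P trick above: with Blender = 0 the return from 100 to 0 follows the straight line and never passes through the lip-pressing portion of the non-linear curve.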
This function sets all Animation Controls to a value of zero.
Loads a prepared set of OnFace Animation Controls aligned to the specific region they deform.
Toggles between the activation of the OnFace and Orthogonal Animation Controls. Inactive Controls are greyed out.
Use this function to import a clean Animation Control with no connections.
Prior to creating a Custom Animation Control you will need some animation that consists of several Animation Controls and/or additional offset animation.
To get a better understanding of what this function can do for you, load the “Tut3” file. Several animations have been combined to create a new facial action. Import an empty Animation Control (see 5.1.6 above), select the new Animation Control and press the Load button in the AFS tool. The “Create Custom Animation Control” item in the Tools menu is now enabled. Select the Offset Locators you want to include in your new Animation Control and choose “Create Custom Animation Control”. All deformation information is now baked into the new Animation Control. You can use scale, data adaptation and rotation as usual. Notice the rotation information on “Offset_RTLP” and “Offset_LTLP”; it has been derived from the offset animation's rotation.
Always remember to set your frame range to 0-100 when creating Custom Animation Controls.
With this functionality you can now create any facial action that you can think of.
This option allows you to automatically remove any animation on all present Animation Controls and Offset Locators after creating a Custom Animation Control.
This is a simple renaming function for all Maya objects in the current scene to remove a name prefix.
The Shape Baker function allows you to build shapes based on the AFS deformation for further use in other applications.
If your scene contains any character sets, please delete them prior to using the Shape Baker.
If this option is enabled the AFS will automatically set a specific selection mask when an Animation Control is loaded (11). Unload (11) will set the mask back to all objects.
Enable/Disable the display preference Affected Highlighting. Default is Affected Highlighting off.
The Locator selection will be maintained after setting Adaptation or Rotation Keys.
This function opens a web browser window to the Facial Expression Repertoire hosted on our website. To access this information you need to register first (registration is free). Not all Animation Controls have an Action Unit Explanation.
This function works like the Action Unit Explanation showing Demo Videos.
Data Adaptation (scale, adaptation and rotation) allows you to adapt the Feature Point Locators to the desired movement of a facial action. When animating the Animation Controls you may still want to manually influence a specific Feature Point. For this purpose you can use the objects organized in the “FeaturePoints_Offset” display layer. You can always create offset animation on those objects.
Keep in mind that the offset value needs to return to 0, otherwise the offset will be maintained. Use the information on this page to get a better understanding of the desired facial action.
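The rule that offset animation must return to 0 can be checked mechanically. A tiny illustrative sketch (the curve format and function name are made up for illustration, not part of the toolset):

```python
# Illustrative sketch: verify that an offset curve returns to its rest value.
# Curves are hypothetical (frame, value) pairs.

def offset_is_clean(offset_curve):
    """An offset curve must end back at 0; otherwise the offset persists."""
    return offset_curve[-1][1] == 0.0

good = [(0, 0.0), (10, 0.3), (20, 0.0)]  # returns to rest
bad = [(0, 0.0), (10, 0.3)]              # an offset of 0.3 is maintained
print(offset_is_clean(good))  # True
print(offset_is_clean(bad))   # False
```

A lingering non-zero offset would silently shift the Feature Point for the rest of the animation, which is usually not the intent.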
The eyelid movements and gaze controls must be treated separately. They look like common Animation Controls but have no real Motion Data applied. Furthermore, some have an intensity range from +100 to -100.
Examine the prebuilt “Hank” scene to learn more about these Animation Controls. For now, ignore the fact that there are Motion Data nodes with no real values. The idea is that you can stay in the workflow of setting Adaptation and Rotation Keys to create non-linear movements.
An Animation Control with a -100 to +100 intensity range is represented by two rows for every list entry (16-19) in the AFS tool GUI (see screenshot). If you query (23) a Feature Point Locator, the current frame value/intensity determines whether the +100 space (first row) or the -100 space (second row) is addressed. Setting Adaptation and Rotation Keys thus occurs in one of these spaces.
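The two rows per list entry can be pictured as two independent half-ranges selected by the sign of the current intensity. A purely illustrative sketch of that selection (not the tool's code):

```python
# Illustrative sketch: which adaptation space a keyed intensity addresses
# for a -100 to +100 Animation Control. Names are hypothetical.

def select_space(intensity):
    """Positive intensities key the +100 space (first row),
    negative intensities the -100 space (second row)."""
    if intensity > 0:
        return "+100 space (first row)"
    if intensity < 0:
        return "-100 space (second row)"
    raise ValueError("Adaptation Keys cannot be set at intensity 0")

print(select_space(75))   # +100 space (first row)
print(select_space(-40))  # -100 space (second row)
```

Intensity 0 is excluded here for the same reason as for ordinary Animation Controls: Adaptation Keys cannot be set at intensity 0.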
Notice that by default the AFS tool only uses one joint for each eyelid. This is far from enough to achieve believable eyelid deformations. The prebuilt “Hank” scene, however, uses a combination of the existing AFS joints to drive a more complex eye rig underneath. Examine the scene for further information.
The symmetrical (red) Animation Controls for the jaw movements cannot be loaded by the AFS tool; it will display the warning “Selected Object is not a valid AFS Animation Control” in the status line (28). The symmetrical (red) Animation Controls only drive the blue asymmetrical Animation Controls, which are meant to be loaded and adapted by the AFS tool.
The animator will only need the symmetrical (red) Animation Controls; therefore the asymmetrical objects can be hidden when the character setup is finished.
In Version 1.1 the global scale of the Motion Data is handled differently. If you start creating a new rig using Version 1.1 you do not need to take any special steps.
To update a scene from V1.0 to V1.1, use the update menus as shown in the first image (they only appear in V1.0 scenes).
This update was necessary to retrieve the exact rotation values for data adaptation when exporting to a game engine. After the update all rotation curves are baked (sampled).
To retrieve a selection of all rotation data curves you can use the Tools/Select Rotation Data menu. The “simplify curve” command can then be used to get rid of redundant keys (see image sequence left).
Version 1.1 also includes additional Animation Controls (LipTight AU23). Use the appropriate menu function to update your Version 1.0 scene. You will also need to adapt the movements of the new Animation Controls. The update function will rearrange your slider layout as displayed on page 4.
Please note that any changes to names and hierarchies will interfere with the update procedures and very likely cause them to fail.
Q: What is the AFS?
A: The AFS is a toolset for creating a sophisticated facial animation rig for any humanoid character.
Q: How does the AFS work?
A: The AFS provides the rigging artist/technical director with powerful tools to adapt a library of up to 100 Motion Data clips to the animated character's face.
Q: Why would I use the AFS instead of traditional Blendshape animation?
A: The Motion Data contains non-linear movement distilled from real facial movement; it is therefore much more believable than linear blendshape animation.
Q: Can you describe the Facial Animation System the AFS uses?
A: The animation system is based on Paul Ekman's research, also known as the Facial Action Coding System (FACS). This definition of facial movements has been revised by the AFS authors to better fit animation needs. The AFS also allows offset animation for each of the Feature Point Locators.
Q: Does the AFS only work for realistic faces?
A: No. The Motion Data can be adapted to abstract faces and creatures as well, as long as they have humanoid features. See the demo movie on Geometry Independence on our website.
Q: The AFS requires very fine weight painting, which can be very tedious. Is there a procedure that can speed up this process?
A: Yes. The Geometry Matcher tool for Maya can clone skin weights. This can provide you with a very good starting point. The Geometry Matcher is part of the Facial Animation Toolset.
Q: I have problems understanding the Animation Controls based on the Facial Action Coding System. What can I do?
A: We have created a Facial Expression Repertoire that describes various facial expressions and the relation to the FACS.
Q: Can I mix the AFS with Blendshapes?
A: Yes. Make use of the Corrective Blendshape Manager tool for Maya to create and manage blendshapes calculated to correct a specific mesh deformation by bones. The Corrective Blendshape Manager is part of the Facial Animation Toolset.
Q: I have problems finding the appropriate Animation Control values; the face looks too exaggerated.
A: You cannot add up a Smile, an Open Smile and the Right Mouth Corner Puller all at 100%. This will cause unnatural movement. Spend some time studying the blending behavior of various Animation Controls.
Q: Is there a Mac version of the Facial Animation Toolset?
A: Unfortunately not, but here is some information for a workaround. Only the Geometry Matcher (GM) and Corrective Blendshape Manager (CBS) tools require OS-dependent plugins. The AFS can be run on a Mac by changing the path in the AFS.mel file. The plugins used in GM and CBS are not persistent in the scene. This means that you can do your rigging work on a supported platform (Win or Linux), save your scene, and open it on a Maya for Mac (or Maya 64-bit) system.
Q: I am experiencing strange GUI behavior with the AFS tool and Maya 2011 or higher.
A: To fix this problem, try disabling the “Windows: remember size and position” option in Window→Settings/Preferences→Preferences→Interface.