The LipSync component handles playing back LipSync animations on characters. An instance of it stores all the required information about a character, including the various Phoneme/Emotion poses and Gesture animations.
The in-editor inspector for this component also contains the editor for creating poses. For more info on this, see LipSync Inspector.
A GameObject that has a LipSync component attached is referred to as a character. The component can technically go on any GameObject, but for the sake of organisation it is usually placed on the parent object of the meshes, sprites or other objects that actually represent the character.
A few things are required for the component to work correctly:
A Blend System needs to be assigned and set up.
An AudioSource needs to be assigned.
Poses need to be set up.
The rest of the process is covered on the LipSync Inspector page.
There are two ways of starting to play an animation from a LipSyncData clip:
Play On Awake Setting
The simplest method is to enable the 'Play on Awake' checkbox on the LipSync inspector. When this is on, you are given an object field to drag a saved LipSyncData clip into, and a default delay time option. This will cause the LipSync component to play the chosen clip as soon as the scene starts playing, much like an Audio Source.
This is the most common method of playing an animation. Get a reference to a LipSync component, either by exposing a public field (`public LipSync lipSyncCharacter;`) or by using `GetComponent`. Then call the `.Play(LipSyncData clip, float delay)` method on it, passing a reference to a LipSyncData clip to play and, optionally, a delay in seconds before the clip will start playing.
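The call described above can be sketched as follows. The `RogoDigital.Lipsync` namespace, the `DialogueTrigger` class name, the `greetingClip` field and the 0.5-second delay are illustrative assumptions, not requirements of the API:

```csharp
using UnityEngine;
using RogoDigital.Lipsync; // namespace assumed from the LipSync asset

public class DialogueTrigger : MonoBehaviour
{
    // Expose a reference to the character's LipSync component in the inspector.
    public LipSync lipSyncCharacter;

    // A saved LipSyncData clip, also assigned in the inspector.
    public LipSyncData greetingClip;

    void Start()
    {
        // Alternatively, fetch the component from the same GameObject:
        // lipSyncCharacter = GetComponent<LipSync>();

        // Play the clip after a 0.5 second delay.
        lipSyncCharacter.Play(greetingClip, 0.5f);
    }
}
```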
Additionally, there are integrations with several third-party assets that provide other ways to begin playing, such as through PlayMaker, NodeCanvas and Adventure Creator. Information on these can be found on the Integrations page.
As well as playing back LipSyncData clips containing phonemes, emotions and/or gestures, you can manually tell LipSync to transition into a named emotion when not playing a clip.
This can be useful for creating reactions to in-game events, or for adding more interest to idle animations. Simply call the `.SetEmotion(string emotionName, float blendTime)` method on any LipSync component, passing in the name of any emotion defined in the Project Settings. The character will then blend from its current emotion (or neutral) into the chosen emotion over the course of the blendTime value passed in.
If a clip is then played, it will transition from the currently set emotion. You can also call `.ResetEmotion(float blendTime)` to blend back into the character's neutral face.
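As a sketch of the above, a reaction to an in-game event might look like this; the emotion name "Happy", the class and method names, and the blend times are illustrative, and the emotion must exist in your Project Settings:

```csharp
using UnityEngine;
using RogoDigital.Lipsync; // namespace assumed from the LipSync asset

public class ReactionExample : MonoBehaviour
{
    public LipSync character;

    // Called from some in-game event (illustrative).
    public void OnGoodNews()
    {
        // Blend into the "Happy" emotion over 0.6 seconds.
        character.SetEmotion("Happy", 0.6f);
    }

    public void OnCalmDown()
    {
        // Blend back to the neutral pose over 1 second.
        character.ResetEmotion(1f);
    }
}
```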
LipSync includes a UnityEvent called `onFinishedPlaying`, which is invoked whenever a clip ends, either naturally or because `.Stop(bool stopAudio)` was called. Listeners can be added from the editor or from code. See Unity's documentation on UnityEvents for more info.
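Adding a listener from code follows the standard UnityEvent pattern; a minimal sketch, with the class and handler names as assumptions:

```csharp
using UnityEngine;
using RogoDigital.Lipsync; // namespace assumed from the LipSync asset

public class FinishedListener : MonoBehaviour
{
    public LipSync character;

    void OnEnable()
    {
        // Subscribe from code; listeners can also be added in the inspector.
        character.onFinishedPlaying.AddListener(HandleFinished);
    }

    void OnDisable()
    {
        // Unsubscribe to avoid calls on a disabled object.
        character.onFinishedPlaying.RemoveListener(HandleFinished);
    }

    void HandleFinished()
    {
        Debug.Log("LipSync clip finished playing.");
    }
}
```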
The `.loop` field is a boolean that determines whether played clips will loop back to the start when they finish. If it is true, `onFinishedPlaying` will still fire at the end of each loop, but the clip will never stop of its own accord; `.Stop(bool stopAudio)` must be called manually, or `.loop` set back to false.
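For example, looping a clip and later stopping it manually might look like this; `character` and `idleChatterClip` are illustrative references:

```csharp
// Loop the clip indefinitely; onFinishedPlaying still fires each time round.
character.loop = true;
character.Play(idleChatterClip, 0f);

// Later, when the loop should end (true also stops the audio):
character.Stop(true);
```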
The `.keepEmotionWhenFinished` boolean controls what happens to emotions at the end of a clip. If a clip ends with an emotion that hasn't blended out and this value is false, the emotion will transition back to neutral after the clip finishes. If it is true, however, the emotion will stay active on the character, and can be reset later with the `.ResetEmotion(float blendTime)` method.
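Put together, holding a clip's final emotion and releasing it later might be sketched like this; the component and clip references are illustrative:

```csharp
// Keep whatever emotion the clip ends on active after it finishes.
character.keepEmotionWhenFinished = true;
character.Play(sadLineClip, 0f);

// ...then, at some later point, blend back to neutral over 0.8 seconds.
character.ResetEmotion(0.8f);
```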