
UE Facial Capture (Recording)

About

This guide covers using Unreal Live Link to record ReadyPlayer.Me facial animations with FaceID using one of the lab's iPads. By the end, you should have a working facial animation that can be paired with body animations and audio.

I chose ReadyPlayer.Me for several reasons: it offers character customization, standardized face morphs/blend shapes, and a standardized character bone rig, among other benefits.

This tutorial does not cover importing custom characters. A future guide may cover that topic.

Any Apple device with FaceID can be used. This guide uses the SCiL iPads.

Getting Started

Verify connection

It is IMPORTANT that both the workstation and the iPad are on the SAME NETWORK.

For the workstation, open a command-line prompt and type in:

ipconfig

Look for IPv4 Address. This is the address that you will enter into the iPad.

[Screenshot: command-line output showing the IPv4 Address]
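If you want a quick cross-check of the address ipconfig reports, a short Python snippet run on the workstation will ask the OS which local IPv4 it would use for outbound traffic. This is only a convenience sketch; the ipconfig value is the one to trust.

import socket

# Open a UDP socket and "connect" to a public address; no packets are sent,
# but the OS selects the local interface/IP it would route through.
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect(("8.8.8.8", 80))
print("Workstation IPv4:", s.getsockname()[0])
s.close()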

When you are ready, launch Live Link on the iPad. Select Live Link (ARKit) and press Continue. Press the [gear] icon in the top left, then tap Live Link under Streaming. You can leave the Subject Name alone, but press Add Target. Enter the IP address you obtained from the command line in the previous step and 11111 for the Port. Exit the menus and return to the full scene where your face is being recorded.
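If the iPad never shows up in Unreal later on, a common culprit is the Windows firewall blocking the inbound stream. As a quick diagnostic, close the Unreal editor (so the port is free) and listen on the port yourself to confirm that packets arrive from the iPad. This sketch assumes the app streams over UDP to the port you configured (11111 here):

import socket

# Bind to the Live Link port on all interfaces (close Unreal first so the port is free).
PORT = 11111
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))
sock.settimeout(30)

print(f"Listening on UDP {PORT}... point the iPad at this machine and watch for packets.")
try:
    data, addr = sock.recvfrom(4096)
    print(f"Received {len(data)} bytes from {addr[0]} -- the network path is fine.")
except socket.timeout:
    print("No packets in 30 s -- check the network, IP address, and firewall rules.")
finally:
    sock.close()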

Open the Unreal Project if it is not already open. Open the Live Link window, located under Window/Virtual Production/Live Link. If everything is running properly, you should see the iPad under the Subject Name area. The dot next to it is yellow when it is NOT tracking your face and green when it is. You can close the Live Link window.

Verify everything is working

Take a moment to verify everything is working. While using the iPad, you should notice your ReadyPlayer.Me character is now following your face, including your eyes!

From here, I like to run through various facial expressions or even read lines and watch how the character mirrors the movement, whether it is you doing the recording or an actor. It is worth trying the character out at this point so you can adjust your own facial expressions to produce the result you want when recording.

Recording

In the Take Recorder window, you will notice a [+ Source] button. Press this button followed by From Live Link and then select the iPad source. The iPad should appear in the source window. Click on the iPad (not the yellow toggle) to review details in the panel underneath. Uncheck Use Source Timecode. You can also change the Subject Name to your character's name.

[Screenshot: Take Recorder source details panel]

Unchecking Use Source Timecode bases the recording's time on the engine rather than the device. I recommend this because engine time is always consistent, whereas the source timecode depends on the recording device.

You will notice some information before recording, such as the Slate, Take, and Description fields (see image above). Feel free to edit these values with each new recording so the information matches your take sheets.

Notice above the description that the default frame rate is set to 24 fps. This is one of the cinema standards. You may feel compelled to use a higher frame rate so that your recording "looks good," but unlike a game engine's rendering FPS, this value is the interval at which keyframes are taken. It is independent of the engine's actual Tick/Update loop and will look the same regardless of computer speed. For a good performance/quality balance, I recommend leaving this alone and keeping animations in general at no more than 29.97 fps.
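To put the frame-rate choice in perspective, ARKit streams 52 face blendshape values per frame, so the recording rate directly scales how many keys land on each of those curves. A rough back-of-the-envelope calculation (the 30-second take length is just an example):

# Rough keyframe count for a facial take: curves * fps * seconds.
ARKIT_CURVES = 52      # ARKit face blendshapes streamed by Live Link
TAKE_SECONDS = 30      # example take length

for fps in (24, 29.97, 60):
    keys = ARKIT_CURVES * fps * TAKE_SECONDS
    print(f"{fps:>6} fps -> ~{int(keys):,} keyframes across all curves")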

To record, fill out the appropriate information. Press PLAY to enter play mode first, then press F8 so you can center the camera on your character. This is only cosmetic, so that you can view the character as you record. When you are ready, press the red record button in the Unreal editor (not on the iPad).

Afterward, the recording will be saved in your Content Drawer under Cinematics/Takes/etc.

Baking/Exporting

If you open up your recordings, you will notice there are two recordings for each take. One is the recording you will use; the other (saved in .._Subscenes/) is the raw recording from the iPad, which has a significantly larger file size because the iPad records at 60 fps by default. Use the first, smaller recording.

You will find your character in the project root directory, under Saved/PersistentDownloadDir/Avatars/(SOME ID)/(ANOTHER ID)/YourCharactersID.gltf. In the Content Drawer, I have created a folder called Characters. Create a new folder in there with your character's name and open it. Now drag the GLTF file above into that folder. When the import window appears, be sure that Import Morph Targets is checked! After importing, be sure to Save All.
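If you prefer to script the import rather than dragging the file in, the editor's Python API can queue the same GLTF import. This is a minimal sketch, assuming the Python Editor Script Plugin is enabled; the paths are placeholders to replace with your own, and automated is left off so the import dialog still appears and you can confirm Import Morph Targets is checked.

import unreal

# Placeholder paths -- replace with your character's GLTF and target folder.
gltf_path = r"C:\YourProject\Saved\PersistentDownloadDir\Avatars\SOME_ID\ANOTHER_ID\YourCharactersID.gltf"
destination = "/Game/Characters/YourCharacter"

task = unreal.AssetImportTask()
task.filename = gltf_path
task.destination_path = destination
task.automated = False   # keep the import dialog so Import Morph Targets can be verified
task.save = True         # save the imported assets after the import finishes

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])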

After import, multiple items are sometimes selected. Click once on any asset to clear the selection, then click on your character's skeletal mesh (the asset with the pink bar in the middle). Drag it into the scene.
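Dragging from the Content Drawer is the simplest route, but the placement can also be scripted. A small sketch, assuming the asset path from the import example above (newer engine versions prefer the Editor Actor subsystem, but the EditorLevelLibrary call below still works):

import unreal

# Placeholder asset path -- adjust to wherever your skeletal mesh was imported.
mesh = unreal.load_asset("/Game/Characters/YourCharacter/YourCharactersID")

# Spawn the skeletal mesh at the world origin so it is visible while recording.
unreal.EditorLevelLibrary.spawn_actor_from_object(mesh, unreal.Vector(0, 0, 0))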

Where to go from here?

I have found that, following this process, the animation baking is already done and will work for ReadyPlayer.Me characters because they share the same base mesh/rig.

If you would like to learn how to import your own character, check here:

If you would like to learn more about the recording process, check here: