Unreal Engine Facial Motion Capture
Gabriella Krousaniotakis / May 7th, 2021 / Blog #23
In this week's blog, I finished the tutorial I created for GlassboxTech and FacewareTech on how to use the updated motion logic blueprint for the Metahumans.
Motion Logic
I have tested the facial motion data from Live Link Face on the new Metahumans in Unreal Engine and found the raw data to be a little too linear: expressions snap to their extremes, which looks drastic and unnatural. The blueprint created by Glassbox and Faceware remaps that data through curves, so the response is non-linear and the result looks much more natural.
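The actual Glassbox/Faceware logic lives in a Blueprint graph, but the core idea can be sketched in a few lines of Unreal C++: instead of driving a blendshape directly with the raw value, you evaluate it through a curve asset. This is a minimal, hypothetical sketch (the function name and curve are mine, not from the tutorial):

```cpp
// Illustrative only -- the real motion logic is a Blueprint, not C++.
#include "Curves/CurveFloat.h"
#include "Math/UnrealMathUtility.h"

float RemapBlendshape(float RawValue, const UCurveFloat* RemapCurve)
{
    // Raw Live Link blendshape values arrive roughly linearly on [0, 1].
    const float Clamped = FMath::Clamp(RawValue, 0.0f, 1.0f);

    // Evaluating through a curve asset lets an animator shape the response:
    // an ease-in curve, for example, keeps small expressions subtle while
    // still letting the pose reach full strength at the top of the range.
    return RemapCurve ? RemapCurve->GetFloatValue(Clamped) : Clamped;
}
```

The design point is that the curve is an asset an animator can tune per expression, rather than a hard-coded formula.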
Mars and Antarctica Project Development
I had a lot of fun creating this tutorial and am now using it for my Mars/Antarctica film, which I am making entirely in Unreal Engine by combining body, finger, and facial motion capture to drive the Metahuman characters. I am customizing the Metahuman characters to look like myself and one other person.
Essentially, I am developing a seamless pipeline: I am learning how to replace the Metahuman body with my own character body, which in this case is a space suit for the Mars scene.
I am removing the spacesuit's head and replacing it with a Metahuman head. I then record body data onto the spacesuit skeleton and face data onto the Metahuman head, and combine the two inside Sequencer.
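In practice this head swap is set up in the editor, but for readers who think in code, here is a rough C++ sketch of the attach step, assuming the spacesuit skeleton exposes a neck socket (the socket name and function are hypothetical):

```cpp
// Hypothetical sketch -- the real setup is done in the editor, not C++.
#include "Components/SkeletalMeshComponent.h"

void AttachMetahumanHead(USkeletalMeshComponent* SuitBody,
                         USkeletalMeshComponent* MetahumanHead)
{
    // Snap the head to the suit's neck socket so the recorded body
    // animation carries the head along, while the head's own skeleton
    // stays free to play back the recorded facial animation.
    MetahumanHead->AttachToComponent(
        SuitBody,
        FAttachmentTransformRules::SnapToTargetNotIncludingScale,
        TEXT("neck_socket"));
}
```

With the head parented this way, the body take and the face take can be layered as separate tracks in Sequencer.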
I am collaborating with Bernhard Rieder, aka FattyBull, who is handling the cinematography and directing, while I handle the producing, motion capture, and Metahuman troubleshooting.
Others helping me include Ega Dyas, who created the current space suit I am using for my proof of concept; Pierpaolo, aka Unreal_Environments, who created the Mars map; and Saint, aka PixerUrge, who is helping me customize the space suit.