All-In-Engine Virtual Production: Face, Body and Hand Motion Capture with the MetaHumans

Motion Capture Data with the MetaHumans

Gabriella Krousaniotakis / March 19th, 2021 / Blog #15

For the past week, I have been living inside the MetaHumans project that was recently released by Epic Games and created by 3Lateral.

I am using this project to stream in body, finger and facial motion data and record it inside of Unreal. My goal has always been to do everything entirely inside of Unreal and to combine all of this motion capture data. Besides Unreal, the software I am running is Xsens MVN Animate, Manus Core and Faceware Studio.

Software

Xsens MVN Animate integrates the Manus Prime II glove data through Manus Core, and then connects to Unreal using the MVN Live Link plugin, which was just recently updated for 4.26.
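For anyone curious what this looks like from the engine side, here is a minimal sketch, assuming the UE 4.26-era Live Link C++ API, that simply logs every Live Link subject currently visible to the editor. It is just a quick way to confirm that the body, finger and face streams are actually arriving; the function name is my own, and the subject names themselves come from the source software, so nothing is hard-coded here.

```cpp
#include "CoreMinimal.h"
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"

// Sketch only: list the Live Link subjects the editor can currently see,
// e.g. the MVN body/finger stream and the Faceware stream.
void LogLiveLinkSubjects()
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        UE_LOG(LogTemp, Warning, TEXT("No Live Link client available."));
        return;
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // Include disabled subjects (true), skip virtual subjects (false).
    for (const FLiveLinkSubjectKey& Subject : Client.GetSubjects(true, false))
    {
        UE_LOG(LogTemp, Log, TEXT("Live Link subject: %s"),
            *Subject.SubjectName.Name.ToString());
    }
}
```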

Faceware Studio streams its data into Unreal using the Live Client plugin by Glassbox Technologies.

With these three programs streaming into Unreal, I can not only test out all of the body, finger and facial motion data, but also record it, clean it up, edit it, add it to Sequencer and create a cinematic.

Mocap Hardware

For the body data, I am using the Link suit by Xsens. For the finger data, I am using the Manus Prime II gloves, which are compatible with Xsens. And for the face data, I am using both my webcam and my Canon 1DX Mark II with a 50mm lens, which gives me more control over lighting and lets me record at higher frame rates.

Ideally, I would like to test out the facial motion data with the Mark IV wireless camera system or the Indie Headcam built around the GoPro. The reason is that when looking at footage of faces captured with these tools, I notice that even though the head moves, the face stays perfectly centered in front of the lens.

After doing tests with the webcam and the video camera, I find that I get cleaner and more precise data when I keep my head perfectly still and only move my face.

I also want to mention that this is my absolute first time recording facial data with Faceware Studio, having only tested and dabbled with the UE Live Link Face app on my iPhone. The control I have in Faceware Studio is truly remarkable.

Computer

The other important piece of hardware I am using is the Razer Studio laptop… yes, I said it. A laptop.

This laptop is actually a little beast! When I was looking for a laptop to start learning Unreal Engine, it was suggested to me by Alex Bartz, who happened to write a blog about mobile workstations for VFX. Thank you, Alex!

When I went onto the Razer page and looked at the Studio model, I saw for the first time a filmmaker who had created an entire film in Unreal Engine using this laptop. That filmmaker is Haz Dulull, and he made the film Battlesuit with this laptop, in Unreal! I never forgot that.

The day I received the laptop, I installed Unreal Engine and have been living in this software ever since. Seeing what Haz Dulull created with this laptop changed my life.

However, the true test has been this MetaHumans vlog. I have been streaming live body, finger and face data into it, recording it, making adjustments to the blueprints, running it with ray tracing, leaving the LODs alone and even doing some movie render passes while running the live data in Sequencer. It doesn't miss a beat. I was even able to get my frame rate up to 60 fps in this project while recording!

I honestly don’t know how that happened, but it did!

Xsens MVN Live Link Tests

For the body data, since Xsens just released the new 4.26 plugin, I was able to stream in the data directly, instead of having to migrate it from a 4.25 project. Talk about perfect timing.

One of the challenges has been learning to retarget the data to these characters in the character remap asset chart, since they have five spine bones and two neck bones, while the Xsens MVN plugin prefers characters with four spine bones and one neck bone. I played around with the remap asset and got it to work beautifully!
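To make that retargeting challenge concrete, here is a rough sketch, purely as an illustration and not the official remap, of one way the MVN spine and neck segments could be spread across the MetaHuman bones. The MetaHuman bone names (pelvis, spine_01 through spine_05, neck_01, neck_02, head) are real; exactly which MVN segment drives which bone is the judgment call you make in the remap asset, so treat this particular assignment as an assumption.

```cpp
#include "CoreMinimal.h"

// Illustrative only: one possible assignment of Xsens MVN spine/neck segments
// to MetaHuman bones for a character remap. Whether spine_05 and neck_02
// simply follow their parents is my assumption, not the official mapping.
static const TMap<FName, FName> MvnToMetaHumanSpine =
{
    { TEXT("Pelvis"), TEXT("pelvis")   },
    { TEXT("L5"),     TEXT("spine_01") },
    { TEXT("L3"),     TEXT("spine_02") },
    { TEXT("T12"),    TEXT("spine_03") },
    { TEXT("T8"),     TEXT("spine_04") }, // spine_05 left to follow spine_04
    { TEXT("Neck"),   TEXT("neck_01")  }, // neck_02 left to follow neck_01
    { TEXT("Head"),   TEXT("head")     },
};
```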

I have to thank Katie Jo, for not only helping me with this process but for believing in me and giving me the hand up that I needed when first starting out with virtual production a few months ago. Katie Jo, there is a special place in my heart for you. You have moved mountains for me and brought me closer to my dreams. Thank you, for being you!

Faceware Studio and Live Client Plugin by Glassbox

Faceware Studio is quite easy to use. All you do is open the software, calibrate your face, and then make adjustments to various facial expressions in Animation Tuning. Once you calibrate, the software tracks your facial movements, which alone is pretty crazy!

It gives you the option to either capture facial motion live with your webcam or upload a video recording of your face.

I discovered the video recording option thanks to a review by Solomon Jagwe. Thank you, Solomon! That review gave me the idea to record my face with my camera so I could light my face better using softbox lights.

I got some great lights from my Creative Pinellas mentor, Victoria Jorgenson. Thank you, Victoria! I use them every day!!

I have been testing the video recording option by recording my face and mouth while saying vowels, the alphabet and full sentences; I even did "The rain in Spain falls mainly on the plain." For each of these tests I am able to save the profile settings, which is beyond useful, as some settings work better for some tests than others.

The other amazing option Faceware Studio has is a streaming panel, where you can turn on Stream to Client. This option works with the Live Client plugin for Unreal by Glassbox Technologies.

Once you are inside Unreal and have your Faceware blueprint set up with the port number and IP, the Faceware Studio data communicates with Unreal in an instant! Wow! Talk about brilliant!

This is how I am able to get the data from Faceware Studio into the MetaHumans project. You can press Play in Unreal and see your data play out in real time or, as I prefer, add a Live Link Skeletal component and see the data play out without having to be in Play mode.

Besides being able to make adjustments to the facial motion in Faceware Studio, you can also make adjustments inside the Faceware Live blueprint by adding multiply or addition floats. That is as far as I have gotten with customizing the blueprint, as it is the first facial motion blueprint I have ever set up.
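As a rough illustration of what those multiply and addition floats are doing, here is a tiny sketch, assuming a simple scale-and-offset of each incoming curve value before it drives the face; the clamp back into the 0 to 1 range is my own assumption, not something the blueprint necessarily does.

```cpp
#include "Math/UnrealMathUtility.h"

// Sketch of the adjustment described above: scale the raw Faceware curve value
// by a multiply float, shift it by an addition float, and (my assumption) clamp
// it back into the 0..1 range that blend-shape curves usually expect.
float AdjustFacialCurve(float RawValue, float Multiplier, float Offset)
{
    return FMath::Clamp(RawValue * Multiplier + Offset, 0.0f, 1.0f);
}
```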

One more thing I want to mention: besides loving Faceware Studio, the people who make up this company have been so good to me. The help, the lessons on how to use these tools and get them to work, and the support I have received have given me the encouragement to keep going. Thank you, Peter, Karen, Josh and Brian!

MetaHumans Facial Animation Recordings

These MetaHuman characters, I would say, are almost as complex as real humans. The only thing they don't do is dream… or do they?

I wanted to experiment with the curve adjustments of the facial animations I recorded, to see if I can not only make adjustments in the Faceware Studio Animation Tuning and the Faceware Live blueprints, but also add more to the expressions I recorded by playing around with the tongue curves, trying to get enunciations that I am having a hard time capturing with the facial motion data.

I figured out how to use the backwards solver, which makes adjusting the curves much easier in Sequencer. I want to spend more time with this in order to fine-tune the animations I have recorded, accentuate certain expressions and add some that are not there.

I am also finding that I am getting lost in this project. You can spend days, weeks, months, possibly years, and still not discover everything going on with these characters and what they are truly capable of.

This week I watched and listened to The Pulse: The Rise of Real-Time Digital Humans. Some very interesting comments were made, but also some interesting questions. One was: can these characters elicit emotions from an audience? My question is, can I figure out how to do that with them?

When MetaHuman Creator is released, I know that my goal is going to be to use it for my ALH84001 project about the rock from Mars that spawned the birth of astrobiology: to recreate the ANSMET team that discovered the rock, and also to capture their faces and their conversations and bring those moments back from time. I get goosebumps thinking about the endless possibilities this release is bringing.

Right now, my goal is to learn as much as I can about these MetaHumans, just as you would a real person.

I find that when I spend so many hours inside of Unreal, the line between real and unreal becomes blurry. Ever since I started working on this vlog, I have found that time literally flies, and even when I sleep, I am troubleshooting and coming up with things I want to create with them.

I just need more time to get to know them. I wonder how long that will be?
