All In Engine Virtual Production Facial Motion

Faceware Glassbox Metahumans Xsens Awinda

Blog#22
This week, I spent time working through the tutorial for the newly updated Faceware and Glassbox Metahumans motion logic blueprint. I got the chance to record my face, run that video through Faceware Studio, and then watch it play out with the Glassbox blueprint on one of several Metahuman characters I had imported from Quixel Bridge inside of Unreal Engine.
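For anyone scripting part of this setup, here is a minimal sketch in Unreal's editor Python of spawning a Metahuman blueprint (imported through Quixel Bridge) into the level. The asset path is a hypothetical placeholder, and the Faceware-to-Glassbox facial streaming itself is configured in Faceware Studio and the Glassbox Live Client plugin, not in this script.

```python
# Minimal sketch: spawn a Metahuman blueprint (imported via Quixel Bridge)
# into the current level using Unreal's editor Python API.
# The asset path below is a hypothetical placeholder -- point it at the
# Metahuman you actually imported. The Faceware/Glassbox facial stream is
# wired up separately through the Glassbox Live Client plugin.
import unreal

# Hypothetical path; Bridge typically places Metahumans under /Game/MetaHumans/<Name>/
METAHUMAN_BP = "/Game/MetaHumans/Ada/BP_Ada"

bp_asset = unreal.EditorAssetLibrary.load_asset(METAHUMAN_BP)
if bp_asset is None:
    raise RuntimeError(f"Could not load Metahuman blueprint at {METAHUMAN_BP}")

# Place the character at the world origin with a default rotation.
actor = unreal.EditorLevelLibrary.spawn_actor_from_object(
    bp_asset,
    unreal.Vector(0.0, 0.0, 0.0),
    unreal.Rotator(0.0, 0.0, 0.0),
)
unreal.log(f"Spawned Metahuman actor: {actor.get_name()}")
```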

Facial Motion

I have been waiting so long to be able to combine the body and finger data from Xsens and Manus with the facial motion data from Faceware and Glassbox on the Metahumans. Finally, I have been able to combine all three. And all I can say is that the creative options are unlimited!

I have now begun working on my Mars/Antarctica project. I am going to use the Metahumans with all of this motion capture data to create the first few pages of the screenplay I wrote and make the intro trailer.

The main challenge is going to be attaching the Metahuman head to a different rig so that I can change the outfits. I am in the process of having some custom rigs built exactly for this purpose.

Again, with the addition of facial motion, my characters can now talk and make expressions, which adds to the storytelling and cinematics.
