Learning Facial Motion
Blog #15
I have been learning about facial motion capture for the past week. Thanks to the release of the MetaHumans project by Epic Games and 3Lateral, I now have access to a full-body rig with built-in blendshapes.
I will have a vlog and demo video showcasing not only my progress, but also how to set up Faceware Studio with Unreal Engine using the Live Client plugin by Glassbox Technologies.
I am also bringing body and finger data onto these characters by streaming from MVN Animate, which integrates the finger data from Manus Core, using the MVN Live Link plugin that was just updated for Unreal Engine 4.26.
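To make that data flow a little more concrete: MVN Animate merges the Manus finger segments into the Xsens body skeleton, so a single combined skeleton streams into Unreal over Live Link. The Python sketch below is only a conceptual stand-in for that merge step, with made-up segment names and positions; it is not the plugins' actual API.

```python
# Conceptual illustration of merging a body stream and a finger stream into
# one skeleton per frame, which is roughly what happens before the combined
# data reaches the engine. Segment names and positions are placeholders.

from typing import Dict, Tuple

Position = Tuple[float, float, float]  # positions only, to keep the sketch small

def merge_frame(body: Dict[str, Position], fingers: Dict[str, Position]) -> Dict[str, Position]:
    """Combine body segments and finger segments into a single pose.

    Finger segments are appended to the body skeleton; on a name collision
    the finger data wins, since it is the more detailed source here.
    """
    combined = dict(body)
    combined.update(fingers)
    return combined

if __name__ == "__main__":
    body_frame = {"Pelvis": (0.0, 0.0, 92.0), "RightHand": (35.0, -20.0, 110.0)}
    finger_frame = {"RightIndexProximal": (37.0, -21.0, 111.0)}
    print(merge_frame(body_frame, finger_frame))
```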
Incorporating Body, Finger and Facial Motion Data Inside of Unreal
I am able to stream live body, finger, and facial mocap data into this project, where I can not only make adjustments to the data but also record it and create a cinematic sequence.
This has been an amazing learning experience, and I am very excited to share it in a video and vlog. I will be showing the step-by-step process of setting up the Faceware Studio blueprints inside Unreal, as well as how to set up the T-pose and remap asset for the Xsens and Manus mocap data (there is a rough sketch of the remap idea at the end of this post).
In addition, I will be showing how to record the data and then add it to a sequence.
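Until that video is ready, here is a rough picture of what the remap asset mentioned above is doing: the Xsens data arrives over Live Link with its own bone names, and the remap asset translates each incoming name into the name the character's skeleton expects, so the pose drives the right bones. The sketch below is only a conceptual illustration of that lookup, written as a small Python script; the bone names are placeholders, and this is not the plugin's actual API or the Blueprint you build in Unreal.

```python
# Conceptual sketch of what a remap asset does: translate the incoming
# mocap skeleton's bone names into the target skeleton's bone names.
# Every name below is an illustrative placeholder, not a guaranteed
# Xsens or MetaHuman bone name.

XSENS_TO_UE_BONES = {
    "Pelvis": "pelvis",
    "Neck": "neck_01",
    "Head": "head",
    "RightUpperArm": "upperarm_r",
    "RightForeArm": "lowerarm_r",
    "RightHand": "hand_r",
    "LeftUpperArm": "upperarm_l",
    "LeftForeArm": "lowerarm_l",
    "LeftHand": "hand_l",
}

def remap_bone_name(source_bone: str) -> str:
    """Return the target-skeleton name for an incoming mocap bone.

    Bones without an entry pass through unchanged, mirroring the usual
    default of keeping the original name when no remap exists.
    """
    return XSENS_TO_UE_BONES.get(source_bone, source_bone)

if __name__ == "__main__":
    # A fake incoming frame: bone name -> position (cm), just for illustration.
    incoming_pose = {"Pelvis": (0.0, 0.0, 92.0), "RightHand": (35.0, -20.0, 110.0)}
    remapped = {remap_bone_name(bone): xform for bone, xform in incoming_pose.items()}
    print(remapped)  # {'pelvis': (0.0, 0.0, 92.0), 'hand_r': (35.0, -20.0, 110.0)}
```

Inside Unreal the same idea lives in the remap asset you create for the Live Link data: each incoming bone name gets looked up and forwarded to the matching bone on the character, and anything unmatched simply passes through.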