Experimenting with Mocap Data in Unreal Engine, the Dragonfly Virtual Camera, and Remote Multi-User Testing
Gabriella Krousaniotakis / December 17th, 2020 / Blog #6
Working with Multiple Characters and Animations in Unreal
I created this nightclub scene entirely in Unreal Engine. I recorded all of the character animations with the Xsens Link suit and Manus Prime II gloves, streamed into Unreal through the MVN Live Link plugin.
I learned a lot about rigging and retargeting data, particularly with the fingers. Setting up multiple character animations in Sequencer was a little tedious, and of the nine characters I originally started with, I ended up not needing all of them. But having so much creative freedom thanks to all of this new mocap tech is just Unreal!!!
The most important lesson I learned, though, is that one person can create a little film with all of this amazing technology. It was just me in my home office, suit and gloves on, having fun in Unreal.
Granted, I have been using Unreal for just shy of four months, but this virtual production community is the reason I was able to learn how to use all of these tools. I still have so much to learn and realize I am just at the beginning of this journey, but I have never had more fun in my life, so I can only imagine what excitement lies ahead.
I have to give thanks to Winbush and the guys at Mograph.com for making that Unreal Engine course (I even learned Cinema 4D from it), to JSFilmz for his awesome tutorials and encouragement, to Manus for giving me my first taste of motion capture equipment, and to the lovely ladies at Xsens for opening a door to a world I never thought I would fall madly in love with: motion capture, that is!
My journey with face mocap is still progressing, as I am discovering it is a challenge to find a rig with face morphs/blendshapes, at least when it is just you, funding yourself. When I do get a rig I can use to test the Faceware Studio software with the Glassbox Live Client plugin to Unreal, with the Xsens and Manus body/finger data streaming simultaneously, I will be beside myself!!! I cannot wait!
I have started learning Maya, as my impatience with finding a rig has led me to try creating my own blendshapes.
Glassbox Dragonfly Virtual Camera
I tested the mocap data I recorded with the Xsens Link suit and Manus Prime II gloves (streamed into Unreal Engine through the MVN Live Link plugin) together with the Glassbox Dragonfly virtual camera. All I can say is I did not realize how much fun this would be!
Testing out this virtual camera tool was just beyond exciting. Dragonfly offers so many options: you can choose your lens, sensor, and focal length, and it even has an option to smooth out the camera movement after you have recorded your data.
I used my iPhone with the Dragonfly app, created joystick buttons on the phone through the app, and added the plugin folder to my Unreal project. Thanks to the awesome tutorials offered by Glassbox, I was up and running within minutes! The plugin is set up in such a way that all you have to do is activate the link from the Unreal project to your phone, and then you can control the Dragonfly camera from your phone: press record and move around the scene. Pretty amazing.
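For anyone curious what the project side of that setup looks like, Unreal enables plugins through the project's .uproject file (the Dragonfly plugin folder itself goes into the project's Plugins directory). This is only a rough sketch: "LiveLink" is Unreal's actual built-in plugin name, but the Dragonfly and MVN entry names below are my placeholders, not necessarily the exact identifiers Glassbox and Xsens ship.

```json
{
  "FileVersion": 3,
  "EngineAssociation": "4.26",
  "Plugins": [
    { "Name": "LiveLink", "Enabled": true },
    { "Name": "MvnLiveLink", "Enabled": true },
    { "Name": "Dragonfly", "Enabled": true }
  ]
}
```

Once the plugin folder is in place and the entries are enabled, the editor picks everything up on the next launch.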
I can’t wait to test this out in a remote multi-user session with the Xsens Live Link!!!
Unreal Engine & Mocap Demo using Xsens and Manus
Still working on getting the face mocap into this mix!
I find all of this technology so incredible. What is even more incredible is the people who make up this motion capture community. Never have I been shown such kindness and support as from the people at Xsens (Katie Jo & Audrey Stevens), Manus (Serdal & Arsene), Faceware (Karen Chan, Josh & Peter), and Glassbox (Norman), who have all been there to help me every step of the way. Thank you! It has taken an army of people to get me to this point in my journey.
Here is a little glimpse of what the workflow looks like when combining the body mocap data from Xsens MVN Animate with Manus Core inside of Unreal Engine. This was my first time filming myself and doing a screen recording.
Unreal Engine & Remote Multi-User with Live Mocap Data
I wish I had more to show for this, as this test was so exciting. I have to thank Kevin Cooney for reaching out to me and introducing the idea of testing out a remote multi-user session using the mocap data from Xsens MVN. He introduced me to Aiden Wilson, who I believe is an absolute genius!
Both of these guys are true gems and doing incredible things with virtual production! Kevin Cooney is based out of the UK and Aiden Wilson is all the way in Australia.
One of the challenges for this was finding a time that worked for all of us: I am based in Florida, Kevin is five hours ahead, and Aiden is fifteen hours ahead. My first multi-user session was just Aiden and me. Being in the same project with someone else, both of us doing different tasks simultaneously, was unreal!
The MVN Live Link worked, and we did some testing: we were both in the same project, streaming live mocap data from MVN Animate into Unreal using the Live Link plugin. Here is a quick glimpse into this project.
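Since Multi-User Editing (Concert) communicates over Unreal's UDP messaging layer, a remote session like this typically needs the messaging endpoints configured on each machine. As a rough sketch, assuming a setup where everyone reaches the multi-user server over a VPN (the IP address and port below are placeholders, and exact settings can vary by engine version), the DefaultEngine.ini entries look something like:

```ini
; Enable UDP messaging so Multi-User (Concert) can discover peers
[/Script/UdpMessaging.UdpMessagingSettings]
EnabledByDefault=True
; Listen on all local interfaces; port 0 lets the OS pick one
UnicastEndpoint=0.0.0.0:0
; Machines on different networks cannot rely on multicast discovery,
; so point each client at the server's address explicitly (placeholder IP)
+StaticEndpoints=203.0.113.10:6666
```

The server machine would pin its UnicastEndpoint to a fixed port so that the clients' static endpoint entries know where to find it.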
Creative Pinellas Mentorship
I really cannot say enough about my mentor, Victoria Jorgensen. Wow! What a lady! This week Victoria introduced me to Gail Evenari. Gail has created educational content that uses virtual reality and a 360 camera to immerse her viewers/students in the world of her subject. I have never seen anything like it before, and it is truly brilliant!!!
As for my Antarctica/Mars project, which I am preparing to create inside Unreal Engine and which is based on a meteorite that was discovered in Antarctica and is known to be from Mars: Victoria is putting me in touch with a historian/professor and traveler whose interests lie in the polar regions. Anyone who knows me knows that going to Antarctica to film the first five pages of my screenplay has always been a dream of mine.
However, with the way the world is today, my only option is to do this with Unreal Engine. I would do so by creating topographical maps of the region where the story takes place, building the characters (the ANSMET team that discovered the meteorite), and driving them with motion capture data I record with the Xsens suit and Manus gloves, and hopefully with Faceware driving the characters' facial expressions.
When I got the message from Victoria about the polar historian, I will admit I shed some tears, as this project is so important to me. Knowing that someone is helping me turn this dream into a reality touched me deeply. Thank you, Victoria. No words can describe my appreciation.