Using Unreal Engine and Motion Capture for Self-Expression

In the past four months, I have been learning how to use Unreal Engine in order to make films. Ever since I entered this new world, I have changed. This experience has allowed me to reinvent myself.

I have had the opportunity to learn about the 3D world. I have learned so much new software, tested out cutting-edge technology, and met some incredible people. The piece I am sharing in this blog represents my journey with virtual production and Unreal Engine.

The Process

When I was creating this piece, I approached it very differently than my other works. This particular one is special. When I was recording the motion capture data with the Xsens suit and the Manus gloves, I was not thinking about anything in particular. I was just having fun and exploring this new technology with fresh eyes.

In the back of my mind, I kept thinking about waking up in a strange new world without knowing who I am or what I am doing here. As an artist, I believe what we create is an expression of ourselves and our inner psyche. Without realizing it, I created this piece that represents how I feel in this new world I am in with Unreal Engine and all the things I am learning that fall under the umbrella of virtual production.

Lately, I have been thinking about Antarctica. My goal in life is to go there one day and make a film about a very special rock that was discovered there, and that came all the way from Mars. Hence, the environment I chose to create was influenced by my inner thoughts of this world I find so mystical.

The Character

When searching for a model to use for this project, this particular one stood out. Ever since I started learning about motion capture, finding well-made models that speak to me has not been easy. But with this one, I just knew. As soon as I added the mocap data to her, I noticed things about her. For instance, her body jiggles when she walks. Almost human-like. And even though her face does not have blendshapes, there is so much expression in her body movements and hands that it compensates for her still face.

This model was created by Michael Weisheim.

The Environment

In order to create a dynamic environment with snow, I looked for this feature and found that Everett Gunther had created an interactive weather asset where rain and snow collect on certain materials. For the snow to collect on an object, I had to add blueprint logic to the material it was to attach to.

The blue ice I used came from an asset I bought when I was thinking about my Antarctica project a while ago. I created a terrain map of my neighborhood thanks to a tutorial by Jonathan Winbush. I swear, he makes the best and most creative tutorials! To add texture to the terrain, I played around with several materials and had to get the tiling just right so that it did not look artificial.

Then I painted the environment with mountains, glaciers, and rocks to give it an alien but familiar landscape appearance. I brought in several Polar/Arctic projects I had been collecting for the Antarctica project and used assets from a number of them.

The story was slowly unfolding and was actually shaped by the environment and mocap animations. At first, I had chosen for the mirror door to be in a wall, but after seeing some ancient statues on Quixel Bridge and finding a marble material that would work perfectly with the dynamic snow feature, I chose to go with the statue as the location for the doorway. This entire process was improvised.

Unreal 4.25 & 4.26

I should also mention that I had to create this project twice: once in 4.25 and then in 4.26.

The reason being, the dynamic weather snow only works in 4.26.

I also wanted to test out the raytracing update. In the 4.25 project I was able to use the Xsens MVN Animate Live Link to record my mocap data onto the character, and then migrated the animations to the 4.26 project.

I also wanted to utilize the Glassbox Dragonfly virtual camera for some of the panoramic shots so this was done in 4.25.

Both Dragonfly and MVN Live Link only work in 4.25 for now, so I can't wait for these plugins to support the newer version of Unreal.

Cameras

I spent a considerable amount of time learning Dragonfly, the virtual camera created by Glassbox. I loved that I was able to create custom cameras and sensors; Unreal has a nice variety on its own, but Dragonfly takes things to a whole new level.

Besides using the custom virtual camera that Dragonfly offers, controlled by my iPhone, I learned a lot about the various lens options that are built into Unreal. I decided to capture the same animation from several angles: wide shots, close-ups, medium shots, some with lens blur and others with a wide aperture.

It was the closest thing to making an actual film, but without the worry of driving a crew crazy with camera angles and locations that would be impossible had it been filmed in Antarctica, where the weather and lighting constantly change with the winds and time of day.

I was able to control the lighting through the dynamic sky blueprint, adjusting the amount of clouds and even the time of day, and keyframing some of these settings to create effects that would be impossible in real life. For example, the aurora borealis at the end would not be possible without this dynamic sky feature in Unreal. I was also able to place the moon exactly where I wanted and control its size.

The Story

The story unfolded on its own. This entire process was so enjoyable that I did not rush it. In all, it took me less than a week to create, but I spent about 6 to 8 hours a day on it. Each hour brought a new obstacle I had to overcome. And I did this carefully, sweating every decision, not sure what the end result would look like but trusting in the process.

Once I thought I had the set complete, a new set of problems would arise. The lighting would change the appearance of the landscape, and I would have to move a specific animation's set to a different location, or create a new one. That is how the statue location came to be: by trial and error.

Eventually, I would create a location based on the animation, and record the camera shots I wanted. As I crossed out each animation, the story was slowly starting to unfold.

Then I got to the mirror door. I had spent some time playing around with various ways to make a mirror material; however, when I would record the sequence, it just didn't work. I found a tutorial that went over various ways to achieve this, skipping the planar reflection option since the guy in the tutorial mentioned it would be expensive. In the end, I decided to just go for it.

I thought I was going to have a heart attack when I made certain setting changes and restarted the project in order to activate the planar reflections. It took 30 minutes for the project to load, and since I was so close to finishing all my sequence recordings, I thought I might have lost the project. Then, when all the shaders compiled and I activated the planar reflection… it was magic!

I was not sure how to end this, but I knew I wanted the aurora at the end, so again, this little short kind of created itself. It took on a life of its own. The animations I had left over fit very well with the ending, so it all came together.

Editing and Music Selection

I did not realize how many renders I had until I imported all of these sequences into Premiere. I organized each scene into a sequence and then created the master sequence with all of them. Only then did I realize it was 8 minutes long. It almost hurt to cut each shot as I realized I had gotten too close to the material. But I did manage to bring it down to 6 minutes.

I wish I had made it a little longer with more panoramic shots using Dragonfly. I also realized when cutting this that some scenes needed a transition. For example, when she enters the mirror, the other robot that has been frozen turns and looks. This, again, fell into place on its own. Sometimes I wonder if Unreal has a mind of its own. It gives you a tremendous amount of control, but for some reason, this story made itself. I just built the set and controlled the cameras, lighting, and animations; the rest was created on its own.

When choosing the music, I knew it had to tell a part of the story that the images could not, so I chose to go with piano, as I have always been affected by the sound of piano in a way I cannot explain. I played around with several tracks and could not seem to find one that was 6 minutes long. I found two tracks by the same artist and overlaid them. Funny thing is, even the music fell into place on its own. When one track ended and the next began, the scene was transitioning at that very moment. All the keys fell on the right moments. I cannot explain this.

All in all, I would say this experience was the closest thing to creating art without knowing it. When I sat back and looked at the final cut, I was a little shocked at what I had created, as I had never imagined the final piece in my mind. It was like watching it for the first time. Then it hit me: this was my journey, this was how I felt in this new world I was in. Discovering myself for the first time and finding my reflection, finding myself.