MoCap W6.5 '25: Semester Break
- Hannah Chung

- Apr 30
Semester break was busy. I imported the Jasmine MetaHuman into Maya and then began working on the facial data in Faceware Analyzer and Retargeter.
First things first, I imported the MetaHuman from Quixel Bridge into Maya and applied a rig plugin, provided by the lecturers, to her body.

Next, I imported the facial control panel label plane (again provided to us) into the scene.

Then I went through the utter joy of going into Unreal Engine: importing the MetaHuman, running different plugins to see her hair, exporting the hair and eyebrows from the Content Drawer at the highest LOD, and importing those .fbx files back into Maya.
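If you'd rather script that last step than go through the File menu, the FBX round trip can be done from Maya's Script Editor. A minimal sketch, which only runs inside Maya's Python interpreter; the file path is just a placeholder:

```python
# Runs in Maya's Script Editor (Python tab), not a standalone interpreter.
import maya.cmds as cmds

# Make sure the FBX plugin is loaded before importing.
cmds.loadPlugin("fbxmaya", quiet=True)

# Hypothetical path to a groom exported from Unreal's Content Drawer.
cmds.file("D:/metahuman/jasmine_brows_LOD0.fbx", i=True, type="FBX")
```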

Initially, an 'Oscar the Grouch' Jasmine appeared. So beautiful. I had to apply the following nodes to the Lambert shader to define her eyebrows as more than grey planes.
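In node terms, the fix amounts to wiring the brow texture's colour and alpha into the Lambert. A sketch of that wiring from Maya's Script Editor (Maya-only; the node names and texture path are hypothetical):

```python
import maya.cmds as cmds  # only available inside Maya

# Create a file texture node and point it at the brow texture (placeholder path).
tex = cmds.shadingNode("file", asTexture=True)
cmds.setAttr(tex + ".fileTextureName",
             "D:/metahuman/T_brows_BaseColor.png", type="string")

# Drive the Lambert's colour from the texture so the cards aren't flat grey...
shader = cmds.shadingNode("lambert", asShader=True, name="browsLambert")
cmds.connectAttr(tex + ".outColor", shader + ".color")

# ...and drive transparency from the texture so only the hairs show.
cmds.connectAttr(tex + ".outTransparency", shader + ".transparency")
```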

The result:

Oh great! She's got glow worm brows now!! Fortunately, with some more node adjustments, the glow worms were no more.

The result:

She looks bonita. Then I used a wrap deformer to attach the brows to the skin mesh, meaning the brows move in unison with the face.
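The wrap can also be created by selection order in the Script Editor: meshes to be deformed first, influence mesh last, same as Deform > Wrap. A sketch with hypothetical mesh names, runnable only inside Maya:

```python
import maya.cmds as cmds  # only available inside Maya

# Select the meshes to be deformed (the brow cards) first,
# then the influence mesh (the face skin) last.
cmds.select(["brows_geo", "head_geo"])

# CreateWrap is Maya's runtime command for Deform > Wrap on the selection.
cmds.CreateWrap()
```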

With her brows now working as they should, it was time to add the hair by following the same process.

AHH JUMPSCARE! Dark Nun Jasmine has joined the party. I hid the extra mesh piece and applied the texture to her hair.



I tried to add a wrap deformer to the hair like I did with the brows, but any movement of her controllers warped the hair in a less than ideal way.


MetaHuman Jasmine was now aesthetically complete (minus some clothes)!
Now it was time for Analyzer... I used the same process we went over in Week 4's practice. I started by keying the brow ROM footage, then the eye ROMs; after training and tracking, I exported the training models to apply to the actual performance footage.

As a little extracurricular, I also did the mouth tracking.
Whether I use the mocap data or just keyframe the mouth poses remains to be seen, depending on how much time I have in future weeks.
Because there wasn't a mouth ROM file, I tracked the mouth directly on the performance footage file.

This is the result of my eyes and brow tracking:
And then the result of my mouth tracking - notably so much easier than last year's:
The final stretch!! Retargeting time!! My setup in Maya consisted of the Facecam footage (which I had to adjust manually because it imported completely askew) on the right, and the control panel and the performance footage video together on the left.

The workflow in Retargeter was again different from last year's because of the ROMs. I worked on the ROM files and keyed the same frames as I had in Analyzer. With all the keys set up, I tested the retargeting accuracy; wherever errors occurred in the tracking, I went back and keyed additional frames to fix them.
First was key posing the eyebrows.

Then more excitingly, the eyes:


I worked on one eye at a time so Jasmine sadly spent a lot of time looking like this lol:

Eye ROMs Retargeted:
When I was happy with the retargeting of the ROMs, I selected the key poses and added them to the character's shared pose library. Then I opened the performance footage file and added the shared poses for the system to retarget from. Some real smart cookie stuff right there.
When (expectedly) some issues with the accuracy of the retarget appeared, I added a couple of key poses to strengthen the tracking, and the system drew from both the shared poses and the new key poses.
The result was a glorious thing, my retargeted performance footage:
There is some jitter in the irises, but this looks good enough to me for now. And FYI, no, I did not do the mouth. I was too tired at this point. But that's okay; it was fun to do in Analyzer regardless. That concludes my semester break MoCap work! Yippee, time to sleep.











