Motion Capture Week 8 - Control Rigging and Data Solving
- Hannah Chung

- Sep 24, 2023
- 2 min read

This week I began making my Autodesk characters for my scene. I wanted a main character plus several characters for the crowd scenes. In the script the main character's name is Jordan, so that's how I'll refer to her; for the rest I just made up names.
Here are my characters in the default 'T' pose:
Jordan:

Aaron:

Derek:

Granny:

Lily:

Police Guy 1:

Police Guy 2:

Sam:

Wanda:

Then once the characters were created, I imported them into Maya and began to create control rigs by assigning their bones to the Human IK system of controllers.

We start by selecting a joint on the Autodesk character and assigning it to the corresponding bone in Human IK.

When the bone is assigned, it will turn green.

This continues for each bone until the whole body is green.

Then a control rig can be created, which can be used to control the movements of the joints.
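If you end up doing this for a lot of characters, the characterization step can also be scripted. Here's a rough Maya Python sketch of the idea; it assumes the HumanIK tool has been loaded so the hik* MEL procedures and the setCharacterObject command are available (names can vary between Maya versions), and the character name and joint names are placeholders for whatever the Autodesk character's skeleton is actually called.

```python
# Rough sketch only: HumanIK characterization from Maya Python.
# "Jordan_HIK" and the joint names below are placeholders.
from maya import mel

# Load the Character Controls so the hik* MEL procedures are sourced.
mel.eval('HIKCharacterControlsTool;')

# Create a new HumanIK character definition (assumed hik* helper proc).
character = mel.eval('hikCreateCharacter("Jordan_HIK")')

# Assign joints to definition slots; slot 1 is the Hips. The other slot
# indices come from the HIK definition (Spine, LeftUpLeg, LeftArm, ...).
# Each slot that gets a joint turns green in the definition panel.
hik_slots = {
    'Jordan_Hips': 1,
    # 'Jordan_LeftUpLeg': 2,
    # 'Jordan_Spine': 8,
    # ... and so on, bone by bone, until the whole body is green
}
for joint, slot in hik_slots.items():
    mel.eval('setCharacterObject("{0}", "{1}", {2}, 0);'.format(joint, character, slot))

# The control rig itself is then created from the Character Controls window
# (set Source to Control Rig), so it isn't scripted here.
```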

After a couple of weeks, all of our Cortex data had been cleaned by my teammates and me. We kept the files in a shared folder but worked individually from this point onward. The next step in the pipeline was to solve the data using Autodesk Motion Builder. Solving the data means taking the optical mocap data (captured by the markers) and translating it into skeleton animation, which can then be applied to the control rigs of the Autodesk characters.
First I would take a .trc file (the cleaned Cortex data) and import it into Motion Builder.

The markers come up as blue cubes.
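Importing can also be done without the dialog using MotionBuilder's Python SDK (pyfbsdk). This is only a minimal sketch with a made-up file path: it imports one .trc and lists the optical markers (the blue cubes) that show up in the scene.

```python
# Minimal pyfbsdk sketch: import a cleaned Cortex .trc and list the markers.
# The file path is a placeholder.
from pyfbsdk import FBApplication, FBSystem, FBModelMarkerOptical

app = FBApplication()
app.FileImport(r"C:/mocap/cleaned/take_01.trc")

# Each optical marker in the scene is an FBModelMarkerOptical (a blue cube).
markers = [c for c in FBSystem().Scene.Components
           if isinstance(c, FBModelMarkerOptical)]
print("Imported {} optical markers:".format(len(markers)))
for m in markers:
    print("  " + m.Name)
```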

Then I had to add an 'actor' into the scene (the purple man).

Because the actor comes in at a default size, it needs to be adjusted (scaled, translated, rotated) to better fit the marker data that was captured from Jasmine's body.
Original:

Edited:

After adjusting the actor as best I could to sit between the marker points (the blue cubes), I created a marker set for the actor's body. The marker set creates a cell for each key point on the body. Connecting the marker data to these cells (a tedious process) lets the actor's body parts be driven by the data, so the actor moves along with it.
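In theory the marker-to-cell mapping can be scripted with pyfbsdk too, which would take some of the tedium out of it. Treat the following as a sketch of the idea rather than tested pipeline code: it assumes FBActor, FBMarkerSet and FBMarkerSet.AddMarker behave as shown (worth checking against the MotionBuilder SDK docs), and the marker names are placeholders for whatever the Cortex data actually calls them.

```python
# Sketch only (assumed pyfbsdk usage): build a marker set for the actor and
# plug the optical markers into the body-part cells.
from pyfbsdk import (FBActor, FBMarkerSet, FBSkeletonNodeId,
                     FBFindModelByLabelName)

actor = FBActor("Jasmine_Actor")

# One cell (body part) can hold several markers; names here are placeholders.
marker_map = {
    FBSkeletonNodeId.kFBSkeletonHipsIndex: ["WaistLF", "WaistRF", "WaistLB", "WaistRB"],
    FBSkeletonNodeId.kFBSkeletonHeadIndex: ["HeadTop", "HeadFront"],
    # ... every other body part follows the same pattern
}

marker_set = FBMarkerSet("Jasmine_MarkerSet")
for node_id, marker_names in marker_map.items():
    for name in marker_names:
        model = FBFindModelByLabelName(name)
        if model:
            marker_set.AddMarker(node_id, model)  # connect marker data to the cell

actor.MarkerSet = marker_set
```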

Once the connections have been made, the data has to be activated; then, when I press play, the actor is driven by the movement of the markers, and I save the file. This process was repeated for all 25 files we recorded in the Motion Capture Studio and took a WHILE.
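Since the by-hand part already takes long enough, the file shuffling at least can be batched. A hypothetical helper along these lines (folder paths made up) would open each cleaned .trc and save out a working file, leaving just the actor fitting, marker mapping and activation to do manually in each one.

```python
# Hypothetical batch helper: import every cleaned .trc and save a working
# .fbx per take. The actor setup itself is still done by hand in each file.
import os
from pyfbsdk import FBApplication

TAKES_DIR = r"C:/mocap/cleaned"   # placeholder: folder of cleaned .trc files
OUT_DIR = r"C:/mocap/solved"      # placeholder: where the working files go

app = FBApplication()
for name in sorted(os.listdir(TAKES_DIR)):
    if not name.lower().endswith(".trc"):
        continue
    app.FileNew()                                   # start from an empty scene
    app.FileImport(os.path.join(TAKES_DIR, name))   # marker data -> blue cubes
    out_path = os.path.join(OUT_DIR, os.path.splitext(name)[0] + ".fbx")
    app.FileSave(out_path)                          # save for the manual pass
```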