Quickstart / Official Docs
Official Docs: Stream Skeleton into Unreal
Note – after recording you can’t see the result straight away. You need to Reprocess the data, which turns the 2D data into skeletons – hit the small button that looks like two dots joined by a wiggly line.
Choose these settings (see below)
To play the take back and stream it to Unreal, choose Play with Real-Time Output.
Qualisys has a Live Link plugin here – it’s easiest to add it from the Marketplace.
Here’s a video of how to set up the Qualisys Retargeting Asset – it works fine in Unreal 5 too.
In Unreal the bone naming convention is MOST IMPORTANT! You will make a Qualisys Retargeting Asset. If bones aren’t weighted to anything in your 3D weight-painting software (e.g. Akeytsu) they are greyed out in Unreal (which is fine). You don’t need to hook them up in the Qualisys Retargeting Asset, but if you do, it’s easier when viewing skeletons for debugging.
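Conceptually, the Retargeting Asset is a mapping from QTM skeleton bone names to Unreal bone names, which is why the naming convention matters so much. A minimal sketch of that idea – the bone names below are illustrative placeholders, not the exact QTM or Unreal names:

```python
# Sketch of the kind of bone-name mapping a retargeting asset encodes.
# These names are placeholders, not the real QTM/Unreal conventions.
QTM_TO_UNREAL = {
    "Hips": "pelvis",
    "Spine": "spine_01",
    "LeftUpLeg": "thigh_l",
    "RightUpLeg": "thigh_r",
    "Head": "head",
}

def map_bone(qtm_name):
    """Return the Unreal bone driven by a QTM bone, or None if unmapped
    (unmapped bones are like the greyed-out, unweighted ones above)."""
    return QTM_TO_UNREAL.get(qtm_name)
```

An unmapped bone simply isn’t driven by the mocap data, which is why leaving unweighted bones unhooked works fine.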
It seems like the plugin can do real-time retargeting too.
Above are two characters: one is human-sized but has short legs and a longer body.
The second is 4 m tall with wide-apart legs. The tall one was imported and set to use the skeleton of the shorter one.
Both are using the same Qualisys Live Link retargeting asset.
The short one is an Anim Blueprint, the second is a Blueprint Character.
Both are driven by Ruth’s motion capture streaming from QTM, and both are positioned at the origin. (I had to turn off collision because they were bonking into each other.) The retargeting seems to work proportionally – for a step forward from Ruth, the little guy takes a short step and the tall guy a big step. Neither has a stretched mesh (there is a little foot sliding, however).
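One way to picture that proportional behaviour is scaling the performer’s root translation by the ratio of character to performer leg length. This is a simplified guess at the idea, not Qualisys’s actual retargeting algorithm:

```python
def retarget_step(performer_delta, performer_leg_len, character_leg_len):
    """Scale the performer's root translation (x, y, z) by the
    leg-length ratio, so a short character takes proportionally
    shorter steps and a tall one longer steps, without mesh stretch."""
    scale = character_leg_len / performer_leg_len
    return tuple(d * scale for d in performer_delta)
```

So for a 0.6 m step from a performer with 0.9 m legs, a character with 1.8 m legs would translate 1.2 m – the step direction is preserved, only the distance scales.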
When the characters come in from Akeytsu they normally need to have T0 As Ref Pose enabled:
AND Import Rotation set to -90
I also moved the physics bodies around so that they’re in roughly the correct place. You can also select the physics bodies and choose Regenerate Bodies.
(It seems like character bounds are set, but the position of the physics bodies controls where the character gets clipped by the frustum, which causes them to disappear even though they should be in view. When the characters are imported the physics bodies get set in the wrong place, which causes this problem.)
Mocap Cleaning in Qualisys
Look in the Labelled Trajectories window for the Fill level. A percentage below 100% indicates gaps.
Also open the Trajectory Editor window
For Swapping – where one marker gets confused for another.
For Gap Filling – for missing parts in trajectories.
Choose a Fill Type from the drop-down – either Polynomial or Linear is good. Then hit F to Fill.
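Under the hood, gap filling amounts to interpolating across the missing frames of a trajectory. A minimal pure-Python sketch of the Linear case (QTM’s Polynomial fill is more sophisticated), plus the Fill-level percentage shown in the Labelled Trajectories window:

```python
import math

def fill_level(traj):
    """Percentage of frames with data, like QTM's Fill level column."""
    valid = sum(1 for v in traj if not math.isnan(v))
    return 100.0 * valid / len(traj)

def fill_gaps_linear(traj):
    """Linearly interpolate across NaN gaps in one coordinate of a
    trajectory. Gaps at either end have no anchor and are left alone."""
    out = list(traj)
    i = 0
    while i < len(out):
        if math.isnan(out[i]):
            start = i - 1                    # last valid frame before gap
            j = i
            while j < len(out) and math.isnan(out[j]):
                j += 1                       # j = first valid frame after gap
            if start >= 0 and j < len(out):  # gap bounded on both sides
                span = j - start
                for k in range(i, j):
                    t = (k - start) / span
                    out[k] = out[start] * (1 - t) + out[j] * t
            i = j
        else:
            i += 1
    return out
```

A real marker trajectory has x, y and z; you would fill each coordinate separately.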
For Smoothing – look for red underlined sections in the Trajectory Editor. Select a section of the trajectory, use the drop-down to choose Butterworth (good for high-frequency noise), then hit S to Smooth.
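QTM’s Butterworth filter is built in, but the idea – a low-pass filter that removes high-frequency jitter while keeping the motion – can be sketched as a standard second-order Butterworth biquad. This illustrates the concept only, not Qualisys’s implementation (an offline tool would typically also run the filter backward to cancel phase delay):

```python
import math

def butterworth_lowpass(signal, cutoff_hz, sample_rate_hz):
    """Second-order Butterworth low-pass biquad (bilinear transform),
    run forward over the samples. Attenuates high-frequency jitter."""
    k = math.tan(math.pi * cutoff_hz / sample_rate_hz)
    q = 1.0 / math.sqrt(2.0)                 # Butterworth quality factor
    norm = 1.0 / (1.0 + k / q + k * k)
    b0 = k * k * norm                        # feed-forward coefficients
    b1 = 2.0 * b0
    b2 = b0
    a1 = 2.0 * (k * k - 1.0) * norm          # feedback coefficients
    a2 = (1.0 - k / q + k * k) * norm
    out = []
    x1 = x2 = y1 = y2 = 0.0                  # filter state
    for x in signal:
        y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        out.append(y)
        x2, x1 = x1, x
        y2, y1 = y1, y
    return out
```

With a 100 Hz capture and a cutoff of around 6 Hz, slow movement passes through almost unchanged while frame-to-frame marker jitter is strongly attenuated.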
For big gaps you might see a grey marker, which indicates an unlabelled trajectory.
Select it in the 3d window and it will turn white and select the corresponding trajectory in the Unidentified Trajectories window.
Now drag the selected trajectory from the Unidentified Trajectories window onto the (in this case) RUT_SpineTop trajectory in the Labelled Trajectories window list above.
Now it should look like this – the spine top is orange in the 3D window and that gap has gone.
Sometimes this won’t work – if the unidentified section is longer than the missing section (because it is made of several parts). In this case hold ALT to select just a part of the unidentified section, like the image below.
Click + drag to select in the Trajectory Editor, then double-click in the timeline to jump to that time. In the 3D window, hold Ctrl + drag to scrub the timeline – use this method to see in 3D what you are doing. You can use the Follow Selected Markers button to centre the marker in the frame.
Important – at the end of cleaning a take you need to choose a T-pose frame, then hit Calibrate Skeletons (F10). This rebuilds the skeletons from the cleaned data and will reduce/eliminate popping. THEN SAVE!
Gibson/Martelli are undertaking a research residency exploring movement in avatars with motion capture at Gilles Jobin’s #Studios44MocapLab in Geneva. The artists have never before had the chance to work with Gilles and his team, or with a Qualisys mocap system. During the residency they will experiment with live-controlling avatars with mocap and then iterating in Virtual Reality, using virtual cameras and working with Company Gilles Jobin lead dancer Susana Panades Diaz and lead tech Pedro Ribot @pedro.ribot
…At the #Studios44MocapLab