A deep dive into the data driving creative AI technology.
Synthetic Humans Render Pipeline.
Core Tools:
Unreal Engine 5.4 (headless)
Python
Side Tools:
Ray Multiprocessing
Storage Tools:
Internal Cluster
The ins and outs (literally) of a BEDLAM-inspired synthetic render pipeline built on Unreal Engine 5.4.
There are many moving parts to a synthetic data pipeline: camera intrinsics and extrinsics, ground truth motion capture data, retargeting systems, environment variations, you name it. The name of the game is reliable ground truth variation. Modeled on the paper “Look Ma, No Markers” from the Microsoft Cambridge research team, this pipeline output approximately 1M images every 24 hours.
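As a minimal sketch of how that variation fan-out can be orchestrated (the job fields and worker function below are hypothetical, not the pipeline's actual schema), each render job pairs a camera preset with an environment and a mocap clip, and the jobs are distributed across workers. The real pipeline uses Ray for multiprocessing; a standard-library thread pool stands in here so the example is self-contained:

```python
import itertools
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass

@dataclass(frozen=True)
class RenderJob:
    """One headless render task (fields are illustrative)."""
    camera_id: int    # which intrinsics/extrinsics preset to load
    environment: str  # which environment variation to spawn
    clip: str         # which ground-truth mocap clip to play back

def make_jobs(cameras, environments, clips):
    """Cross every camera preset with every environment and clip."""
    return [RenderJob(c, e, m)
            for c, e, m in itertools.product(cameras, environments, clips)]

def render(job):
    # Stand-in for launching one headless UE 5.4 process for this job;
    # here it only returns the label an orchestrator might log.
    return f"cam{job.camera_id}/{job.environment}/{job.clip}"

def run_all(jobs, workers=4):
    """Fan the jobs out across a worker pool, preserving job order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render, jobs))
```

The cross-product is what buys the ground truth variation: every camera sees every environment and every clip, so downstream labels stay consistent while the imagery varies.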
Motion Capture Data Standardization Pipeline.
Core Tools:
Maya (headless)
FBX SDK
Python
Side Tools:
Blender (scripting)
Storage Tools:
Google Cloud Storage
3D motion data is expensive to get your hands on, complex once you do, and messy once you dig into the quality.
This project focused on unifying all of our acquired motion into our internal data structure for training our motion models.
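One standardization step can be sketched as follows, assuming a hypothetical internal format (the joint aliases and 30 fps target are illustrative, not the team's actual schema; the production pipeline reads the source data through the FBX SDK): remap vendor-specific joint names onto one internal skeleton, and resample keyframe channels to a common frame rate by linear interpolation:

```python
# Vendor joint names mapped onto an internal naming scheme
# (names here are illustrative, not the real internal schema).
JOINT_ALIASES = {"mixamorig:Hips": "pelvis", "Hips": "pelvis",
                 "mixamorig:Spine": "spine_01", "Spine": "spine_01"}

def normalize_joints(channels):
    """channels: {vendor_joint_name: [value per frame]} -> internal names."""
    return {JOINT_ALIASES.get(name, name): values
            for name, values in channels.items()}

def resample(values, src_fps, dst_fps):
    """Linearly resample one per-frame channel to the target frame rate."""
    duration = (len(values) - 1) / src_fps       # clip length in seconds
    n_out = int(round(duration * dst_fps)) + 1
    out = []
    for i in range(n_out):
        t = (i / dst_fps) * src_fps              # fractional source frame
        lo = min(int(t), len(values) - 2)
        frac = t - lo
        out.append(values[lo] * (1 - frac) + values[lo + 1] * frac)
    return out
```

Normalizing names first means the resampler, and everything downstream, only ever sees one skeleton convention, whatever the capture vendor used.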
Feet and Hands Pipeline.
Core Tools:
(internal)
Storage Tools:
Google Cloud Storage
IR Facial Capture and Audio Syncing Pipeline.
Core Tools:
Maya (headless)
FFmpeg
BWF MetaEdit (open source audio metadata)
Premiere Pro (scripting)
Storage Tools:
AWS S3
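A minimal sketch of the syncing math, under stated assumptions: BWF audio carries a TimeReference (samples since midnight) in its bext chunk, which BWF MetaEdit can read out, and the video carries a start timecode. The offset between the two tells FFmpeg how far to shift the audio when muxing (the helper names here are hypothetical; `-itsoffset` is a real FFmpeg flag that delays the input that follows it):

```python
def timecode_to_seconds(tc, fps):
    """Convert a HH:MM:SS:FF timecode string to seconds (non-drop-frame)."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return hh * 3600 + mm * 60 + ss + ff / fps

def audio_offset_seconds(bwf_time_reference, sample_rate, video_tc, fps):
    """Seconds to delay the audio so it lines up with the video start.

    bwf_time_reference is the bext chunk's sample count since midnight.
    """
    audio_start = bwf_time_reference / sample_rate
    video_start = timecode_to_seconds(video_tc, fps)
    return audio_start - video_start

def ffmpeg_mux_command(video, audio, offset, out):
    """Build an FFmpeg command that shifts the audio input by `offset` s."""
    return ["ffmpeg", "-i", video, "-itsoffset", f"{offset:.3f}",
            "-i", audio, "-map", "0:v", "-map", "1:a",
            "-c", "copy", out]
```

Keeping the offset computation in Python and only handing FFmpeg a single number keeps the mux step stateless and easy to batch across a shoot's worth of takes.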
Motion Capture Shoot and Processing Pipeline.
Core Tools:
Vicon Motion Capture System
Technoprops Head-Mounted Cameras
Maya (headless)
Python
Storage Tools:
AWS S3
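One recurring cleanup step in optical mocap processing is gap-filling: when a marker is occluded, the exported trajectory drops frames, and short gaps are commonly bridged by interpolating between the surrounding visible frames. A minimal sketch for a single coordinate track (the function name and data shape are illustrative, not the actual processing code):

```python
def fill_gaps(track):
    """Linearly interpolate None gaps in a 1-D marker coordinate track.

    Gaps bounded by visible frames on both sides are filled; leading
    and trailing gaps have no anchor on one side and are left as-is.
    """
    out = list(track)
    n = len(out)
    i = 0
    while i < n:
        if out[i] is None:
            start = i
            while i < n and out[i] is None:   # find the end of the gap
                i += 1
            if start > 0 and i < n:           # bounded on both sides
                a, b = out[start - 1], out[i]
                span = i - start + 1
                for k in range(start, i):
                    frac = (k - start + 1) / span
                    out[k] = a + (b - a) * frac
        else:
            i += 1
    return out
```

In practice longer gaps call for smarter fills (rigid-body or spline), but linear interpolation is the usual first pass before retargeting.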