High-Level MetaHuman Mocap Workflow: Professional Techniques for Unreal Engine 5 Animation

What is the best high-level mocap workflow for Metahuman in Unreal Engine 5?

The optimal MetaHuman mocap workflow for high-end productions integrates full performance capture, combining body, facial, and hand movements using advanced hardware like Xsens suits, Vicon optical systems, or Faceware for facial data, streamed via Live Link in Unreal Engine 5 (UE5) for real-time preview. This data is retargeted to MetaHuman characters, cleaned, and refined using UE5’s tools (IK Retargeter, Control Rig, Sequencer) or external software, with animators enhancing subtle details to achieve polished results, as exemplified by Treehouse Digital’s film The Well.

Which motion capture systems are compatible with Metahuman characters?

MetaHumans are designed to work with a wide range of motion capture systems. Essentially any mocap solution that outputs standard skeletal animation or ARKit facial blendshape data can be made compatible. This includes inertial suits, optical marker-based systems, camera-based facial capture, and even AI-driven markerless solutions. Here we compare some popular systems and their MetaHuman compatibility:

System | Capture Type | Real-Time Support | Notes
Xsens (inertial) | Full body (optional fingers via gloves) | Yes – Live Link plugin | High-end inertial suit with excellent accuracy and stability. Requires Xsens MVN software; higher cost but minimal drift.
Rokoko (inertial) | Full body (plus Rokoko Smartgloves; face via iPhone app) | Yes – Rokoko Studio Live | Affordable mocap suit for indies. Integrated ecosystem, but can suffer magnetic interference on long sessions. Good for game dev and short takes.
Vicon / OptiTrack (optical) | Full body (markers) | Limited real time (usually processed offline) | Top-tier optical systems with very high fidelity. Expensive and require a dedicated camera volume, but standard in high-end film/VFX. Vicon's Shōgun software was used with MetaHumans on projects like AKUMA.
Live Link Face (ARKit) | Facial (iPhone/iPad) | Yes – via Wi-Fi or USB | Uses Apple ARKit blendshapes from a TrueDepth camera. Very accessible (requires iPhone X or newer). Great for real-time facial puppetry, though less customizable than pro solutions.
Faceware (video-based) | Facial (camera or head-mounted camera) | Yes – with Faceware Studio plugin | Professional facial capture from a single camera. Offers more control over face data than ARKit. Often used with head-rig cameras; Faceware Studio streams directly into UE. Widely adopted in AAA games and VFX.
StretchSense / Manus (gloves) | Hands and fingers | Yes – via plugins | High-fidelity finger tracking used alongside body suits. Captures intricate hand motions for full performance capture.
Move.ai (markerless) | Full body (from video) | No – offline processing | AI-driven mocap from 2D video. No suit or markers needed, but currently no finger capture and not real time. Useful for quick previz or where suits aren't available.
Perception Neuron (inertial) | Full body | Yes – Axis Studio plugin or BVH export | Budget-friendly inertial suit. Decent results but less consistent (fewer published case studies), and often needs more cleanup.

All of the systems above work with MetaHumans, either streaming in real time through UE5's Live Link or delivering recorded data as FBX. Because MetaHumans use Epic's standard skeleton and an ARKit-compatible facial rig, most systems integrate with minimal setup, whether the data is applied live or in post.


How do I set up a professional mocap pipeline with Metahuman in UE5?

To establish a professional mocap pipeline for MetaHumans in UE5, configure hardware and UE5 project settings as follows:

  • Prepare Your MetaHuman in UE5: Create or download your MetaHuman character (via MetaHuman Creator and Quixel Bridge) and import it into your UE5 project. MetaHumans come fully rigged and are performance-capture-ready out of the box, which saves setup time. Ensure the MetaHuman Blueprint and assets are added to your scene.
  • Enable Necessary Plugins: In UE5, enable the plugins for Live Link and any specific mocap devices. At minimum, activate Live Link, Live Link Control Rig, Apple ARKit, and ARKit Face Support if doing facial capture. Also enable device-specific plugins (e.g. Xsens, Rokoko, Faceware Live Link plugins) as needed. Restart the editor after enabling plugins.
  • Connect Motion Capture Hardware: Set up your mocap hardware and software. Calibrate the system (e.g. suit calibration or camera calibration). Then establish a Live Link connection to Unreal for real-time streaming: for example, run Rokoko Studio or Xsens MVN and output to Live Link, or use the Faceware Live client to stream facial data. In UE5, open the Live Link panel and add your sources (they will appear if on the same network or via local link). You should see entries for the body tracker (e.g. suit) and facial source (e.g. iPhone or Faceware).
  • Assign Live Link Sources to the MetaHuman: MetaHumans use a body skeletal mesh and a face skeletal mesh. In the MetaHuman’s Anim Blueprint or in Sequencer, assign the Live Link data appropriately. For instance, you might set the body mesh to be driven by the mocap suit source (via an IK Retargeter or an animation blueprint that maps the incoming stream to the MetaHuman skeleton). The face mesh can be driven by the ARKit Live Link Face source – the MetaHuman facial rig is already set up to listen for ARKit blendshape signals. Ensure the subject names match what the MetaHuman expects (the default MetaHuman face AnimBP listens for a Live Link subject named “iPhone” by default, which you can change if needed).
  • Test Real-Time Playback: With Live Link active, you should see your MetaHuman moving in the editor as you move in the mocap suit or make facial expressions. This real-time preview is invaluable for a professional pipeline, allowing quick iteration and troubleshooting. Check for any obvious retargeting issues (like misoriented limbs or latency in face movement) and adjust settings accordingly (e.g., retarget pose, Live Link subject calibration).
  • Recording Performances: For production, use Take Recorder or Sequencer in UE5 to record the Live Link streams. This will save the animation to an asset (or take) that you can playback and edit. If capturing body and face simultaneously, record them together so they stay in sync (or use a common timecode across devices). You can also record audio in Take Recorder alongside the animation for easy syncing.
  • Retarget or Refine as Needed: Depending on the source, you might retarget the recorded animation. If you streamed directly to the MetaHuman rig, it may already be applied. In other cases, you might import an FBX animation (from an offline capture) and need to retarget it using UE5’s IK Retargeter to the MetaHuman skeleton. Ensure the MetaHuman is in the correct rest pose (UE5 MetaHumans use a specific A-pose) when retargeting so that limbs align correctly.
  • Apply Control Rig for Tweaks: For a professional pipeline, plan to refine the animation. UE5’s Control Rig can be used on MetaHumans to adjust the motion after capture. For example, you can add an IK Control Rig to tweak foot placement, or use the Facial Control Board to clean up facial animations. This step integrates the animator’s artistry on top of raw mocap.

This pipeline enables seamless performance capture, with real-time previews and iterative refinement ensuring high-quality MetaHuman animations. Dry runs and calibration are critical to validate data accuracy before final takes, streamlining game development and virtual production.
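
As part of those dry runs, it can help to script a quick preflight check from the UE5 editor's Python console. The sketch below only verifies that the expected MetaHuman assets exist before a capture session; the asset paths are hypothetical placeholders for your own character, and the editor's Python scripting plugin must be enabled.

```python
# Minimal preflight sketch for the UE5 editor Python console.
# The asset paths are hypothetical; substitute your own MetaHuman's folder.
import unreal

METAHUMAN_ASSETS = [
    "/Game/MetaHumans/Ada/BP_Ada",              # character Blueprint (hypothetical path)
    "/Game/MetaHumans/Ada/Face/Face_AnimBP",    # face Anim Blueprint (hypothetical path)
    "/Game/MetaHumans/Ada/Body/m_med_nrw_body", # body skeletal mesh (hypothetical path)
]

def preflight_check(asset_paths):
    """Warn about any MetaHuman asset missing from the project before a capture session."""
    missing = [p for p in asset_paths if not unreal.EditorAssetLibrary.does_asset_exist(p)]
    for path in missing:
        unreal.log_warning(f"Missing MetaHuman asset: {path}")
    if not missing:
        unreal.log("All MetaHuman assets found - ready for Live Link capture.")
    return not missing

preflight_check(METAHUMAN_ASSETS)
```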


Can I combine body and facial mocap for a full Metahuman performance?

Yes, MetaHumans support full performance capture by combining body and facial mocap, either simultaneously or in separate passes, synchronized via timecode or manual alignment:

  • Simultaneous Capture: The actor wears a mocap suit for the body and a head-mounted camera or iPhone for the face, as in The Well, where Xsens suits and Faceware headcams drove MetaHuman body and face rigs in real time. Both streams feed into UE5 via Live Link and drive the character's body and facial rigs concurrently, with timecode keeping every device frame-accurate, so the two halves of the performance stay naturally connected.
  • Separate Capture: Body and face are recorded in separate passes and synced afterwards with timecode or a clap slate, which allows flexibility in equipment and scheduling. In UE5's Sequencer the takes are aligned on the MetaHuman's separate body and face tracks, with MetaHuman Animator simplifying the facial side; playing back the body take while recording the face helps the actor match the original timing.

Epic's MetaHuman Animator raises facial fidelity further and syncs cleanly with body mocap via timecode. Whichever approach you choose, timecode-based synchronization is the most reliable way to keep body, face, and audio frame-accurate for a coordinated, lifelike performance.

What are the steps to retarget mocap data to Metahuman in Unreal Engine?

To retarget mocap data to a MetaHuman in UE5, follow these steps to map animation from a source skeleton to the MetaHuman skeleton using the IK Retargeter:

  • Import or Stream the Mocap Animation: Import the mocap as an FBX or record it through Live Link so you have an Animation Sequence in UE5, and note which source skeleton it uses (e.g., Rokoko, Xsens, or the UE Mannequin). Identifying the skeleton correctly up front prevents mapping errors later.
  • Set Up IK Rigs: Create an IK Rig asset for the source skeleton and one for the MetaHuman skeleton, defining the retarget chains for spine, arms, and legs. Bone names often differ between systems, so verify the hierarchies carefully; misaligned mappings cause distortion (a simple bone-name mapping sketch follows these steps).
  • Create an IK Retargeter: Create an IK Retargeter asset linking the source IK Rig to the MetaHuman IK Rig, align the rest poses (e.g., matching A-poses), and load the animation to preview source and target side by side. Automatic alignment helps, but manual pose tweaks are often needed to avoid retargeting artifacts.
  • Retarget the Animation: Generate a new Animation Sequence on the MetaHuman skeleton, then inspect it for issues such as sliding feet or twisted arms and adjust IK goals or chain mappings as needed. Batch retargeting can process multiple clips at once.
  • Apply to MetaHuman: Assign the retargeted animation through the Animation Blueprint, directly on the skeletal mesh, or in Sequencer. Because MetaHumans use Epic's standard skeleton, proportions are preserved and scaling issues are rare, but still verify the result in context.
  • Retarget Facial Data (if any): ARKit-based facial animation applies directly to the MetaHuman face rig, which already exposes the matching blendshape curves; non-ARKit data must be remapped, for example with Faceware Retargeter, onto the MetaHuman's 52 ARKit-style curves. Facial retargeting is handled separately from the body.
  • Verify and Iterate: Play the result back in context, fix remaining issues such as feet penetrating the floor with Control Rig or additive animation layers, and iterate until the performance reads cleanly.

This workflow, supported by provider plugins like Rokoko or Xsens, ensures efficient retargeting, enabling reusable animations across MetaHumans with minimal adjustments.
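
To make the bone-mapping step concrete, here is a minimal sketch of how a source-to-MetaHuman bone correspondence can be written down before building the IK Rig chains. The Mixamo and UE5 bone names are standard, but the pairing below is illustrative rather than an official Epic mapping table.

```python
# Illustrative bone-name correspondence between a Mixamo-style source skeleton
# and the MetaHuman/UE5 body skeleton, used as a reference while defining
# IK Rig retarget chains. Not exhaustive and not an official mapping.
MIXAMO_TO_METAHUMAN = {
    "mixamorig:Hips":         "pelvis",
    "mixamorig:Spine":        "spine_01",
    "mixamorig:Spine1":       "spine_02",
    "mixamorig:Spine2":       "spine_03",
    "mixamorig:Neck":         "neck_01",
    "mixamorig:Head":         "head",
    "mixamorig:LeftShoulder": "clavicle_l",
    "mixamorig:LeftArm":      "upperarm_l",
    "mixamorig:LeftForeArm":  "lowerarm_l",
    "mixamorig:LeftHand":     "hand_l",
    "mixamorig:LeftUpLeg":    "thigh_l",
    "mixamorig:LeftLeg":      "calf_l",
    "mixamorig:LeftFoot":     "foot_l",
    # Mirror the right side the same way (clavicle_r, upperarm_r, ...).
}

def unmapped_bones(source_bones, mapping=MIXAMO_TO_METAHUMAN):
    """Return source bones with no MetaHuman target yet, i.e. candidates to map or exclude."""
    return [bone for bone in source_bones if bone not in mapping]
```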


How do I record real-time facial motion for Metahuman characters?

Recording real-time facial motion for MetaHumans typically relies on camera-based systems streaming into UE5:

  • Live Link Face (iOS): An ARKit-capable iPhone running the Live Link Face app captures the 52 ARKit blendshapes plus head rotation from the TrueDepth camera and streams them to UE5, driving the MetaHuman face rig live for instant feedback. Record the performance with Take Recorder to save it as an editable animation asset.
  • Head-Mounted Camera with Faceware or Similar: A head-mounted camera feeding Faceware Studio streams high-fidelity facial animation into UE5 through the Faceware Live Link plugin. It offers more tracking control than ARKit, suits nuanced professional-grade performances, and records in real time like any other Live Link source.
  • MetaHuman Animator Workflow: Record the facial performance with an iPhone or a stereo head-mounted camera and process it with MetaHuman Animator, which turns the footage into high-fidelity facial animation in minutes. It is not strictly real time, but the turnaround is fast enough for on-set iteration.
  • Recording the Take: Use Take Recorder to capture the facial animation and the actor's audio together so lip-sync stays locked. Good lighting and careful calibration minimize tracking errors, and the real-time preview lets actors adjust their performance on the spot.

With that preparation in place, facial capture stays low-latency and nuanced, supporting rapid iteration and lifelike MetaHuman performances.

What tools can I use to clean mocap data before applying it to a Metahuman?

To clean raw mocap data for MetaHumans, address noise, jitter, and artifacts using these tools:

  • MotionBuilder or Maya: Import the FBX and use MotionBuilder's filters and foot-contact solving, or Maya's HumanIK and manual keying, to reduce jitter, fix intersections, and blend takes into clean animation curves. This is the usual route for studios that do heavy cleanup before data reaches UE5.
  • Blender: Use the Graph Editor and NLA strips to smooth noisy curves and adjust timing, then export a cleaned FBX to UE5. Community add-ons help with mocap cleanup, though complex fixes may still need manual keying; it is a cost-effective option.
  • Rokoko Studio, Xsens MVN, and other provider software: Apply the built-in smoothing filters and adjust contact points in the mocap software itself before exporting. Cleaning at the source minimizes the refinement left to do in UE5.
  • Unreal Engine's Control Rig and Animation Editor: Adjust foot placement and smooth jitter with Control Rig, and apply curve filters such as the Euler filter in the animation editor to fix rotation flips and noise. In-engine cleanup gives immediate feedback in context without round-tripping to external software.
  • Face-specific cleanup tools: Faceware Retargeter can tame noisy eyebrow curves and rebalance expression intensity, while UE5's facial pose tools let you smooth blendshape curves and adjust expressions before the animation is applied.

A combined approach (initial cleanup in the mocap software, final refinements in UE5) removes artifacts while preserving the nuances of the performance; keep backups of the raw data so you can always return to the source. A minimal smoothing sketch follows.
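
As a simple illustration of what these filters do, the sketch below smooths a noisy per-frame curve with a centered moving average. Real cleanup tools use more sophisticated filters (Butterworth, Savitzky-Golay, Euler filtering), but the principle of trading a little sharpness for stability is the same; the window size here is just an example.

```python
def smooth_curve(values, window=5):
    """Centered moving average over per-frame float values (e.g. one joint rotation channel).

    A small window removes high-frequency jitter while keeping the overall motion;
    a larger window smooths more but softens fast, intentional movements.
    """
    if window < 2:
        return list(values)
    half = window // 2
    smoothed = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        smoothed.append(sum(values[lo:hi]) / (hi - lo))
    return smoothed

# Example: jittery elbow rotation samples (degrees) captured at 60 FPS.
raw = [41.0, 43.5, 40.8, 44.2, 42.9, 47.1, 45.0, 46.3, 49.8, 48.2]
print(smooth_curve(raw))
```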


How do I blend mocap animation with hand-keyed motion in a Metahuman rig?

To blend mocap with hand-keyed animation for MetaHumans, use these UE5 methods to combine realism and artistic control:

  • Sequencer with Animation Layers: Stack the mocap track and keyframed tracks in Sequencer, blending additively or overriding specific motions such as a head turn. Control Rig tracks can be keyed directly on top, making targeted adjustments seamless.
  • Animation Blueprint Layering: Use animation layers or blend nodes (e.g., Layered Blend Per Bone) so mocap drives some parts of the MetaHuman while hand-animated states drive others, such as a captured body with keyframed arm gestures. This keeps the realism of capture while leaving room for authored motion.
  • Partial Blending via Control Rig: Apply the mocap, then keyframe specific controls (neck, eyes, a hand) through Control Rig to override or enhance those motions, using blend weights for smooth transitions. This suits subtle performance fixes.
  • Hand-Keyed Refinement of Facial Mocap: Keep the mocap as the base facial performance and keyframe extra expressions on the Facial Control Board for polish, such as strengthening a smile or correcting a misread phoneme, without losing the broad strokes of the capture. This works well for fine-tuning lip-sync and emotion.

Focus hand-keying on areas like fingers or eyes where mocap may lack precision, using smooth transitions via Sequencer easing or AnimBP weights to maintain realism and hit storytelling beats.
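
Under the hood, every one of these approaches comes down to a weighted blend between the captured value and the keyed value on each frame. The sketch below shows that blend for a single animation channel in plain Python; in UE5 the layer weights and easing curves do the same per bone or per control, and the smoothstep easing used here is just one reasonable choice.

```python
def ease_in_out(t):
    """Smoothstep easing from 0 to 1 with zero slope at both ends, so the blend has no pops."""
    return t * t * (3.0 - 2.0 * t)

def blend_channel(mocap, keyed, blend_start, blend_end):
    """Cross-fade one channel from mocap to hand-keyed values over [blend_start, blend_end] frames."""
    blended = []
    for frame, (m, k) in enumerate(zip(mocap, keyed)):
        if frame <= blend_start:
            weight = 0.0
        elif frame >= blend_end:
            weight = 1.0
        else:
            weight = ease_in_out((frame - blend_start) / (blend_end - blend_start))
        blended.append((1.0 - weight) * m + weight * k)  # 0 = pure mocap, 1 = pure keyframe
    return blended
```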

How do I use Control Rig in UE5 to refine Metahuman mocap animations?

To refine MetaHuman mocap animations using UE5’s Control Rig, follow these steps to enhance performance quality:

  • Bake Mocap to Control Rig: In Sequencer, bake the mocap animation to the MetaHuman's body or face Control Rig, converting it into editable keyframes on the rig controls. This lets you refine directly in-engine without altering the raw capture data.
  • Refine Using Rig Controls: Adjust specific controls, such as hand IK to stop a prop intersection or the eye aim to fix an incorrect gaze, instead of wrestling with individual bones. Treehouse Digital used this on The Well to keep eyelines natural.
  • Clean and Smooth: Flatten or simplify keys on jittery controls, and adjust torso or shoulder controls to keep posture consistent across a take. The animator-friendly controls make this much faster than raw curve editing while preserving the performance.
  • Enhance Posing and Timing: Exaggerate motions, improve silhouettes, or add secondary action such as arm overlap and better foot placement for cinematic polish. On The Well the team amplified lip movements this way, all non-destructively.
  • Blend Between Rig and Original Animation: Keyframe only the segments that need work, or layer additive tweaks over the capture, and bake back to an Animation Sequence when you are done. The original mocap stays available as a fallback.

Control Rig enables non-destructive, in-engine animation polishing, allowing rapid iteration and high-quality MetaHuman performances without external tools.


Can I use Faceware or Live Link Face in a high-end Metahuman workflow?

Both Faceware and Live Link Face are viable for high-end MetaHuman workflows, offering distinct advantages:

Faceware provides professional-grade facial capture with customizable tracking, streaming via its Live Link plugin, and was used effectively in The Well for nuanced results requiring minimal post-processing. Its ability to fine-tune data and support high-resolution video makes it ideal for precision-driven projects, integrating seamlessly with UE5’s MetaHuman rig.

Live Link Face, using iPhone’s ARKit, delivers accessible real-time capture suitable for previz or final shots in professional settings, enhanced by MetaHuman Animator for higher fidelity. While less customizable than Faceware, its speed and reliability support rapid iteration, with studios leveraging it for fast, effective facial animation in MetaHuman pipelines.

Both tools can also be used together: ARKit for quick on-set puppetry and Faceware for detailed hero shots, giving flexible, high-quality MetaHuman performances tailored to project needs.

What are the best practices for syncing audio and mocap data with Metahuman?

To sync audio with mocap for MetaHumans, follow these practices for seamless performance:

  • Use Timecode Across Devices: Feed SMPTE timecode to every device (mocap system, facial capture, audio recorder) so takes align automatically in UE5's Sequencer; MetaHuman Animator and Take Recorder both respect timecode. A timecode generator is the cleanest way to set this up (a small timecode-to-frame sketch follows this list).
  • Clap or Slate for Sync: When timecode is unavailable, a clapboard or hand clap creates a clear audio and visual marker, and Live Link Face's slate feature helps too. Match the clap frames manually in Sequencer during post.
  • Record Audio in Unreal (Take Recorder): Capturing audio alongside the mocap in Take Recorder puts animation and voice in the same Take, perfectly synced with no manual alignment.
  • Maintain Consistent Frame Rates: Keep the capture frame rate consistent with, or a clean multiple of, the project frame rate (e.g., 24 FPS output) to prevent drift on long takes, and adjust animation timing if the audio starts to slip.
  • Lip-Sync Check: Verify that visemes land on the right phonemes, especially plosives, and nudge tracks a frame or two in Sequencer to fix small offsets; even timecoded material can need a minor touch-up.
  • Syncing Body Mocap to Audio: Play the reference audio during capture so the actor's gestures are timed to the dialogue, then attach the same audio in Sequencer so body actions land naturally on the audio cues.
  • Don't Rely on Just One Take: Record extra padding and do test runs so sync problems surface before the critical takes. It is far cheaper than retakes and keeps the pipeline reliable.

These practices ensure audio and mocap align perfectly, delivering realistic MetaHuman performances with synchronized dialogue and actions.
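
When sources are aligned by timecode, the practical operation is converting each clip's start timecode into a frame count and offsetting the tracks by the difference. A small sketch of that arithmetic, assuming non-drop-frame timecode at a fixed frame rate:

```python
def timecode_to_frames(tc, fps):
    """Convert non-drop-frame SMPTE timecode 'HH:MM:SS:FF' into an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return (hh * 3600 + mm * 60 + ss) * fps + ff

def track_offset_frames(body_start_tc, audio_start_tc, fps=60):
    """Frames the audio track must be shifted to line up with the body mocap track."""
    return timecode_to_frames(body_start_tc, fps) - timecode_to_frames(audio_start_tc, fps)

# Example: body capture started at 01:02:10:30, the audio recorder at 01:02:08:00, both at 60 FPS.
print(track_offset_frames("01:02:10:30", "01:02:08:00"))  # -> 150 frames (2.5 seconds)
```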


How do I optimize mocap performance for cinematic Metahuman sequences?

Optimizing mocap for cinematic MetaHuman sequences balances engine efficiency and motion quality through two approaches:

1. Engine Performance Optimization:

Bake animations to Sequencer, disable Live Link, and use lower LODs for distant characters to reduce CPU load. Simplify Anim Blueprints, limit physics simulations during capture, and use Movie Render Queue for high-quality offline rendering to maintain smooth performance.

2. Motion Quality Optimization:

  • Higher Frame Capture: Capture at 60 FPS (or higher) so fast movements stay smooth and free of strobing when played back at a cinematic 24 FPS. This applies to both body and facial mocap.
  • Dramatic Timing vs. Real Time: Adjust the mocap's timing in Sequencer for pacing, stretching or compressing moments for dramatic effect while keeping the motion feeling natural.
  • Exaggerate or Refine for Camera: Use Control Rig to amplify motions or add secondary actions, such as a shoulder drop, so the performance reads clearly in the shot composition, from wide shots to close-ups.
  • Multiple Takes and Edits: Splice the best segments from several takes in Sequencer into one composite performance, blending carefully at the joins so transitions stay invisible, much like editing live-action coverage.
  • Preview in Context: Review the animation with the final camera angles and lighting so you can adjust gestures for the framing, trim motion the camera never sees, and catch shadow or composition problems early.

This dual approach ensures efficient, high-quality MetaHuman cinematics, leveraging baked animations and artistic refinements for impactful sequences.

How do I handle facial animation blending between mocap and animator controls?

To blend facial mocap with animator controls for MetaHumans, use these UE5 strategies for nuanced performances:

  • Layered Facial Tracks in Sequencer: Stack the mocap face track with keyframed tracks, blending additively or masking specific curves such as an eye squint, so manual adjustments sit cleanly on top of the capture.
  • Control Rig (Facial Control Board): Bake the mocap onto the Facial Control Board, then keyframe on top to push or correct expressions, as Treehouse Digital did on The Well. The board's sliders make precise tweaks intuitive while the mocap base stays intact.
  • Blend Masks: Isolate parts of the face, for example the eyes, so they can be hand-animated while the mocap keeps driving the mouth and cheeks (illustrated in the sketch after this list). This makes targeted overrides simple and predictable.
  • Keep Rig and Mocap from Conflicting: Give each region a single owner so layered inputs do not stack into exaggerated jaw movement or wash out detail; a common split is mocap for lip-sync, hand keys for the eyes.
  • Transitioning Between Mocap and Hand Animation: Use short overlaps in Sequencer and fade the weights gradually so the face never pops when switching between captured and authored sections.

These methods are common in studios: a typical split is roughly 80% mocap-driven facial animation with 20% hand-polished detail, which keeps MetaHuman performances both precise and lifelike.
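
A blend mask is essentially a per-curve weight that decides how much of each source wins. The sketch below illustrates that idea on a handful of ARKit-style curve names; the weights and helper function are illustrative, not part of any Epic API.

```python
# Illustrative per-curve blend mask: 0.0 keeps the mocap value, 1.0 takes the hand-keyed value.
# Curve names follow the ARKit blendshape naming that the MetaHuman face rig consumes.
FACE_MASK = {
    "eyeBlinkLeft":   1.0,  # eyes fully hand-keyed for art direction
    "eyeBlinkRight":  1.0,
    "eyeLookUpLeft":  1.0,
    "jawOpen":        0.0,  # lip-sync stays fully mocap-driven
    "mouthSmileLeft": 0.3,  # slight manual boost layered over the capture
}

def blend_face_frame(mocap_frame, keyed_frame, mask=FACE_MASK):
    """Blend one frame of facial curves (dicts of curve name -> value) according to the mask."""
    blended = {}
    for curve, mocap_value in mocap_frame.items():
        weight = mask.get(curve, 0.0)  # unmasked curves default to pure mocap
        keyed_value = keyed_frame.get(curve, mocap_value)
        blended[curve] = (1.0 - weight) * mocap_value + weight * keyed_value
    return blended
```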


What frame rate and resolution are ideal for recording mocap for Metahuman?

For optimal MetaHuman mocap recording, use these settings:

  • Body Mocap Frame Rate: Capture at 60 FPS for smooth, nuanced motion, and go to 120 FPS for very fast actions if your system supports it. Higher rates prevent blur and strobing at 24 FPS playback, at the cost of more data.
  • Facial Mocap Frame Rate and Resolution: Record at 60 FPS; 720p captures lip and eye movement well, while 1080p gives finer detail for higher-quality solves. ARKit on an iPhone uses the TrueDepth camera's fixed native resolution.
  • Camera Resolution for Facial Video: For head-mounted cameras, 720p at 60 FPS is Faceware's recommended baseline; 1080p captures finer skin detail if storage and tracking time allow.
  • Consistency with Output Frame Rate: Capture at roughly 2x or 2.5x the delivery rate (e.g., 60 FPS for 24 FPS output) so down-sampling stays smooth and fast motions avoid aliasing; clean multiples simplify the conversion (see the sketch after this list).
  • Motion Capture Data Resolution: Use the highest precision your system supports; optical systems can track sub-millimetre movement, while inertial systems depend on thorough calibration. Accurate source data simplifies retargeting and post-processing cleanup.
  • Hand/Finger Capture Frequency: 60 FPS is enough for most finger motion, while 120 FPS helps with rapid movements like piano playing. Keep the glove data synced with the body capture so there is no offset.

These settings, 60 FPS at 720p or higher, deliver smooth, high-fidelity MetaHuman animations with minimal post-processing.
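
Down-sampling a 60 FPS capture onto a 24 FPS timeline amounts to sampling the source at the target's frame times. A minimal sketch of that resampling using nearest-frame selection (engines typically interpolate between neighbouring frames instead, but the mapping is the same):

```python
def resample_frames(source_frame_count, source_fps=60, target_fps=24):
    """Map each target-rate frame to the nearest source-rate frame index."""
    duration = source_frame_count / source_fps          # clip length in seconds
    target_count = round(duration * target_fps)
    return [min(round(i * source_fps / target_fps), source_frame_count - 1)
            for i in range(target_count)]

# A 2-second, 120-frame capture at 60 FPS becomes 48 frames at 24 FPS,
# sampling roughly every 2.5th source frame.
indices = resample_frames(120)
print(len(indices), indices[:6])  # -> 48 [0, 2, 5, 8, 10, 12]
```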

Can Metahuman Animator be integrated into a high-level mocap workflow?

Yes, MetaHuman Animator enhances high-level mocap workflows by streamlining facial animation:

  • Capture Facial Performance with MetaHuman Animator: Record the facial video with an iPhone or a stereo head-mounted camera and let MetaHuman Animator process it; it delivers high-fidelity MetaHuman facial animation in minutes with minimal setup, making on-set iteration practical.
  • Simultaneous Body Capture: Run body mocap at the same time and tie everything together with timecode, as Epic demonstrated in Blue Dot. The body side of the pipeline does not change.
  • Apply and Combine: Apply the Animator's facial animation and the body mocap to the MetaHuman and line them up on a shared timeline in Sequencer for one cohesive performance.
  • Edit and Refine: The Animator's output uses standard animation curves, so minor tweaks are easy in UE5, and the quality is high enough that cleanup is minimal; it can even derive tongue motion from the audio.

MetaHuman Animator integrates as an advanced facial capture tool, complementing body mocap and reducing processing time, making it a valuable addition to professional pipelines.


What file formats work best when importing mocap into Unreal Engine 5 for Metahuman?

FBX is the most reliable format for importing mocap into UE5 for MetaHumans, supporting skeletal and blendshape animations:

  • FBX Workflow: Export the mocap as FBX from software such as Rokoko Studio and import it into UE5, either directly onto the MetaHuman skeleton or onto its source skeleton for retargeting. FBX carries both skeletal and blendshape animation, making it the most dependable bridge from mocap software.
  • Other Formats:
    • BVH: UE5 has no native BVH import, so convert to FBX first with Blender or MotionBuilder (see the sketch after this list). Useful for older mocap libraries.
    • Alembic (ABC): Avoid it for character animation; it bakes caches and loses the editable MetaHuman rig. Keep Alembic for geometry caches and stick to FBX for mocap.
    • USD: Can carry animation but is less straightforward than FBX in this pipeline; make sure the skeleton and blendshapes map correctly if you use it.
    • Live Link / Take Recorder: Recording in-engine bypasses external files entirely, saving animations as native UE5 assets; FBX is only needed when sharing data with other software.

FBX’s robust UE5 support makes it the best choice for importing body and facial mocap, with provider plugins enhancing integration.
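
For the BVH route, a headless Blender conversion is a common approach. A minimal sketch, assuming Blender's bundled BVH importer and FBX exporter and run as blender --background --python convert_bvh.py; the file paths are placeholders, and retargeting in UE5 still happens after import:

```python
# convert_bvh.py - minimal BVH-to-FBX conversion sketch for Blender's bundled Python.
# Run headless:  blender --background --python convert_bvh.py
import bpy

SOURCE_BVH = "/path/to/capture.bvh"   # placeholder path
TARGET_FBX = "/path/to/capture.fbx"   # placeholder path

# Start from an empty scene so only the imported armature ends up in the FBX.
bpy.ops.wm.read_factory_settings(use_empty=True)

# Import the BVH as an armature with its baked keyframes.
bpy.ops.import_anim.bvh(filepath=SOURCE_BVH)

# Export the armature and its animation to FBX for UE5.
bpy.ops.export_scene.fbx(filepath=TARGET_FBX, use_selection=False, add_leaf_bones=False)
```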

How do I manage multiple characters in a mocap-driven Metahuman scene?

To manage multiple MetaHuman characters with mocap in UE5, coordinate capture and application as follows:

  • Use Multiple Mocap Setups or Iterative Recording: Capture actors together with separate suits and facial rigs, each streaming as its own Live Link source, or record performances one at a time against playback of earlier takes. Simultaneous capture gives the most natural interaction; iterative capture works when hardware is limited.
  • Syncing Performances: When performances are captured separately, align them with timecode or a shared reference audio track, and play back earlier takes so reactions land on cue. Sequencer then lines the tracks up into one coherent scene.
  • Scene Setup in Unreal: Point each MetaHuman's Anim Blueprint or Sequencer track at its own mocap source and use clear, consistent naming so sources never get crossed (a small naming sketch follows this list). Keeping each character's tracks independent makes multi-character scenes manageable.
  • Interactions and Contact: Fix hand contacts, hugs, and eyelines after the fact with IK rigs or Control Rig so physical interactions actually connect on screen and close contact does not visually break.
  • Performance and Organization: Use lower LODs during capture to keep the editor responsive, keep assets and sources clearly named, and split very complex scenes into parts. This keeps both real-time previews and final renders smooth.

This approach, used in The Well, treats each MetaHuman as a separate feed, synced via timecode, enabling ensemble scenes akin to live-action productions.
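
For the naming side, even a small lookup kept with the project helps avoid crossed sources. The sketch below is purely illustrative bookkeeping; the character and Live Link subject names are hypothetical.

```python
# Illustrative bookkeeping for a multi-character scene: each MetaHuman is paired with
# its Live Link body and face subject names (all names here are hypothetical).
CHARACTER_SOURCES = {
    "MH_Actor_A": {"body": "XsensSuit_01", "face": "iPhone_A"},
    "MH_Actor_B": {"body": "XsensSuit_02", "face": "iPhone_B"},
}

def find_duplicate_subjects(character_sources):
    """Return Live Link subject names accidentally assigned to more than one character."""
    seen, duplicates = {}, []
    for character, subjects in character_sources.items():
        for role, subject in subjects.items():
            if subject in seen and seen[subject] != character:
                duplicates.append((subject, seen[subject], character))
            seen[subject] = character
    return duplicates

print(find_duplicate_subjects(CHARACTER_SOURCES))  # -> [] when every subject is unique
```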


What are common issues in high-level Metahuman mocap workflows and how to fix them?

Common issues in MetaHuman mocap workflows and their solutions include:

  • Retargeting Artifacts (Twisted or Distorted Limbs): Usually a rest-pose mismatch. Align the A-pose or T-pose in the IK Retargeter, set IK goals to stop limbs sliding, and if the mesh still deforms oddly, try bypassing the post-process Control Rig while debugging.
  • Foot Sliding or Incorrect Foot Plant: Pin the feet with IK foot locking in Control Rig, or fix contacts with MotionBuilder's solver before import, and align root motion so the character does not drift off its ground contacts.
  • Jittery or Noisy Motion: Apply smoothing filters in the mocap software, in UE5, or in Blender/Maya to knock down sensor noise; higher-quality hardware such as Xsens produces less of it in the first place. Smooth conservatively so the performance is not flattened.
  • Magnetic Interference / Drift (Inertial Suits): Calibrate frequently, keep takes short, and stay away from large metal structures; apply drift correction in post to restore alignment when it does creep in.
  • Facial Animation Not Matching Audio (Sync Issues): Scrub the audio to match lip closures to plosives, rely on timecode where available (MetaHuman Animator respects it), and nudge tracks in Sequencer to remove residual offsets.
  • Overshooting or Undershooting Expressions: Clamp overshot curves on the Face Control Rig or tune expression sensitivity in Faceware Studio, and use additive tracks to boost expressions that come through too weak (see the clamping sketch after this list).
  • MetaHuman-Specific Quirks: Disable auto eye focus if it fights the animation, lock the face to LOD0 so the full facial rig stays active, and make sure head and body assets are merged correctly to avoid eye pops, neck disconnects, or hair glitches.
  • Performance Drops with Multiple Characters: Drop to lower LODs during capture, record characters sequentially if the scene is too heavy, and render final frames offline with Movie Render Queue so quality does not depend on real-time frame rates.

Testing pipelines with small scenes catches issues early, leveraging community solutions for efficient fixes in MetaHuman mocap workflows.
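
For the overshoot and undershoot cases specifically, the fix often amounts to clamping or rescaling a curve's values before they reach the rig. A tiny illustration of both on plain per-frame values; the curve names and gain are illustrative:

```python
def clamp_curve(values, lo=0.0, hi=1.0):
    """Hard-limit a blendshape curve so overshoots (e.g. jawOpen spiking past 1.0) are cut off."""
    return [min(max(v, lo), hi) for v in values]

def scale_curve(values, gain=1.5, hi=1.0):
    """Boost an undershot curve by a gain factor, then clamp so it still stays in range."""
    return [min(v * gain, hi) for v in values]

jaw_open = [0.2, 0.7, 1.15, 0.9, 1.3, 0.4]    # overshooting capture values
smile    = [0.1, 0.25, 0.3, 0.2, 0.15, 0.05]  # expression reading too weak

print(clamp_curve(jaw_open))  # -> [0.2, 0.7, 1.0, 0.9, 1.0, 0.4]
print(scale_curve(smile))     # boosted, still capped at 1.0
```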

Where can I find advanced tutorials on mocap for Metahuman characters?

There are several excellent resources for deep dives into MetaHuman mocap workflows:

  • Epic Games Official Learning Resources: Start with the Epic Developer Community hub for MetaHumans. Epic has documentation and video tutorials on topics like Animating MetaHumans with Live Link, Using Control Rig, and more. They also have an official course “MetaHumans for Virtual Production” (on the learning portal) which covers capturing performances and using them on MetaHumans. The MetaHuman documentation section on retargeting and facial animation is very useful. Additionally, the new MetaHuman Hub on Epic’s site provides guides and a forum to discuss advanced techniques.
  • Rokoko’s MetaHuman Tutorials: Rokoko has published guides and videos tailored to using their mocap suit with MetaHumans. A notable one is “Guide to Animating MetaHumans in UE4/UE5” on their blog, which covers everything from retargeting to live streaming. They also have a support article specifically on livestreaming to MetaHumans and a YouTube series (e.g., retargeting Mixamo animations to MetaHuman, etc.). Their Ultimate Retargeting Guide (though MetaHuman-specific parts are separate) is great for understanding Unreal’s retargeting in general. Rokoko’s YouTube channel often showcases MetaHuman workflows in their office hours or tutorial uploads.
  • JSFilmz and Other YouTube Creators: YouTube has become a goldmine of MetaHuman mocap content. Creators like JSFilmz, Matt Workman (Cine Tracer), and others have shared their processes. For example, JSFilmz did comparisons between Live Link Face and Faceware with MetaHumans, showing the quality differences, setup, and more. Many Unreal Engine livestreams and Unreal Fest talks dealing with MetaHuman animation are also recorded on YouTube.
  • Faceware and Xsens Webinars: Faceware has documentation and a knowledge base on using their tech with MetaHumans. They might have tutorials on their site or YouTube channel showcasing setup. Xsens has case studies and perhaps webinars (often in collaboration with Unreal Engine) demonstrating full performance capture with MetaHumans. Check Xsens’ official blog or YouTube for any MetaHuman specific content.
  • Blender to MetaHuman workflows: If you’re interested in Blender in the pipeline, search for tools like the MetaHuman Blender DNA plugin – there are tutorials on how to import a MetaHuman into Blender, apply animations (even mocap via Blender’s add-ons like Motion Capture for Blender or Rokoko’s plugin), and send it back to Unreal. While not the most common pro workflow, it’s useful for animators familiar with Blender’s graph editor who want to clean mocap.
  • Community Forums and Discords: The Unreal Engine forums have a section for MetaHumans where users share tips (for example, threads on combining face and body animations). There are also community Discord servers (including an official MetaHuman one via Faceware’s site) where advanced users discuss their techniques. If you have a specific advanced question, these forums are a great place to search – chances are someone has asked it.
  • Virtual Production & VFX Sites: Sites like 80.lv, CGSociety, and befores & afters sometimes have articles or interviews where professionals talk about using MetaHumans and mocap. These can provide insight into how studios approach it (like a breakdown of a short film or an ad that used MetaHuman with mocap).

To get you started, the Epic Dev Community’s MetaHuman section is highly recommended, and Rokoko’s MetaHuman retargeting tutorial is another must-read. Combining knowledge from official docs and community-contributed content will give you a well-rounded, advanced understanding of MetaHuman mocap workflows.


What studios and projects are using high-level mocap with Metahuman in production?

Several studios and projects leverage high-level mocap with MetaHumans for diverse applications:

  • Treehouse Digital – The Well (Short Horror Film): Used Xsens suits for the body and Faceware for the face, driving MetaHuman characters in UE5 to achieve film-quality animation with a small team. Epic Games spotlighted it as a case study for professional workflows.
  • Dimension Studio – Akuma (Real-time Short Film): Collaborated with Vicon, using its optical system to animate MetaHumans and combining high-end mocap with real-time UE5 rendering, showcasing what MetaHumans can do in cinematic VFX and storytelling.
  • Epic Games (3Lateral) – Blue Dot: Used a stereo head-mounted camera with MetaHuman Animator for the face, synced to body mocap via timecode, demonstrating a cutting-edge facial-plus-body pipeline in a high-fidelity cinematic short.
  • Game Cinematics and Trailers: Studios use MetaHumans for cutscene previz and mocap-driven secondary or background characters, which saves production time; the approach draws on the same kind of performance-capture technology seen in titles like Hellblade II.
  • Broadcast and Live Events: MetaHumans appear as digital hosts in live demos and tech events, driven in real time by Xsens suits and iPhone facial capture for interactive broadcast experiences.
  • Independent Filmmakers and YouTubers: Indie shorts such as Nobody pair MetaHumans with mocap suits, and Love, Death & Robots used MetaHuman rigs for background characters, showing how accessible high-quality character animation has become.
  • Training and Simulation: Companies drive MetaHumans with Xsens mocap to create realistic role-players for military, medical, and VR training scenarios, applying capture to task-specific animations.

These projects, from The Well to Blue Dot, demonstrate MetaHumans’ integration with high-end mocap in films, games, and simulations, with growing adoption across industries.


FAQ Questions and Answers

  1. Do I need expensive equipment to animate MetaHumans with mocap?
    High-end gear like Xsens suits and Faceware cameras yields top results, but affordable options like Rokoko suits or iPhone-based Live Link Face work for indie creators. Hardware quality scales results, enabling good MetaHuman animations without a large budget.
  2. Can I use mocap data from Mixamo or other sources on MetaHumans?
    Yes, MetaHumans use Epic’s standard skeleton, allowing retargeting of humanoid animations from Mixamo or Unreal Marketplace via IK Retargeter or Retarget Manager, mapping FBX files to MetaHumans for quick movements.
  3. How can I capture finger movements for my MetaHuman?
    Use mocap gloves like Manus Prime or Rokoko Smartgloves to capture finger motions, integrating with body mocap for full animation. Manual keyframing is an alternative, but gloves ensure realistic MetaHuman hand animations for close-ups.
  4. My MetaHuman’s face looks frozen when I apply body mocap – why?
    Body animation alone doesn’t move the separate MetaHuman face rig. Capture or animate facial data using Live Link Face or animation assets, combining both to ensure the MetaHuman has full body and facial animation.
  5. Is it possible to drive a MetaHuman live, in real-time, for a stream or performance?
    Yes, Live Link enables real-time MetaHuman puppeteering with a mocap suit and iPhone on a helmet, used in virtual events and streams. A powerful computer is needed to process mocap and rendering.
  6. How do I correct the scale if my mocap data makes the MetaHuman slide or float?
    Adjust scale in IK Retargeter for sliding, enabling root translation or scale compensation. For floating, align the root bone height in the retarget pose to match the source’s floor contact, ensuring proper MetaHuman grounding.
  7. Can I edit MetaHuman facial animations outside of Unreal (like in Maya or Blender)?
    Export MetaHuman face rigs via the DNA tool to Maya for tweaking, then import FBX back to Unreal. Blender uses community rigs for similar workflows, enabling advanced facial animation edits for MetaHumans.
  8. What’s the difference between using Live Link (real-time) and importing animation files (offline)?
    Live Link streams real-time animation for previews or live performances, while offline FBX imports allow editing in Sequencer or AnimBlueprint. Live Link can be recorded via Take Recorder for offline polishing.
  9. Can AI motion capture (like recording a video and extracting animation) be used for MetaHumans?
    AI mocap from Move.ai or DeepMotion generates FBX/BVH animations retargetable to MetaHumans, suitable for quick body motions but often requiring cleanup due to artifacts and lacking facial data.
  10. What is the best way to get audio-driven facial animation (lip-sync) on a MetaHuman if I don’t have face mocap?
    Use audio-to-face tools like Audio2Face, Oculus Lipsync, or UE plugins for phoneme poses driven by audio analysis. Manual keyframing or combining automated lipsync with hand-keyframing ensures accurate MetaHuman lip-sync.

Conclusion

A high-level MetaHuman mocap workflow in Unreal Engine 5 integrates advanced motion capture systems, from full-body suits to Live Link Face and Faceware cameras, with UE5's own tools (Live Link, Control Rig, Sequencer, and the MetaHuman framework) to create realistic animations. Creators capture detailed actor performances, retarget them to MetaHumans, and refine them using best practices for syncing and cleanup, blending mocap with hand animation for cinematic or game projects.

Studios leverage this for real-time films and interactive experiences, supported by community knowledge for troubleshooting. Tools like MetaHuman Animator lower barriers, enabling small teams to achieve professional results. A meticulous pipeline ensures believable, emotive MetaHuman performances, directing virtual actors like live ones. These techniques empower creators to push real-time animated storytelling boundaries across games, films, and immersive media.


