ALS Metahuman Motion Matching: How to Combine Realistic Movement and Character Animation in Unreal Engine 5

Combining Advanced Locomotion System (ALS) with MetaHuman characters and motion matching in Unreal Engine 5 enables developers to achieve remarkably realistic character movement in both gameplay and cinematic contexts. This full-length guide explores how to integrate community-supported ALS (like ALSv4) with MetaHumans, and leverage motion matching techniques (using tools such as the Motion Symphony plugin or UE5’s Pose Search system) for smooth, lifelike animation. We’ll cover everything from retargeting animations to MetaHuman rigs, blending motion matching with keyframe animations, handling facial animations, optimizing performance, and troubleshooting common issues. Whether you’re a solo dev or a small studio, this article will help you combine ALS and MetaHumans to create high-quality character movement in Unreal Engine 5. Let’s dive in!

What is ALS motion matching in Unreal Engine 5?

The Advanced Locomotion System (ALS) is an open-source framework for third-person character movement, originally built for Unreal Engine 4 and updated for UE5 by the community. It employs traditional animation techniques like blend spaces, state transitions, and inverse kinematics (IK) for responsive locomotion, such as running and jumping.

Motion matching, a dynamic animation technique, selects the most suitable pose from a large motion database based on real-time parameters like bone positions and velocities, enabled in UE5 through the Pose Search plugin. ALS motion matching integrates ALS’s character movement logic with a motion matching node, replacing canned transitions with data-driven pose selection for fluid, realistic animations. This combination simplifies animation blueprints, as seen in UE5.4’s experimental motion matching sample, which reduces locomotion to a single node. Developers note enhanced animation fidelity, with seamless handling of starts, stops, and turns, rivaling AAA game quality.

This approach pairs ALS’s robust input handling and character states with motion matching’s ability to dynamically choose animations, ideal for MetaHumans in UE5. It eliminates the need for explicit blend logic, making character movements more natural and reactive to gameplay. The system leverages a rich motion dataset to ensure smooth transitions, particularly for complex maneuvers. By combining ALS’s foundation with motion matching, developers achieve high-fidelity character control with minimal manual state machine setup, streamlining workflows for realistic MetaHuman animations.
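
To make the selection step concrete, below is a minimal, self-contained C++ sketch of the core idea: score every candidate pose in a database against the character's current features (pose plus desired trajectory) and jump to the best match. This is a conceptual illustration under an assumed feature layout and weighting, not the Pose Search plugin's actual API.

```cpp
#include <cstddef>
#include <limits>
#include <vector>

// Hypothetical feature vector: a few key bone positions/velocities plus
// future trajectory samples, flattened into floats. Layout is illustrative.
struct PoseFeatures {
    std::vector<float> Values;
};

struct DatabaseEntry {
    PoseFeatures Features;
    int ClipIndex = 0;   // which animation clip this pose came from
    float ClipTime = 0;  // time within that clip
};

// Weighted squared distance. Higher trajectory weights favor responsiveness,
// higher pose weights favor continuity/realism: the usual tuning trade-off.
float Cost(const PoseFeatures& A, const PoseFeatures& B,
           const std::vector<float>& Weights)
{
    float Sum = 0.f;
    for (size_t i = 0; i < A.Values.size(); ++i) {
        const float D = A.Values[i] - B.Values[i];
        Sum += Weights[i] * D * D;
    }
    return Sum;
}

// Brute-force search; production systems accelerate this with indexing
// structures, but the selected result is the same.
const DatabaseEntry* FindBestMatch(const PoseFeatures& Query,
                                   const std::vector<DatabaseEntry>& Database,
                                   const std::vector<float>& Weights)
{
    const DatabaseEntry* Best = nullptr;
    float BestCost = std::numeric_limits<float>::max();
    for (const DatabaseEntry& Entry : Database) {
        const float C = Cost(Query, Entry.Features, Weights);
        if (C < BestCost) { BestCost = C; Best = &Entry; }
    }
    return Best; // caller blends briefly into (ClipIndex, ClipTime)
}
```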

Which version of ALS supports motion matching for MetaHuman characters?

ALS v4 (Advanced Locomotion System Version 4) is the latest version. It has no native motion matching, relying instead on traditional animation blueprints developed before motion matching became prevalent in Unreal Engine. The ALS v4 Community edition, a C++-optimized fork, is updated for Unreal Engine 5 (up to UE 5.4) and is the recommended base for integrating motion matching. No official ALS v5 exists, though some community projects use the term; the ALS v4 Community edition is the standard for UE5. The ALS community maintainer notes its reliance on outdated pre-UE5 techniques, suggesting alternatives like Epic's Lyra. Motion matching is not a built-in ALS feature, so enabling it requires external solutions via plugin integration.

For motion matching, developers use plugins like Motion Symphony, compatible with UE 4.26 to 5.x, which integrates modularly with ALS’s animation graph for locomotion states. Alternatively, UE5.4’s native Pose Search plugin offers a free, engine-native motion matching system with a sample project of 500+ animations, though it requires manual ALS blueprint modifications. Both Motion Symphony and Pose Search support MetaHumans via skeleton retargeting, with community reports praising UE5.4’s responsiveness. ALS v4 (community edition) serves as the base, requiring additional tools like Motion Symphony or Pose Search to enable motion matching for enhanced MetaHuman animations.

Can you use ALS with MetaHuman characters?

MetaHumans can be integrated with the Advanced Locomotion System (ALS) in Unreal Engine 5, allowing developers to replace the default ALS mannequin with a high-fidelity MetaHuman character for enhanced realism. The MetaHuman's body rig is compatible with the Epic mannequin skeleton, enabling animation retargeting despite additional bones. While not plug-and-play, the process is well documented, with community tutorials and Epic's forums detailing blueprint setups and retargeting steps. Using UE5's IK Retargeter, developers map the MetaHuman skeleton to the ALS mannequin, enabling movements like running and jumping. The ALS Community plugin doesn't natively support MetaHumans, so some integration work is required, but once set up it delivers ALS's robust locomotion with MetaHuman visuals.

Performance is a consideration due to MetaHumans’ complex skeletons, particularly the face, which is heavier than the mannequin’s. Community resources, such as Kitatus’s tutorial, demonstrate MetaHumans performing ALS v4 animations, including advanced features like mantling and rolling. Runtime or offline retargeting approaches hook MetaHumans into ALS, ensuring compatibility. This integration is common for creating realistic hero characters in games or cinematics, combining MetaHuman’s visual fidelity with ALS’s proven movement system. The result is a seamless blend of interactive locomotion and high-quality character models.

What are the requirements for using motion matching with ALS and MetaHuman?

To use motion matching with ALS and MetaHumans, specific software, assets, and configurations are needed:

  • Unreal Engine 5 with the appropriate plugins: Use UE5.4 for the Pose Search plugin or earlier versions with Motion Symphony 2.0 for motion matching capabilities. Ensure the ALS Community C++ plugin is integrated, requiring a C++ project for compilation. Blueprint-only projects need plugin compilation to enable ALS functionality. These components establish the foundation for motion matching in UE5 (a sample plugin configuration follows this list).
  • A MetaHuman character imported and ready: Import a MetaHuman via Quixel Bridge or the MetaHuman plugin, ensuring skeletal mesh, skeleton asset, and animation blueprints are functional. Verify body and face components work correctly in UE5 scenes. This setup prepares the MetaHuman for integration with ALS systems. Proper configuration ensures seamless animation compatibility.
  • Animation data with root motion: A diverse set of root-motion animations, including walks, runs, and turns, is essential for motion matching. Animations must contain root translation and rotation curves for trajectory calculations. In-place animations are unsuitable without conversion to root motion. High-quality root-motion data drives fluid motion matching outcomes.
  • Good coverage and continuity of motions: A comprehensive motion database with varied movements like jogging and slow turns prevents animation gaps. Gaps cause unnatural snapping during transitions. Motion capture data ensures continuity and realism in motion matching. Broad coverage is critical for smooth, natural character animations.
  • An animation retargeting setup (IK Rigs): Create IK Rigs for ALS mannequin and MetaHuman skeletons, mapped via an IK Retargeter to handle bone differences. This ensures accurate animation transfer to MetaHuman proportions. Retargeting prevents distortions during animation playback. Proper mapping maintains visual and functional integrity.
  • Adequate hardware and performance budget: Motion matching’s CPU-intensive database searches require optimization, especially with large animation sets. MetaHumans’ high polycount demands performance headroom or LODs for real-time applications. PC/console platforms handle this better than mobile. Hardware planning ensures smooth performance under load.
  • Time and knowledge to integrate: Motion matching setup demands animation expertise to configure plugins, tag animations, and debug MetaHuman mesh alignment. A learning curve exists for integrating ALS and motion matching systems. Dedicated effort is needed for high-fidelity results. Time investment ensures seamless system interoperability.
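
For reference, plugins are enabled per project in the .uproject file (or via Edit → Plugins in the editor). A minimal sketch follows, assuming UE 5.4 plugin names; verify the names against your engine version, and note that installed marketplace plugins such as Motion Symphony, or the ALS Community fork, are enabled the same way:

```json
{
  "Plugins": [
    { "Name": "PoseSearch",       "Enabled": true },
    { "Name": "MotionTrajectory", "Enabled": true },
    { "Name": "IKRig",            "Enabled": true }
  ]
}
```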

How do I integrate ALS with a custom MetaHuman in UE5?

  • Install ALS Community and import your MetaHuman: Add the ALS Community plugin and verify its functionality with the default character in a test level. Import the MetaHuman via Quixel Bridge, ensuring its blueprint with Body and Face components is functional. This step establishes the foundation for combining ALS and MetaHuman systems. Testing ensures both components work independently before integration.
  • Set up an IK Retargeter for ALS to MetaHuman: Create an IK Retargeter mapping the ALS mannequin to the MetaHuman skeleton, adjusting for extra bones like metacarpals to prevent finger warping. Ensure accurate chain mappings for spine, arms, and legs, previewing with an ALS animation. This setup enables precise animation transfer. Proper configuration avoids common retargeting issues like distorted poses.
  • Duplicate the ALS character blueprint: Duplicate the ALS blueprint, replacing the mannequin mesh with MetaHuman’s Body and Face components, mirroring the MetaHuman blueprint’s hierarchy. Ensure transforms align to maintain animation integrity. This replaces the visual character while preserving ALS logic. The setup ensures the MetaHuman is the visible entity in-game.
  • Apply a retargeting solution at runtime: Use a Retarget Pose From Mesh node in a new MetaHuman Animation Blueprint, referencing the ALS-to-MetaHuman IK Retargeter. The hidden ALS mannequin runs ALS animations, retargeted live to the MetaHuman, leveraging existing logic. This approach simplifies updates to ALS animations. Runtime retargeting enhances workflow efficiency.
  • Link the MetaHuman components to follow the animation: In the blueprint’s Construction Script, use Set Leader Pose Component to make the Body mesh drive Face and other parts, ensuring unified movement. This mirrors MetaHuman’s default setup, preventing mesh desync. The face follows the head bone accurately. Proper linking maintains visual coherence across components.
  • Hide or show meshes appropriately: Hide the ALS mannequin’s visuals using transparency or scaling, ensuring its animation ticking remains active to support retargeting. Set the MetaHuman mesh as visible with no collision. This keeps the retargeting functional while showing only the MetaHuman. Visibility management prevents animation disruptions. A C++ sketch of this component setup follows below.
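
The following is a hedged C++ sketch of the component layout these steps describe; the class and member names (AMyALSMetaHumanCharacter, BodyMesh, FaceMesh) are placeholders, the member declarations live in an assumed header, and the same result is achievable purely in Blueprint:

```cpp
#include "MyALSMetaHumanCharacter.h"          // hypothetical; declares BodyMesh/FaceMesh
#include "Components/SkeletalMeshComponent.h"

AMyALSMetaHumanCharacter::AMyALSMetaHumanCharacter()
{
    // GetMesh() stays the ALS mannequin running the ALS AnimBP; it is hidden
    // later (see the common-issues section) but must keep ticking.

    // MetaHuman visual meshes, mirroring the MetaHuman blueprint hierarchy.
    BodyMesh = CreateDefaultSubobject<USkeletalMeshComponent>(TEXT("Body"));
    BodyMesh->SetupAttachment(GetMesh());
    BodyMesh->SetCollisionEnabled(ECollisionEnabled::NoCollision); // capsule handles collision

    FaceMesh = CreateDefaultSubobject<USkeletalMeshComponent>(TEXT("Face"));
    FaceMesh->SetupAttachment(BodyMesh);
    FaceMesh->SetCollisionEnabled(ECollisionEnabled::NoCollision);
}

void AMyALSMetaHumanCharacter::OnConstruction(const FTransform& Transform)
{
    Super::OnConstruction(Transform);

    // Construction Script equivalent: the Face (plus any hair/cloth meshes)
    // copies the Body's pose, mirroring the stock MetaHuman blueprint.
    FaceMesh->SetLeaderPoseComponent(BodyMesh);
}
```

Attaching the Body under the hidden mannequin keeps all meshes aligned at the origin, as the steps above recommend.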

How do I retarget ALS animations to a MetaHuman rig?

  • Create IK Rig assets: Generate IK Rigs for the ALS mannequin and MetaHuman skeletons, defining bone chains for limbs, spine, root, and optionally hand/foot goals. Use existing rigs if available to save time. This prepares both skeletons for accurate retargeting. Rigs ensure proper bone hierarchy recognition.
  • Create an IK Retargeter asset: Set the ALS rig as source and MetaHuman rig as target in a new IK Retargeter, previewing with an ALS animation like walking. The editor displays skeletons for mapping verification. This establishes the retargeting framework for pose transfer. Previewing ensures initial alignment accuracy.
  • Adjust bone chain mapping: Exclude MetaHuman’s metacarpal bones in hand mappings to prevent finger twisting, ensuring root and twist bones interpolate correctly. Align skeletons in similar poses (e.g., A-pose) for consistency. Proper mapping avoids distortions like warped hands, and root motion settings preserve locomotion integrity.
    • MetaHumans have metacarpal bones at the base of each finger that the UE4 mannequin lacks. If the retargeter maps the mannequin’s hand chain to include them, you can get twisted fingers; the fix is to remove or ignore them in the mapping, for example by setting the MetaHuman’s LeftThumbMetacarpal (and its siblings) to None so only the relevant parts are mapped.
    • MetaHumans also have more bones overall (UpLegTwist and various twist bones for the arms, for instance). Typically the IK Retargeter simply follows the chain hierarchy; as long as the root and end effectors are mapped, twist bones interpolate. Make sure the root (e.g., pelvis or root) is mapped to root, and that the root translation settings are appropriate (often you keep root motion, especially when using motion matching with root motion).
  • Export animations (optional): Export retargeted ALS animations as MetaHuman assets for a custom Animation Blueprint, replicating ALS’s state machine logic, curves, and notifies. This creates standalone animations but is labor-intensive. It suits workflows needing specific MetaHuman animations. Manual export ensures full control over assets.
  • Use retargeting at runtime (alternative): Apply the IK Retargeter in a Retarget Pose From Mesh node for live pose transfer, referencing it in the MetaHuman’s AnimBP. This avoids exporting animations, streamlining updates with ALS changes. Runtime retargeting enhances efficiency. It leverages existing ALS animation logic.
  • Retarget root motion and IK: Enable root motion transfer in the Retargeter for motion matching animations, focusing on main skeletal poses. Reapply ALS’s IK logic via AnimBP or Control Rig for foot placement, ensuring grounding. This maintains accurate movement. IK adjustments enhance environmental interaction.

Is there a blueprint setup for combining ALS and MetaHuman motion?

Yes. Besides the asset retargeting, you need to configure the character blueprint to properly combine ALS with the MetaHuman. The main blueprint work involves attaching the MetaHuman skeletal mesh to the ALS character and ensuring the animations propagate. Here are the blueprint steps in detail:

  • Character Blueprint Components: Add MetaHuman’s Body and Face skeletal meshes as components in the duplicated ALS blueprint, mirroring the MetaHuman’s setup with components like Hair or Teeth. Keep the hidden mannequin Mesh to run ALS animations, aligning all at origin. This ensures MetaHuman visuals replace the mannequin while preserving ALS functionality. The component structure maintains animation integrity for seamless integration.
  • Animation Blueprint assignment: Assign a new MetaHuman Animation Blueprint with a Retarget Pose From Mesh node to the Body mesh, while the mannequin retains the ALS AnimBP. This separates MetaHuman retargeting from ALS animation logic, ensuring proper pose transfer. The setup prevents animation conflicts between systems. It enables the MetaHuman to mirror ALS poses accurately.
  • Blueprint Event Graph (if needed): In the MetaHuman AnimBP’s Event Initialize, cast to the pawn owner to reference the mannequin mesh, storing it as a variable for the Retarget Pose node. This automates source mesh setup, avoiding additional blueprint code. The approach minimizes complexity in the character blueprint. It ensures reliable pose transfer across frames (a C++ equivalent is sketched after this list).
  • Construction Script – Leader Pose: Use Set Leader Pose Component in the Construction Script to make the Body mesh drive Face, Hair, and other components, ensuring unified animation. This mirrors MetaHuman’s default blueprint, preventing mesh desync. The face follows the head bone accurately. Proper linking maintains visual coherence across all components.
  • Optional – Copy Curves or other data: Enable curve copying in the Retargeter to transfer ALS animation curves, like footstep triggers, to the MetaHuman AnimBP for features like sound syncing. This enhances functionality without altering ALS logic. Curve transfer supports interactive elements. It maintains consistency with ALS-driven behaviors.
  • Camera and Input: Retain ALS’s camera and input logic, adjusting the camera’s spring arm socket to the MetaHuman’s head bone if necessary. Ensure alignment with MetaHuman proportions to avoid visual mismatches. This preserves player control consistency across setups. Camera adjustments ensure seamless integration with the new character model.
  • Testing Blueprint Setup: Test in the editor for animation issues like T-posing, verifying the Retarget Pose node references a valid mannequin mesh. Check root motion to prevent sliding, debugging blueprint or retargeting errors. Testing ensures the setup functions as intended. Fixes address common integration issues promptly.
  • Final adjustments: Hide the mannequin’s visuals using transparency or scaling, ensuring it ticks for animation, and disable collision on the MetaHuman mesh. This keeps retargeting active while displaying only the MetaHuman. Final tweaks optimize visuals for gameplay. Collision relies on the capsule for efficiency.
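
Here is a hedged C++ equivalent of the Event Initialize cast described above, assuming a custom AnimInstance class for the MetaHuman Body mesh (class and variable names are placeholders); in Blueprint this is simply a Cast node feeding the variable that the Retarget Pose From Mesh node reads:

```cpp
#include "MetaHumanBodyAnimInstance.h"   // hypothetical; declares SourceMannequinMesh
#include "GameFramework/Character.h"

void UMetaHumanBodyAnimInstance::NativeInitializeAnimation()
{
    Super::NativeInitializeAnimation();

    // Event Initialize equivalent: cache the hidden ALS mannequin so the
    // Retarget Pose From Mesh node (configured in this AnimBP's graph) can
    // read its pose every frame.
    if (const ACharacter* OwningCharacter = Cast<ACharacter>(TryGetPawnOwner()))
    {
        SourceMannequinMesh = OwningCharacter->GetMesh();
    }
}
```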

What are common issues when using ALS with MetaHuman and how to fix them?

  • Warped or twisted fingers: Incorrect metacarpal bone mappings in the IK Retargeter cause finger distortions during animations. Edit the Retargeter’s hand chain mappings, setting metacarpal entries to None to prevent erroneous rotations. This corrects finger poses for natural hand animations. The fix ensures accurate retargeting of hand movements.
  • Feet sliding or misalignment: Skeletal differences cause foot sliding despite ALS’s foot IK system. Ensure matching retarget poses and apply additional IK via Control Rig or MetaHuman AnimBP to pin feet. Tuning IK settings or adding root motion warping resolves sliding issues. Enhanced foot placement improves grounding on terrain.
  • Clothing or accessories not following correctly: Clothing meshes desync if omitted from leader pose targets or affected by cloth simulation errors. Include clothing in Set Leader Pose Component targets and disable physics initially for testing. Verify retargeting for extra bones like coat tails. This ensures clothing moves in sync with the body.
    • Ensure that any additional skeletal mesh (coat, etc.) is included in the Set Leader Pose Component targets so it follows the body pose. Otherwise it will lag or move independently during animations, breaking visual coherence across character components.
    • If cloth simulation is on, consider turning it off (paint zero weight) for initial tests. It is easier to verify the animation without physics-related artifacts, then re-enable cloth sim once the behavior is stable.
    • Double-check retargeting of any bones the cloth uses. If the coat has extra bones (a tail, etc.), they may need a dedicated retarget setup or may be driven purely by physics; verifying this keeps the coat animating as intended.
  • MetaHuman face not moving with head or not animating: Face mesh desync occurs if not attached to the head bone or excluded from leader pose targets. Attach the face to the head bone socket and include it in leader pose targets. Ensure the face AnimBP supports expressions to avoid static faces. This maintains face-body synchronization during animations.
  • Character mesh disappearing or T-posing in game: Fully hiding the mannequin mesh stops the Retarget Pose From Mesh node, causing T-posing or frozen animations. Use transparency or scaling instead of Hidden In Game to keep animation ticking active. This ensures retargeting continues without visual interference. Visibility workarounds prevent animation disruptions (see the sketch after this list).
  • Performance hitches or high CPU usage: Large motion databases or ALS blueprint ticks cause performance issues in complex scenes. Optimize Motion Symphony’s search settings or simplify unnecessary ALS features like layering. Use the C++ ALS version for better efficiency. Profiling identifies bottlenecks for targeted performance improvements.
  • Bone mismatch or animation offset: Root bone scaling or pose mismatches cause the MetaHuman to hover or sink slightly. Adjust the Retargeter’s root translation to Globally Scaled or Absolute, tweaking IK offsets for feet. This aligns the character with the ground accurately. Proper scaling corrects positional discrepancies.
  • ALS specific actions not working: MetaHuman proportions may disrupt ALS’s vault or mantle traces due to height differences. Adjust ALS config settings like capsule height or trace lengths to match MetaHuman size. Ensure curves and notifies transfer during retargeting. This restores functionality for ALS-specific actions.
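
For the disappearing/T-posing issue, an alternative to the transparency or scaling workaround is the skeletal mesh component's anim tick policy, which lets a fully hidden mesh keep evaluating its pose. A minimal sketch, where the component name is a placeholder:

```cpp
// MannequinMesh is a placeholder for the hidden ALS skeletal mesh component.
// By default, meshes only tick their pose while rendered, so hiding the
// mannequin starves Retarget Pose From Mesh; this keeps it animating.
MannequinMesh->SetVisibility(false);
MannequinMesh->VisibilityBasedAnimTickOption =
    EVisibilityBasedAnimTickOption::AlwaysTickPoseAndRefreshBones;
```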

Does ALS support facial animation and blendshapes in MetaHuman?

By default, ALS (Advanced Locomotion System) deals with body locomotion animations and does not include any facial animation logic. ALS doesn’t know about MetaHuman-specific features like ARKit blendshapes or the facial control rig, because it was designed for the mannequin which has no face rig. However, you can use MetaHuman facial animations in parallel with ALS – it just requires treating the face separately.

Here’s how it breaks down:

  • Body vs Face: The MetaHuman’s body skeletal mesh is controlled by ALS for locomotion, while the face mesh, attached to the head bone, supports separate facial animations. These systems run independently, allowing simultaneous body movement and facial expressions. ALS focuses on body bones, leaving facial animations unaffected. This separation enables flexible animation layering for complete character performances.
  • Using Face Animation with ALS: Real-time facial animations via Live Link Face or Faceware drive MetaHuman blendshapes concurrently with ALS body movements. These facial mocap solutions operate without conflicting with ALS’s body animation logic. This allows expressive characters during gameplay with ALS-driven locomotion. Integration of facial capture enhances MetaHuman realism seamlessly.
  • Triggering facial animations in gameplay: Facial Animation Montages or a separate face AnimBP can trigger expressions like smiles during gameplay, layering over ALS body animations. This approach maintains animation independence for body and face. It supports dynamic emotional responses in interactive scenarios. Montages enable context-specific facial performances (a minimal snippet follows this list).
  • Blendshapes and ALS Animations: ALS body animations lack facial movement, requiring separate face animations for actions like blinking. Use Animation Blueprint layering with a Linked Anim Graph to apply blendshapes for facial expressions. This keeps body and face animations distinct and compatible. Layering ensures comprehensive character animation with expressive faces.
  • Bone-driven facial bones: MetaHuman’s facial bones, like jaw and eyes, may need baseline animation to avoid drift, managed by the default face AnimBP. Retain this AnimBP unless explicitly animating facial bones to maintain stability. This ensures consistent facial positioning during locomotion. It supports natural face behavior in all scenarios.
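
As a concrete example of montage-based triggering, here is a minimal hedged sketch; FaceMesh and SmileMontage are placeholders, and it assumes the Face keeps its own AnimBP (a mesh driven only by Set Leader Pose Component won't evaluate its own montages):

```cpp
// Plays a facial montage on the Face mesh while ALS keeps driving the body.
// SmileMontage must be authored against the MetaHuman face skeleton.
void AMyCharacter::PlayFacialExpression()
{
    if (UAnimInstance* FaceAnim = FaceMesh->GetAnimInstance())
    {
        FaceAnim->Montage_Play(SmileMontage, 1.0f);
    }
}
```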

ALS allows parallel MetaHuman facial animations through blendshapes or joints, driven by systems like Live Link or Sequencer, complementing ALS’s body locomotion for cinematic and gameplay performances. The layered approach combines procedural body motion with authored facial expressions, enhancing storytelling and character realism.

How do I import motion capture data into ALS for MetaHuman use?

To import motion capture (mocap) data for ALS and MetaHuman use, follow these steps:

  • Importing new animations: Import FBX mocap files targeting the ALS mannequin skeleton, retargeting from other skeletons if necessary using the IK Retargeter. Clean data in tools like MotionBuilder to remove noise or sliding for optimal quality. This ensures animations are compatible with ALS’s system. Proper import prepares mocap for seamless integration.
  • Adding to ALS AnimBP or Motion Matching database: Incorporate mocap into ALS’s blendspaces for specific states or add to a motion matching database for dynamic selection. For motion matching, include animations in Motion Symphony or Pose Search assets with appropriate tags. This enriches the animation pool for varied movements. Mocap enhances realism in both ALS and motion matching workflows.
    • If sticking with ALS’s state machine: For example, you captured a new set of vault animations or a new idle variation. You can incorporate those by adding them to ALS’s blendspaces or state transitions. ALS has a complex AnimBP, so you’d find the relevant state (e.g., “Idle” state) and either replace or add the mocap animation (perhaps via a randomized sequence player or blend). This requires some understanding of ALS’s anim graph.
    • If using motion matching: This is where mocap really shines. You can import a large motion file (or many files) and add them to the Motion Matching database. For Motion Symphony, you’d use a Motion Data asset and add your animations to it. You’d also configure tags or features if needed (like “walking”, “running” tags or trajectory points).
  • Retarget to MetaHuman: Use runtime retargeting to apply mannequin-based mocap to the MetaHuman via the Retarget Pose node, leveraging ALS integration. Alternatively, retarget mocap to the MetaHuman skeleton for cinematic use without ALS. This ensures mocap drives MetaHuman visuals accurately. Retargeting maintains animation fidelity across skeletons.
  • Cleaning and trimming mocap: Clean raw mocap in MotionBuilder, Blender, or Unreal’s Control Rig to fix issues like foot sliding or jitter. Split clips to exclude bad frames, ensuring clean input for motion matching. High-quality input leads to smooth animation output. Proper cleanup is essential for professional results.
  • Using Live mocap directly: Record live mocap via Live Link into Anim Sequences using Take Recorder, treating them as imported animations for ALS or motion matching. This supports real-time puppeteering or recording for later use. Live capture enables dynamic animation workflows. Recorded assets integrate like standard FBX files.
  • Sequencer cinematic use: Use mocap in Sequencer for cinematic scenes, layering over ALS-driven characters or triggering as montages for specific actions. This mixes procedural locomotion with authored animations for narrative control. Sequencer enhances cinematic precision and flexibility. It supports complex, story-driven sequences.

Mocap data, preferably root-motion based, enhances ALS or motion matching animations, with runtime retargeting ensuring MetaHuman compatibility. Well-cleaned and integrated mocap delivers highly realistic, unique movements for both gameplay and cinematic applications.

What is the best workflow for motion matching MetaHumans using ALS?

Here’s a recommended workflow that covers real-time gameplay and cinematic considerations:

  • Start with ALS for baseline locomotion: Set up ALS Community and integrate a MetaHuman using IK Retargeter and blueprint configurations for robust third-person movement. Test basic locomotion to ensure stability and functionality in gameplay scenarios. This establishes a reliable foundation for character control. ALS’s mechanics provide consistent player interactions across levels.
  • Decide on motion matching integration approach: Replace ALS’s locomotion state machine with a Motion Matching node from Motion Symphony or UE5.4’s Pose Search, using character velocity and trajectory inputs. Integrate with ALS’s speed and direction parameters for seamless control. This enhances animation fluidity with minimal blueprint changes. Motion matching delivers dynamic, realistic locomotion effortlessly.
  • Build a rich animation dataset: Collect diverse root-motion animations, such as mocap or Epic’s 500+ locomotion clips, for walks, runs, and turns. Tag and preprocess animations for Motion Symphony or Pose Search databases, ensuring granular clips for smooth transitions. A comprehensive dataset prevents animation gaps. It drives natural, varied character movements.
  • Iterate on motion matching parameters: Tune motion matching weights for responsiveness versus realism, using debug visualization to monitor clip selection and adjust pose matching tolerance. This prevents snapping during transitions for a polished player experience. Iterative tuning optimizes animation behavior. Debugging ensures seamless, natural motion transitions.
  • Keep ALS features that add realism: Retain ALS’s foot IK for terrain adaptation, integrating with motion matching via Full Body IK nodes to correct sliding on slopes. Evaluate keeping ALS montages for actions like mantling to combine procedural and data-driven animations. This enhances environmental interaction. ALS features complement motion matching’s fluidity.
  • Facial and cinematic layers: Use ALS and motion matching for gameplay movement, layering cinematic animations in Sequencer for precise head turns or gestures. Apply facial mocap separately for expressive performances, saving time while polishing narrative details. This hybrid approach enhances storytelling. Sequencer refines cinematic quality efficiently.
  • Testing in gameplay scenarios: Playtest motion matching for input responsiveness and transition smoothness, adding animations or tweaking parameters to fix issues like snapping. Ensure consistent player control across various gameplay scenarios. Testing identifies and resolves animation gaps. Adjustments guarantee fluid, intuitive character responses.
  • Optimization pass: Profile the AnimGraph to reduce CPU load, lowering motion matching search frequency or using LODs for distant MetaHumans. Simplify unused ALS features to boost efficiency while maintaining animation quality. Optimization ensures smooth real-time performance. Profiling targets bottlenecks for streamlined execution (basic stat commands follow this list).
  • Iteration and polish: Blend motion matching with handcrafted montages for specific actions, using pose matching for smooth transitions to avoid jarring shifts. Refine animations iteratively for edge cases like object interactions. This balances procedural and authored animations effectively. Polish creates a cohesive, high-fidelity animation system.
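
For the optimization pass, Unreal's built-in stat console commands are a quick first look at where frame time goes before reaching for the full profiler:

```
stat fps     -- frame rate and frame time overview
stat unit    -- game / render / GPU thread breakdown
stat anim    -- skeletal mesh animation evaluation costs
```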

Can you blend motion matching with keyframe animation in ALS?

Yes, blending motion matching with keyframed animations in ALS is feasible and often necessary to cover diverse gameplay scenarios, combining motion matching’s fluid locomotion with specific, authored actions. This approach uses Unreal’s animation tools to ensure seamless transitions, leveraging ALS’s existing framework for layered blending and montages.

Here are some ways blending occurs:

  • Montages for specific actions: ALS montages for actions like rolling or mantling can override motion matching, pausing it to play the montage and resuming with pose matching enabled. Enable pose warping on montages to blend back to motion matching smoothly. This ensures specific actions integrate seamlessly. Montages handle well-defined gameplay events without disrupting locomotion (a code sketch follows this list).
  • Layered blends (upper body/lower body): Apply keyframed upper-body animations, like aiming, over motion-matched leg locomotion using Layered blend per bone, masking to spine bones upward. This maintains fluid locomotion while executing authored upper-body actions. Layered blends support complex behaviors like combat. The approach ensures animation coherence across body parts.
  • State machine fallback: Use motion matching for locomotion but switch to traditional state machines for states like ladder climbing, toggling systems in the AnimBP based on state. This combines procedural motion with fixed animations for specialized movements. State machines handle unique scenarios effectively. The setup maintains animation system flexibility.
  • Inertial blending: Use Unreal’s Inertialization or Motion Symphony’s snapshot nodes to smooth transitions from motion matching to keyframed animations, interpolating poses to reduce snapping. This enhances continuity when switching animation types. Inertial blending improves visual coherence. It ensures seamless shifts for a polished experience.
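
A minimal sketch of the montage-override pattern from the first bullet above, assuming an ALS-style character, a RollMontage asset, and a slot node placed after the Motion Matching node in the AnimBP (all names are placeholders):

```cpp
void AMyCharacter::PerformRoll()
{
    UAnimInstance* AnimInst = GetMesh() ? GetMesh()->GetAnimInstance() : nullptr;
    if (AnimInst && !AnimInst->IsAnyMontagePlaying())
    {
        // While the montage plays, the slot takes over; when it ends, the
        // graph blends back to the Motion Matching output (Inertialization
        // smooths the hand-off).
        AnimInst->Montage_Play(RollMontage, 1.0f);
    }
}
```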

How do I use Control Rig with ALS and MetaHuman in UE5?

Control Rig enhances ALS and MetaHuman animations in UE5 by enabling runtime procedural adjustments or editor-time animation authoring in Sequencer. For runtime, embed Control Rig nodes in Animation Blueprints post-ALS pose calculation to modify poses, like adjusting arms for aiming, using Epic’s sample MetaHuman Control Rig.

Control Rig can be integrated at multiple stages:

  • For gameplay, to add procedural animation on top of ALS (like better aiming, reaching, or adjusting animations to environment): Control Rig nodes in the Animation Blueprint can dynamically adjust MetaHuman poses, such as directing arms toward a target during gameplay, enhancing ALS’s locomotion. Using Epic’s sample MetaHuman Control Rig, you can customize lightweight IK solutions, like Two-Bone IK, for specific tasks without overloading performance. Inputs like target transforms are exposed via AnimBP, enabling real-time procedural effects. This approach ensures seamless integration with ALS’s motion, adding responsive, environment-aware animations.
  • For cinematics and polish, to edit animations (body or face) for MetaHumans after using ALS and motion matching to get the base movement: In Sequencer, Control Rig tracks allow precise tweaks to baked ALS animations, such as adjusting arm gestures or foot placement for cinematic scenes. This method refines MetaHuman performances for story-driven moments, overlaying custom animations like waves or spine adjustments. It leverages ALS’s base motion while enabling artistic control for polished cutscenes. Small studios benefit from this efficient workflow, enhancing narrative sequences without starting from scratch.

The use of Control Rig is not required for ALS+MetaHuman, but it’s a great tool in UE5 to have in your arsenal. Control Rig enables animators to treat MetaHumans like digital puppets, with ALS and motion matching handling core movement, ideal for cinematic fine-tuning.

How do I optimize MetaHuman performance with ALS and motion matching?

MetaHumans, ALS, and motion matching demand optimization to maintain frame rates in UE5. Key strategies include leveraging LODs, reducing bone counts, and simplifying ALS and motion matching logic. Adjusting MetaHuman quality settings, disabling costly features like hair physics, and using threading improve performance. For crowds, simpler skeletons or tick disabling for off-screen characters help, while profiling on target hardware ensures bottlenecks are addressed.

Here are strategies to optimize:

  • Use LODs and quality settings for MetaHuman: Configure the LODSync component in the MetaHuman blueprint to enforce lower LODs, like LOD 1 or 2, reducing poly count and active morph targets significantly. This maintains visual quality for gameplay while cutting performance costs, as LOD 0 is overly detailed. Setting Forced LOD in the blueprint optimizes body and face rendering efficiently. Medium or Low quality MetaHumans further simplify assets for less demanding scenes (see the sketch after this list).
  • Reduce bone count if possible: MetaHumans’ high bone count, especially facial bones, impacts performance due to animation calculations, even at lower LODs. Deactivating unnecessary bones, like facial rigs for non-animated faces, can help, though it’s complex and unsupported officially. Sticking to LOD optimizations is practical when facial animation is needed. This reduces computational overhead without compromising essential visuals.
  • Optimize ALS blueprint and tick: Use the C++ version of ALS to reduce tick costs compared to the blueprint version, and disable unneeded features like detailed footsteps. Avoid expensive logic for distant or AI characters to save resources. Profiling helps identify and trim redundant ALS calculations. This ensures efficient performance for controlled characters while maintaining core functionality.
  • Motion matching optimization: Tune Motion Symphony’s update frequency, such as 30fps instead of 120fps, to reduce overhead, and use animation buckets for efficiency. Ensure the Pose Search database avoids debug data in shipping builds for optimal performance. Follow plugin documentation to leverage built-in optimization settings effectively. This balances motion matching’s quality with computational demands for smooth gameplay.
  • Limit MetaHuman costly features: Disable dynamic groom physics or use simpler hair styles, like card hair, to lower tick costs, and avoid ray tracing for materials unless essential. Lowering physics tick rates for hair or cloth further optimizes performance. These adjustments reduce GPU and CPU demands significantly. Simpler assets maintain gameplay fluidity without sacrificing core visuals.
  • Animation budget considerations: Limit motion matching to key characters, using simpler animation systems for NPCs to avoid overloading with multiple characters. Apply network relevance to prioritize full motion matching for nearby characters only. Reduced motion datasets for distant NPCs help manage performance. This ensures efficient resource allocation across character types in crowded scenes.
  • Threading: Enable “Allow MultiThreaded Animation Update” in project settings to offload animation tasks, like AnimBP and motion matching queries, to worker threads. This reduces game thread bottlenecks, enhancing overall performance. It leverages UE5’s multithreading capabilities effectively. Animation processing runs smoothly, supporting real-time demands.
  • Profile on target hardware: Use Unreal’s Profiler or STAT commands on target platforms to identify performance issues, such as skinning or animation thread costs. Enable GPU Skinning to offload skeletal mesh processing to the GPU, which is typically default in UE5. This ensures optimizations target specific hardware limitations. Profiling guides precise adjustments for optimal frame rates.
  • Consider simpler skeleton for crowds: Use simplified skeletons without facial rigs for non-hero MetaHumans in crowds, as seen in Epic’s Matrix demo, to reduce bone processing. High bone counts strain performance at scale, even with LODs. This approach prioritizes hero characters’ detail while optimizing crowd performance. Simpler rigs maintain visual coherence with lower costs.
  • Tick disabling: Disable AnimBP updates for off-screen characters by setting Component Tick Enabled to false, but ensure motion consistency upon reappearance. Switch distant characters to pre-baked animations or crowd systems to avoid full ALS and motion matching. This reduces unnecessary processing for non-visible characters. It balances performance with visual continuity in dynamic scenes.
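
A few of these settings can be applied from C++; the sketch below assumes a MetaHuman-based character whose blueprint contains the stock LODSync component, and the property names are worth verifying against your engine version:

```cpp
#include "Components/LODSyncComponent.h"

void AMyCharacter::ApplyPerformanceSettings(bool bIsDistantCharacter)
{
    // Force a cheaper LOD than the film-quality LOD 0 during gameplay.
    if (ULODSyncComponent* LODSync = FindComponentByClass<ULODSyncComponent>())
    {
        LODSync->ForcedLOD = bIsDistantCharacter ? 3 : 1;
    }

    // Skip pose evaluation when this mesh isn't rendered. Do NOT apply this
    // to a hidden retarget-source mannequin, which must keep ticking.
    GetMesh()->VisibilityBasedAnimTickOption =
        EVisibilityBasedAnimTickOption::OnlyTickPoseWhenRendered;
}
```

The "Allow MultiThreaded Animation Update" option mentioned above is a project setting rather than code, so it is not shown here.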

Optimizing is about balancing quality and cost. Methodical application of these strategies can yield significant performance gains, making MetaHumans with ALS and motion matching viable for real-time applications.

How realistic is the animation when using motion matching with MetaHuman?

The animation quality achieved by using motion matching on a MetaHuman is extremely realistic – often approaching the look of high-end motion capture in AAA games. Here’s why and what to expect:

  • Smooth transitions: Motion matching avoids mechanical transitions by selecting mocap clips where movements, like stopping, occur naturally, ensuring realistic weight shifts and foot plants. This results in fluid MetaHuman movements, such as taking extra steps to halt momentum or adjusting posture during turns. Unlike state machines, it eliminates robotic blends, enhancing believability. The system’s data-driven approach delivers nuanced, lifelike animations effortlessly.
  • Reactive to player input: Motion matching dynamically selects poses based on input, allowing MetaHumans to respond instantly to actions like circling, with balanced, lifelike stepping. This avoids canned blends, making characters appear grounded and responsive in real-time gameplay. It outperforms traditional blendspaces by stringing together context-appropriate animations. The result is immersive, natural movement that aligns with player control.
  • Data-driven realism: High-quality mocap datasets enable MetaHumans to move indistinguishably from real humans, as seen in AAA titles using motion matching for physics-consistent grace. Comprehensive data ensures unique, flavorful movements without sliding or popping. The system’s fidelity depends on dataset richness, covering diverse scenarios. This approach delivers authentic, high-fidelity animations for immersive experiences.
  • MetaHuman visuals complement the motion: MetaHumans’ realistic proportions and joint placement synergize with motion-matched mocap, producing authentic hip and shoulder movements and weight shifts. Unlike simplistic run cycles, motion matching ensures complex actions like starts or leans look genuine. This enhances animation authenticity across gameplay and cinematics. The combination elevates MetaHuman performances to professional standards.
  • Edge cases and limitations: Incomplete datasets may cause brief transition issues, like foot slides, if the system lacks specific motions, requiring comprehensive data for maximum realism. Combat or non-locomotion actions rely on authored animations, impacting their realism separately. Motion matching excels in locomotion, where traditional methods struggle most. Polishing edge cases ensures consistent, high-quality output.
  • Comparing to traditional ALS: ALS’s traditional tricks, like stride warping, improve realism but can’t match motion matching’s direct use of mocap for natural foot placement and momentum. Motion matching eliminates the floating or gliding often seen in game characters, making every step intentional. It surpasses ALS’s hand-authored blends, resulting in MetaHuman movement that looks genuinely lifelike.
  • Visual example: In a sharp 90-degree turn, motion matching selects a mocap clip of a realistic pivot, with the MetaHuman planting a foot and pushing off naturally, unlike traditional blends that may slide. Variations, like alternating foot plants, add realism based on timing. This seamless, data-driven approach enhances visual fidelity. It creates striking, human-like directional changes.
  • Facial and other details: Motion matching focuses on body animation, requiring separate facial animation for full realism, as a blank-faced MetaHuman looks odd despite perfect body motion. Pairing with subtle facial animations, like idle blinks, completes the lifelike effect. This ensures a cohesive, convincing character performance. Motion matching remains the gold standard for body movement realism.

In conclusion, using motion matching with a MetaHuman can produce animation that is highly realistic, smooth, and responsive. As long as your animation dataset is rich and your system tuned, the MetaHuman will showcase life-like motion that can elevate both gameplay immersion and cinematic quality.

Can I use third-party animations with ALS and MetaHuman?

Third-party animations from marketplaces or libraries can be used with ALS and MetaHumans by retargeting them to the appropriate skeleton, typically the UE4 mannequin or MetaHuman skeleton.

Here’s how to incorporate third-party animations:

  • Retarget to the correct skeleton: Use UE5’s IK Retargeter to map third-party animations, like those on the UE4 mannequin skeleton, to the ALS or MetaHuman skeleton, ensuring compatibility. Marketplace assets often align with Epic’s skeleton, requiring minimal setup, while Mixamo or custom rigs need bone mapping via IK Rig. This process enables animations to work seamlessly with ALS’s framework. Retargeting ensures smooth application to MetaHuman characters without distortion.
  • Adding to ALS system: Integrate retargeted animations into ALS by creating new states or montages for abilities like wall-running, or replacing existing actions like idles with variations. ALS’s extensible blueprint allows easy addition of third-party moves, such as punch combos, with tweaks for root motion consistency. This expands ALS’s functionality to include diverse actions. Community examples demonstrate successful integration for enhanced gameplay mechanics.
  • Using with MetaHuman: Animations retargeted to the ALS mannequin automatically apply to MetaHumans via runtime retargeting, or you can retarget directly to the MetaHuman skeleton for a dedicated AnimBP. Keeping animations on the ALS mannequin simplifies the pipeline, leveraging existing setups. This ensures MetaHumans perform new actions effortlessly. It streamlines integration for consistent character performance across systems.
  • Motion matching databases: Add third-party animations to motion matching databases, ensuring they meet root motion requirements for seamless integration. Mixing sources is feasible, but consistent styles prevent jarring transitions between realistic and stylized motions. This enhances motion matching’s coverage, like adding new starts or stops. Proper tagging and organization ensure effective database utilization.
  • Example – using Mixamo or other free animations: Retarget Mixamo’s in-place animations to the UE4 mannequin, converting to root motion if needed for motion matching, using community tools or external apps. These can be triggered as ALS montages for actions like dances or used selectively in motion matching for special moves. This approach adds variety without custom mocap. It requires careful setup for compatibility.
  • Third-party Motion Libraries: Import specialized libraries, like Adobe Motion Library or Carnegie-Mellon mocap, and retarget them to the mannequin or MetaHuman skeleton for ALS or motion matching use. These expand animation options for unique character behaviors. Proper retargeting ensures compatibility with UE5’s systems. This leverages high-quality mocap for professional results.
  • Metadata and tagging: Use clear naming and metadata for mixed animation sources, tagging them in Motion Symphony for selective use, like restricting to realistic turns in gameplay. This organizes animations for efficient selection in complex projects. It prevents confusion during integration and runtime. Proper organization enhances workflow efficiency and animation coherence.
  • Licensing: Verify licensing for third-party animations, as marketplace assets are typically commercial-friendly, but free sources like Mixamo require confirmation for project use. This ensures legal compliance across development and distribution. It’s a critical step for professional projects. Checking rights upfront avoids future complications.

To sum up: using third-party animations is a great way to expand what your ALS+MetaHuman character can do without having to create animations from scratch. It’s a matter of retargeting and integration, which UE5 tools make relatively straightforward.

How does Unreal Engine 5 handle motion matching for MetaHuman bodies?

Here’s how UE5 handles motion matching and what it means for MetaHuman characters:

  • Pose Search and Motion Matching Core: The Pose Search plugin uses a database to index animation clips, queried by a Motion Matching node in the AnimBP to select poses based on character state, like velocity or trajectory. It blends seamlessly into the chosen animation, repeating to follow desired motion. The schema defines matching criteria, such as bone positions and velocities (a simplified illustration follows this list). This ensures fluid, responsive animations for any skeleton, including MetaHumans.
  • MetaHuman specific considerations: MetaHuman skeletons, based on Epic’s skeleton, use retargeted animations in a Pose Search database, focusing on locomotion-relevant bones like pelvis and thighs, excluding facial bones. The system treats MetaHumans like any humanoid, ensuring compatibility without special adjustments. Animations can be built directly for MetaHumans or retargeted from other skeletons. This simplifies setup while maintaining high-quality motion output.
  • Using it in AnimBP: Enable the Pose Search plugin and add a Motion Matching node to the MetaHuman’s AnimBP, referencing a database of MetaHuman-targeted animations. Feed it a Query pose from the character’s movement component or MotionTrajectoryComponent for trajectory data. This drives real-time pose selection based on input. The setup integrates seamlessly with MetaHuman skeletons for dynamic animation.
  • MetaHuman with ALS: Use ALS for movement logic, replacing its state machine with a Motion Matching node to handle animation, leveraging Pose Search for pose selection. ALS provides input velocity, while motion matching ensures realistic animation output. The MetaHuman skeleton follows without altering the core algorithm. This hybrid approach combines ALS’s control with motion matching’s fluidity.
  • Real-time retarget vs direct: Perform motion matching directly on MetaHuman animations or on mannequin animations, retargeting the pose to MetaHuman at runtime via a Retarget Pose node. Direct matching simplifies the pipeline, while runtime retargeting reuses data across characters. Both methods ensure smooth MetaHuman animation. The choice depends on project needs for flexibility or simplicity.
  • MetaHuman ragdoll or physics interactions: Motion Warping and root motion edits, used alongside motion matching, adjust MetaHuman animations for actions like jumps, treating their skeletal mesh like any other. These tools ensure compatibility with UE5’s animation systems. They enhance motion matching’s adaptability for dynamic interactions. MetaHumans integrate fully with these features for cohesive performance.
  • Foot IK and MetaHumans: UE5’s IK Retargeter ensures foot planting in motion-matched poses remains accurate when retargeted to MetaHumans, with runtime retargeting being efficient. Advanced users can map MetaHuman skeletons to share mannequin hierarchies for direct animation use. This minimizes retargeting overhead while maintaining precision. It supports seamless, grounded locomotion for MetaHumans.
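
To visualize what the schema describes, here is a simplified, purely illustrative struct; it is not the plugin's real type, just the shape of the data indexed per database entry:

```cpp
#include "CoreMinimal.h"

// Illustrative only: real schemas are data assets configured in the editor,
// and the actual Pose Search types differ.
struct FIllustrativeSchemaChannel
{
    FName Bone;           // e.g. "pelvis", "foot_l", "foot_r"
    bool bMatchPosition;  // compare the bone's component-space position
    bool bMatchVelocity;  // compare the bone's linear velocity
    float Weight;         // relative influence on the matching cost
};

struct FIllustrativeSchema
{
    TArray<FIllustrativeSchemaChannel> PoseChannels;

    // Trajectory samples (in seconds, relative to now) matched against the
    // desired path predicted from player input.
    TArray<float> TrajectorySampleTimes; // e.g. { 0.2f, 0.4f, 0.8f }
    float TrajectoryWeight = 1.0f;
};
```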

What is the difference between traditional animation and motion matching in ALS?

The difference between traditional animation (as used in standard ALS) and motion matching can be highlighted in a few key points:

  • Animation Selection: Traditional ALS uses state machines to play animations based on rules, like speed thresholds, with explicitly defined transitions. Motion matching selects the best pose from a large animation pool at runtime, jumping between clips based on pose similarity. Transitions emerge naturally from data, not scripted paths. This results in seamless, realistic MetaHuman movements without predefined states.
  • Blending vs Matching: Traditional systems linearly blend animations, like crossfading idle to walk, often requiring noticeable transition times. Motion matching jumps to a matching frame, such as a walk’s start, minimizing blending for near-perfect continuity. Short blends maintain pose similarity, resembling action cuts. This ensures fluid, natural transitions for enhanced realism in animations.
  • Authoring Workload: Traditional ALS requires manual authoring of BlendSpaces and transitions, creating complex logic to cover movement combinations. Motion matching shifts effort to collecting mocap data, letting algorithms handle combinations, reducing blueprint complexity. It’s data-driven, requiring more animations but less manual tuning. This simplifies animation pipelines while demanding robust datasets for coverage.
  • Responsiveness and look: Traditional systems may sacrifice realism for responsiveness, causing foot slides, or vice versa with slower transitions. Motion matching offers both, instantly selecting fitting clips, like pivots, for fast, natural responses. This eliminates sliding or canned looks, enhancing MetaHuman realism. It delivers reactive, lifelike animations aligned with gameplay dynamics.
  • Complex moves coverage: Traditional ALS links complex sequences, like run-to-roll, through explicit states and blends, requiring detailed setup. Motion matching flows through such sequences if data exists, without predefined states, deriving transitions implicitly. This simplifies handling intricate movements naturally. It reduces the need for extensive animation graphs.
  • Foot IK / locomotion “smarts”: ALS uses curves and IK to adjust canned animations for dynamic situations, like slopes or turns. Motion matching naturally includes these qualities if mocap data covers them, reducing reliance on adjustments. IK is still used for fine alignment, but less extensively. This leverages inherent data realism for smarter locomotion.
  • Example to illustrate: In a 180-degree turn, traditional ALS blends stop, turn, and start animations, risking foot slides or slow transitions. Motion matching selects a mocap clip of a pivot turn, planting a foot for a seamless swivel. No explicit transitions are needed, ensuring realism. The result is a natural, continuous motion from data.
  • Technology differences: Traditional animation relies on state machines and blends, using fewer animations through reuse, suitable for simpler systems. Motion matching, a newer technique, uses computation to search large animation datasets, optimized by structures like k-d trees. It requires more memory and animations for continuum. This delivers superior realism with increased resource demands.
  • In context of ALS specifically: ALS’s traditional AnimBP is complex, with crafted states and transitions for varied movements. Motion matching simplifies this to a single node, shifting complexity to data preparation. The anim graph becomes minimal, relying on animation datasets. This streamlines blueprint design while enhancing animation quality through data.

Is ALS motion matching suitable for cinematic MetaHuman projects?

ALS with motion matching can be suitable for cinematic projects involving MetaHumans, but with some considerations:

  • Rapid prototyping of movement: ALS and motion matching allow quick blocking of MetaHuman movements in cinematic scenes, recording realistic locomotion into Sequencer without manual animation. Control the character with a gamepad to navigate levels, ensuring natural paths via motion matching. This provides a believable movement baseline efficiently. It’s ideal for pre-visualizing scenes with minimal setup effort.
  • Blending into bespoke animation: Motion matching delivers realistic locomotion, which can blend into custom keyframed animations for specific actions, like hugs, using pose matching for smooth transitions (see the montage sketch after this list). This combines motion matching’s efficiency with tailored performances for key story beats, balancing automation with artistic control for cohesive results.
  • Maintaining cinematic quality: Motion matching’s realistic locomotion may lack the exaggerated posing needed for dramatic cinematic scenes, requiring Control Rig tweaks in Sequencer. It excels for background characters or generic actions, reducing manual effort for natural motion. Animators can enhance key moments for storytelling impact. This balances realism with cinematic expressiveness effectively.
  • Facial and finger animations in cinematics: ALS motion matching handles only body animation, so cinematic expressions and dialogue require separate facial capture or keyframing. These layers combine cleanly, letting animators focus on emotionally critical face and hand details while motion matching covers the body. This frees resources for high-fidelity facial work and enhances overall MetaHuman realism.
  • Cameras and framing: Close cinematic shots may reveal minor motion matching glitches, like foot slides, if datasets are incomplete, necessitating polish for critical moments. Motion matching’s smooth output reduces polishing needs overall, maintaining consistency. Careful dataset preparation minimizes visible errors. This ensures high-quality visuals under scrutiny in cinematic contexts.
  • Transition between cinematic and gameplay: Using ALS and motion matching for gameplay ensures smooth transitions to cutscenes, maintaining consistent character movement styles. It supports interactive cinematics where players retain minimal control, like walking during dialogue. This enhances immersion in hybrid scenes. Motion matching’s realism bridges gameplay and cinematic moments seamlessly.
  • Fully cinematic usage: For fully non-interactive scenes, motion matching acts as an auto-animation tool, puppeteering characters realistically, with fine-tuning for custom moments. This is faster than stitching mocap for smaller teams, though less precise than full keyframing. It balances speed and quality effectively. Motion matching supports efficient cinematic workflows with polish.
  • Example scenario: In a cinematic with a sneaking MetaHuman, motion matching handles cautious walking realistically, while Control Rig animates nervous head turns and facial expressions. Specific reactions, like to a noise, use custom animations for impact. This blends motion matching’s natural locomotion with tailored acting. The result is a lifelike, story-driven performance.
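
As referenced in the blending point above, layering a bespoke keyframed moment over motion-matched locomotion usually comes down to playing a montage. Below is a minimal Unreal C++ sketch; the helper function name and the montage itself are hypothetical, and, per the forum thread cited in the sources, the character’s Animation Blueprint must route the Motion Matching output through a Default slot node so the montage has somewhere to blend in.

    #include "GameFramework/Character.h"
    #include "Animation/AnimInstance.h"
    #include "Animation/AnimMontage.h"

    // Hypothetical helper: blend a keyframed montage (e.g. a hug) over the
    // motion-matched locomotion pose of any ACharacter.
    void PlayBespokeMoment(ACharacter* Character, UAnimMontage* Montage)
    {
        if (!Character || !Montage)
        {
            return;
        }
        if (UAnimInstance* AnimInstance = Character->GetMesh()->GetAnimInstance())
        {
            // Montage_Play returns the montage length, or 0.f on failure.
            // The AnimBP must contain a Default slot node after the Motion
            // Matching node, or the montage has nowhere to blend in.
            AnimInstance->Montage_Play(Montage, 1.0f);
        }
    }

The same pattern works from Blueprints via the Play Montage node; the key requirement is the slot node in the anim graph, not the language used to trigger it.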

Where can I find tutorials for ALS and Metahuman motion matching in UE5?

To learn more and see step-by-step guidance, several resources are available:

  • Epic’s Official Documentation and Samples: Unreal Engine’s documentation on Motion Matching details Pose Search setup, and the Game Animation Sample in UE 5.4 showcases it. MetaHuman documentation covers retargeting and integration. A community tutorial by Kitatus on ALS to MetaHuman provides step-by-step guidance. These resources offer comprehensive, authoritative insights for beginners and advanced users.
  • YouTube Tutorials: Videos like “ALS with Metahuman and facial animations in Unreal Engine 5” demonstrate retargeting and facial integration. Outcast’s UE5 Motion Matching series covers MetaHuman integration, while Kenneth’s Motion Symphony tutorials detail plugin setup. These visual guides clarify complex workflows for practical implementation.
  • Motion Symphony Documentation: The plugin’s online docs and FAQ guide integration with ALS, supported by a GitHub sample project. These resources explain motion set creation and optimization. They’re essential for users leveraging Motion Symphony’s features. The documentation ensures clear, actionable setup steps.
  • Unreal Engine Forums and Reddit: Forums discuss MetaHuman and ALS integration, troubleshooting issues like retargeting, while Reddit’s /r/unrealengine offers community insights on motion matching. These platforms provide practical tips and real-world solutions. They’re valuable for addressing specific challenges. Community discussions enhance learning through shared experiences.
  • Community Discords or GitHub: ALS Community Discord and Animation Mentoring Discord share tips, while the ALS GitHub repo includes integration guides. These platforms foster direct interaction with developers. They offer real-time support and resources. Engaging here accelerates learning and problem-solving.
  • Kitatus and Friends blog: Ryan Shah’s tutorial on ALS with MetaHuman details retargeting, with potential additional content on UE5 features. His blog provides practical, visual walkthroughs for integration. It’s a reliable resource for next-gen techniques. Following updates ensures access to new insights.
  • Video Tutorials on Pose Search: Short tutorials, like “UE5.4 Motion Matching in 13 minutes,” give a quick overview of enabling Pose Search and basic setup. These quick guides clarify the initial steps for motion matching and are ideal for rapid learning, complementing deeper resources for a comprehensive understanding.
  • MetaHuman Animator / Live Link face tutorials: Epic’s tutorials on MetaHuman Animator and facial animation, found on YouTube or documentation, complement body animation workflows. They guide facial integration with ALS and motion matching. These ensure holistic MetaHuman animation. They enhance cinematic and gameplay realism.

FAQ Questions and Answers

  1. What does “ALS” stand for in Unreal Engine context?
    ALS, or Advanced Locomotion System, is a free framework for character locomotion in Unreal Engine. It offers pre-built animations and logic for natural, responsive movements like walking, running, and jumping. The system simplifies creating lifelike character motion. It’s widely used for third-person character setups.
  2. Is the Advanced Locomotion System available in Unreal Engine 5?
    Yes, ALS is available for UE5. Originally for UE4, the community-updated ALS v4 supports UE5 versions (5.1, 5.2, and later) and can be downloaded from GitHub as a plugin. The UE4 version on Epic’s marketplace can be migrated, but the community version includes UE5-specific fixes. This ensures compatibility with modern Unreal projects.
  3. Do I need to know C++ to use ALS or motion matching?
    C++ knowledge isn’t required for ALS or motion matching. ALS’s community version uses C++ for performance but is fully usable via Blueprints, as is UE5’s Pose Search or Motion Symphony for motion matching. Deep customization, like plugin integration, may involve C++, but most tasks are achievable with Blueprints alone. This makes the systems accessible to non-programmers.
  4. Can I implement motion matching without motion capture data?
    Motion matching can use non-mocap animations, like hand-keyed ones, if they provide sufficient coverage. Motion Symphony supports such workflows, but quality depends on animation variety. Mocap enhances realism, and animation packs or free datasets can substitute if mocap isn’t available. Limited animations reduce motion matching’s effectiveness.
  5. How do I get the MetaHuman to use my custom animations?
    Retarget custom animations to the MetaHuman skeleton using UE5’s IK Retargeter. Animations from the UE4 mannequin or other sources can be mapped to MetaHumans, then assigned to their Animation Blueprint or played as montages. This process ensures compatibility with MetaHuman characters. Any third-person animation can be adapted with proper bone mapping.
  6. Does motion matching work with multiplayer games?
    Motion matching works in multiplayer as client-side animation logic, with ALS handling movement replication. Deterministic results require synchronized data or inputs across clients and servers. Motion Symphony’s C++ optimization supports multiplayer, but you should test and, where needed, replicate key animation states to prevent pose mismatches between clients. CharacterMovement replication keeps movement authoritative.
  7. What is Motion Symphony and do I need it?
    Motion Symphony is a paid UE4/UE5 plugin offering motion matching and animation tools, like pose matching and foot locking. It’s unnecessary if using UE5.4’s built-in motion matching but is valuable for UE5.0-5.2 or extra features. It provides a robust, ready-to-use motion matching solution. It’s ideal for earlier UE5 versions or enhanced functionality.
  8. How can I learn to use the Pose Search (motion matching) in UE5.4?
    Learn Pose Search through Epic’s documentation and Animation Sample project. A tutorial, “Your First 60 Minutes with Motion Matching,” guides enabling the Pose Search plugin, creating a database, and adding a Motion Matching node to AnimBP. YouTube community tutorials for UE5.4 motion matching offer practical demonstrations. These resources provide comprehensive setup guidance.
  9. Will using these advanced systems make my game too heavy performance-wise?
    ALS, MetaHumans, and motion matching add performance overhead due to high-poly models, high bone counts, and runtime animation searches. Optimizations like LODs, the C++ ALS version, and limiting active characters can manage this, with community reports of roughly 40% performance gains; a short LOD sketch follows this FAQ. Profiling ensures feasibility on modern hardware. With these tweaks, the systems are viable for real-time applications.
  10. Are there any ready-made projects that combine ALS, MetaHumans, and motion matching?
    No official project combines all three as of 2025, but resources like ALS Community, Epic’s Lyra, and Animation Sample provide components. Community-shared projects on forums or Gumroad may exist. Integrating them yourself using tutorials is common and educational. This approach helps master the systems’ interactions.
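
For the performance question above, here is a hedged Unreal C++ sketch of two common mitigations: forcing a lower LOD on distant MetaHumans and skipping pose evaluation when off-screen. The helper function is hypothetical; SetForcedLOD and VisibilityBasedAnimTickOption are standard skeletal mesh component settings, but the right LOD index and tick policy depend on your project.

    #include "GameFramework/Character.h"
    #include "Components/SkeletalMeshComponent.h"

    // Hypothetical helper: cheapen a background MetaHuman that is far from
    // the camera or frequently off-screen.
    void ReduceMetaHumanCost(ACharacter* MetaHuman)
    {
        if (!MetaHuman)
        {
            return;
        }
        USkeletalMeshComponent* Mesh = MetaHuman->GetMesh();

        // Force a lower LOD (1-based; passing 0 clears the override) so the
        // very high bone count of near LODs is never evaluated.
        Mesh->SetForcedLOD(3);

        // Skip pose evaluation entirely while the mesh is not rendered.
        Mesh->VisibilityBasedAnimTickOption =
            EVisibilityBasedAnimTickOption::OnlyTickPoseWhenRendered;
    }

In practice you would drive the forced LOD from camera distance or significance rather than hard-coding it; the sketch only shows where the knobs live.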

Conclusion

ALS, MetaHumans, and motion matching in Unreal Engine 5 enable realistic, responsive character animation for gameplay and cinematics. ALS provides a robust movement foundation, enhanced by motion matching to overcome traditional animation limits. MetaHuman integration ensures hyper-realistic visuals, making AAA-quality digital humans accessible for indie projects. The article explains integrating MetaHumans with ALS using UE5’s IK Retargeter and blueprint techniques like Retarget Pose From Mesh and Set Leader Pose for runtime animation.

Motion matching, via Epic’s Pose Search or Motion Symphony plugin, uses animation clip data to create fluid transitions, reducing manual work. The workflow ensures facial animations and MetaHuman features work with ALS, incorporates motion capture data, and optimizes performance with LOD settings and ALS Community code. Common retargeting issues, like warped fingers, are addressed with fixes. This combination blurs the line between gameplay and cinematic animation, offering real-time filmic quality while maintaining creative control through tools like Control Rig.
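
As a final illustration of the Set Leader Pose technique mentioned above, the following sketch shows how a MetaHuman’s clothing and body-part meshes can follow the animated body mesh at runtime. The function and the specific part names (Torso, Legs, Feet) are assumptions based on the default MetaHuman blueprint layout; only the body component runs the ALS/motion matching AnimBP, and the other parts carry none of their own.

    #include "Components/SkeletalMeshComponent.h"

    // Hypothetical setup step: make MetaHuman clothing/body-part meshes follow
    // the animated body mesh, as the cited cloth-movement forum thread suggests.
    void AttachMetaHumanParts(USkeletalMeshComponent* Body,
                              USkeletalMeshComponent* Torso,
                              USkeletalMeshComponent* Legs,
                              USkeletalMeshComponent* Feet)
    {
        USkeletalMeshComponent* Parts[] = { Torso, Legs, Feet };
        for (USkeletalMeshComponent* Part : Parts)
        {
            if (Part)
            {
                // UE5 name for the former "Set Master Pose Component" call;
                // the part copies the leader's pose instead of animating itself.
                Part->SetLeaderPoseComponent(Body);
            }
        }
    }

The same call is available in Blueprints as the Set Leader Pose Component node, which is how the MetaHuman blueprint wires its parts by default.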


Sources and Citations

  • Kitatus, R. (2022). Advanced Locomotion (Community) to MetaHuman – UE5.1 integration tutorial. kitatusandfriends.co.uk. (Describes retargeting ALS to MetaHuman using the IK Retargeter and blueprint setup.)
  • Epic Games (2023). Unreal Engine 5 Documentation – Motion Matching in Unreal Engine. dev.epicgames.com. (Official documentation explaining the motion matching system and its advantages over traditional state machines.)
  • Animation Uprising (2020). Motion Symphony – Marketplace Description and FAQ. unrealengine.com, wikiful.com. (Details the motion matching plugin, requirements like root motion, and clarifies it’s a toolset, not a full movement system.)
  • Unreal Engine Forums (2024). MetaHuman with ALS – Strange cloth movements thread. forums.unrealengine.com. (Community discussion revealing solutions like using Retarget Pose From Mesh and Set Leader Pose for MetaHuman clothing.)
  • Reddit (2023). “What’s the best locomotion system… (ALS vs others)” – user mothh9’s comment. reddit.com. (Compares motion matching with ALS, noting one-node simplicity and better transitions when using UE5.2+ motion matching.)
  • GitHub – ALS Community (2023). README for ALSV4_CPP. github.com, dawnarc.com. (Maintainer notes about ALS being based on older techniques and recommending Lyra/modern systems; also confirms the community version is optimized and replicated.)
  • Unreal Engine Forums (2019). Strider vs ALS thread – comment by Avindr. forums.unrealengine.com. (Explains the difference: ALSv4 as blueprint locomotion vs Strider/motion matching tools as modular C++ nodes, highlighting the monolithic vs modular approach.)
  • Dawnarc Blog (2019). Motion Matching Locomotion Notes. unrealengine.com. (Mentions high-profile AAA games using motion matching and how Motion Symphony brings this tech to UE with quality tools.)
  • Epic Games (2024). MetaHuman Optimization Guide forum post. forums.unrealengine.com. (Discusses performance issues of MetaHumans, citing bone count as the major cost and hinting at 5.3 improvements.)
  • Unreal Engine Dev Community (2024). Motion Matching with Play Montage thread. forums.unrealengine.com. (User question and answer about blending montages with motion matching, requiring adding a Default slot to the AnimBP.)
