Replacing Unreal Engine’s default mannequin with a high-fidelity MetaHuman character can elevate the realism of your project. This guide will walk you through swapping the UE5 mannequin for a MetaHuman in a third-person or first-person template. We’ll cover importing a MetaHuman, retargeting animations, adjusting skeletons, handling facial animation, optimizing performance, and troubleshooting common issues. Whether you’re a game developer, cinematic creator, or 3D artist, this step-by-step guide will help you replace the mannequin with a MetaHuman effectively.
How do I replace the Unreal Engine mannequin with a MetaHuman?
You can replace Unreal Engine’s default mannequin with a MetaHuman as your playable character or actor. The process involves importing a MetaHuman into your project and then updating your character setup to use the MetaHuman’s skeletal mesh and animations instead of the mannequin’s. In Unreal Engine 5 (UE5), MetaHumans are fully compatible as character models, but they use a more complex skeleton than the mannequin. The high-level steps to replace the mannequin are:
- Import a MetaHuman into your UE5 project (via Quixel Bridge).
- Retarget or transfer the mannequin’s animations to the MetaHuman skeleton so it can use the same movements (walking, jumping, etc.).
- Swap the character mesh and animation blueprint: Replace the ThirdPersonCharacter (or FirstPersonCharacter) blueprint’s mesh with the MetaHuman and assign the retargeted animation blueprint.
- Test and refine: Ensure the MetaHuman moves correctly, and fix any pose or rigging issues.
By the end, your game’s character will look like a MetaHuman but behave just like the default mannequin, with all the same controls and animations. UE5’s tools make this workflow straightforward, and we’ll detail each step in the sections below.

What is the easiest way to swap the UE5 mannequin for a MetaHuman character?
If you’re looking for the quickest way to swap the UE5 mannequin for a MetaHuman, Unreal Engine 5 offers a Live Retargeting approach that minimizes manual work. The easiest method is to use the IK Retargeter system at runtime so that the MetaHuman directly mirrors the mannequin’s animations:
- Use the Third Person Template as a base. Duplicate the ThirdPersonCharacter blueprint and create a new one for your MetaHuman (for example, BP_MetaHumanCharacter).
- Copy MetaHuman components: Open your imported MetaHuman’s blueprint (e.g. BP_Ada if your MetaHuman is named Ada) and copy all its mesh components (Body, Face, Hair, etc.) into the new character blueprint. Parent these under the main character Mesh component, and reset their transforms so they align properly.
- Enable live retargeting: Create a new Animation Blueprint for the MetaHuman that uses the Retarget Pose from Mesh node. This node will take animations from the mannequin and apply them to the MetaHuman in real time. In the Retarget Pose node, choose the pre-made retarget asset (UE5 provides an IK retargeter asset for the mannequin by default, often named RTG_Mannequin) as the source/target mapping.
- Hide the mannequin mesh: In your MetaHuman character blueprint, keep the original mannequin skeletal mesh but set it to invisible and set its Visibility Based Anim Tick Option to “Always Tick Pose and Refresh Bones,” so it keeps evaluating its animation and drives the MetaHuman’s movement while remaining hidden.
- Use the MetaHuman as the player: Set your game mode or character spawn to use the new MetaHuman character blueprint as the default pawn. In the ThirdPersonGameMode, update Default Pawn Class to BP_MetaHumanCharacter (the one you created).
With this setup, you essentially have the mannequin playing all animations in the background while your MetaHuman character copies those animations live. This means no manual animation retargeting step is needed, and you can swap the mannequin for a MetaHuman in just a few minutes. It’s a very convenient method for quick results. The MetaHuman will run, jump, and idle using the existing mannequin Anim Blueprint logic. Note: Live retargeting requires UE5’s IK Rig/Retargeter assets and may need IK bones or settings adjusted for perfect results (for example, enabling “Enable IK Retargeting” in the editor). We’ll discuss retargeting and IK setup more below.
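Conceptually, live retargeting is a per-frame copy loop: the hidden mannequin evaluates its Anim Blueprint, and each mapped bone chain is transferred onto the MetaHuman. The sketch below models that data flow in plain Python; the chain names and bone lists are illustrative assumptions, and the real IK Retargeter also solves IK goals and interpolates along chains rather than doing a plain one-to-one copy.

```python
# Conceptual model of "Retarget Pose from Mesh": each frame, the hidden
# source (mannequin) pose is copied onto the target (MetaHuman) through
# a chain mapping. Bone/chain names here are illustrative only.
CHAIN_MAP = {"Spine": "Spine", "LeftArm": "LeftArm", "LeftLeg": "LeftLeg"}

SOURCE_CHAINS = {  # mannequin bones per chain (assumed names)
    "Spine": ["spine_01", "spine_02", "spine_03"],
    "LeftArm": ["upperarm_l", "lowerarm_l", "hand_l"],
    "LeftLeg": ["thigh_l", "calf_l", "foot_l"],
}
TARGET_CHAINS = {  # MetaHuman bones per chain; note an extra spine bone
    "Spine": ["spine_01", "spine_02", "spine_03", "spine_04"],
    "LeftArm": ["upperarm_l", "lowerarm_l", "hand_l"],
    "LeftLeg": ["thigh_l", "calf_l", "foot_l"],
}

def retarget_pose(source_pose):
    """Map each source chain's bone rotations onto the target chain.

    When the target chain has more bones (the MetaHuman spine), the
    extra bones are simply left at their reference pose here -- a
    simplification of what the IK Retargeter actually distributes
    along the chain.
    """
    target_pose = {}
    for src_chain, dst_chain in CHAIN_MAP.items():
        for src_bone, dst_bone in zip(SOURCE_CHAINS[src_chain],
                                      TARGET_CHAINS[dst_chain]):
            target_pose[dst_bone] = source_pose[src_bone]
    return target_pose
```

Running this every frame against the hidden mannequin's evaluated pose is, in essence, what the Retarget Pose from Mesh node does for you inside the engine.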
Can I use MetaHuman in place of the default third-person mannequin?
Absolutely. MetaHumans can replace the default third-person mannequin (UE5 Manny/Quinn or the UE4 mannequin) in UE5 projects. In fact, Epic Games designed MetaHumans to integrate with the same animation system the mannequin uses, so you can use a MetaHuman as your Third Person Character for gameplay or cinematic sequences. Using the ThirdPersonCharacter Blueprint, you retain the existing mechanics (movement, camera, input) and simply:
- Import or create a MetaHuman and retarget third-person animations to its skeleton.
- Replace the mannequin’s mesh in the ThirdPersonCharacter Blueprint (or a duplicate) with the MetaHuman’s skeletal mesh.
- Assign a new or retargeted animation blueprint for proper movement.
Upon playing, the MetaHuman spawns instead of the mannequin, maintaining all third-person template functionality (WASD movement, jump). This is ideal for prototyping with the mannequin before switching to MetaHuman for final visuals or swapping for high-quality cinematic rendering.

What tools do I need to replace the mannequin with MetaHuman in UE5?
Replacing the mannequin with a MetaHuman doesn’t require external purchases – everything you need is either built into Unreal Engine 5 or available for free. Here are the key tools and requirements:
- Unreal Engine 5: Ensure you have UE5 installed, as MetaHuman integration and the IK Retargeter are UE5 features. This guide assumes UE 5.0 or later (UE 5.3+ contains further improvements to retargeting).
- Epic Games account and Quixel Bridge: MetaHumans are downloaded through the Quixel Bridge integration. In UE5, enable the Quixel Bridge plugin (if not enabled by default) and log in with your Epic account. You can then access MetaHumans from the Bridge panel (found under Content > Quixel Bridge in the editor). An Epic account is required since your MetaHuman assets are tied to it.
- MetaHuman Creator (optional): If you want a custom MetaHuman, you use the MetaHuman Creator (a cloud tool) to design your character’s face, hair, and body. However, you can also use one of the premade MetaHuman presets available in Bridge if you don’t need a custom design.
- IK Rig and IK Retargeter: UE5’s IK Rig tools are used for animation retargeting. These are built-in editor assets. The Third Person Template comes with IK rigs for the UE4/UE5 mannequin (often named IK_UE4_Mannequin and IK_UE5_Mannequin) and a ready-made IK Retargeter asset for transferring animations. We will duplicate or configure these to target the MetaHuman skeleton.
- MetaHuman skeleton and assets: When you download a MetaHuman through Bridge, it will import a series of assets (skeletal meshes for body, face, hair grooms, materials, and a Blueprint). Make sure these import correctly. The MetaHuman skeletal mesh and its animation blueprint or face control rig are part of these assets.
- (Optional) External DCC Tools: While not required for the basic swap, tools like Blender or Maya can be useful if you plan to customize clothes or accessories, or if you want to do advanced retargeting externally. For example, Blender can be used to adjust custom clothing to fit the MetaHuman or to merge skeletons in complex ways. But for a straightforward mannequin-to-MetaHuman replacement, you do not need to use external software – UE5’s built-in tools suffice.
In summary, the core tools are UE5 with the MetaHuman (Quixel) plugin enabled, and the usage of IK Rig/Retargeter editor for transferring animations. Epic’s official resources provide these tools free of charge, and many community tutorials use the same tools we’ll use here.
How do I import a MetaHuman into a third-person project in Unreal Engine?
Importing a MetaHuman into your project is a crucial first step. Here’s how you do it:
- Open Quixel Bridge in UE5: In Unreal Engine 5, go to Content > Quixel Bridge (top menu). Bridge is now integrated into the editor for UE5. If it’s your first time, you might need to log in with your Epic Games account.
- Find MetaHumans section: In Bridge, navigate to the MetaHumans section. You’ll see a library of premade MetaHuman presets (e.g., Ada, Taro, etc.) as well as any custom MetaHumans you’ve created with MetaHuman Creator (they appear once you log in).
- Download and add to project: Select the MetaHuman you want and click Download (choose the desired quality level, typically Medium or High quality LODs). After downloading, click Add (or Export) to bring the MetaHuman into your open UE5 project. This will import all necessary files.
- Wait for the import: The first time, it might take a few minutes as it imports various assets (body mesh, face mesh, skeletal rigs, materials, textures, etc.). Once done, check your Content Browser. There should be a new folder named MetaHumans with a subfolder for your character’s name.
- Verify the content: Open the MetaHuman’s folder. You should see:
- A Blueprint (BP_[Name]) which assembles the character (with body, face, hair components).
- Skeletal Meshes for body and face (e.g., a body mesh and a face mesh).
- An IK Rig asset for MetaHumans (often named IK_MetaHuman).
- Materials, textures, and possibly an Anim Blueprint or Control Rig assets for face.
- Enable MetaHuman plugin (if prompted): UE5 may prompt you to enable the MetaHuman Identity plugin or other components when importing. Enable any suggested plugin so everything works correctly.
Now your MetaHuman is in the project. To test, you can drag the BP_[YourMetaHuman] into the level – you should see the character appear (often in a T-pose initially). If the MetaHuman appears bald or without hair at first, don’t worry – that can happen due to Level of Detail settings (hair might only show at close range; we’ll address this later).
For a third-person project, you now have the MetaHuman ready to replace the mannequin. The next steps will involve retargeting animations so that this imported MetaHuman can use the existing third-person template animations.

Do I need to change the skeleton when replacing the mannequin with a MetaHuman?
No, do not manually change or replace the skeleton asset. The MetaHuman’s skeleton differs from the mannequin’s, with more spine and facial bones. Assigning the mannequin’s skeleton to the MetaHuman mesh causes errors: visible seams, odd deformation, and loss of the facial rig. Instead, use Unreal’s retargeting system (the IK Retargeter) to map mannequin animations onto the MetaHuman, preserving its detailed skeleton and facial rig while avoiding the physics issues, animation glitches, and crashes reported in earlier UE5 versions.
How do I use IK retargeting to transfer animations to a MetaHuman?
IK Retargeting in UE5 is the advanced technique under the hood of the retargeting process we just described. It ensures that animations look good on characters of different proportions by using inverse kinematics (IK) solvers. To specifically use the IK Retargeter:
- Open or create IK Rigs for both source (mannequin) and target (MetaHuman). Define Retarget Chains (groups of bones for arms, legs, spine, etc.) in each IK Rig asset. UE5’s sample IK Rigs usually have chains set up (e.g., “Arm_L”, “Leg_L”, etc.).
- Create an IK Retargeter asset. In it, you will see a side-by-side of Source and Target characters. Choose the Source IK Rig (e.g., IK_UE5_Mannequin) and Target IK Rig (IK_MetaHuman). Now you’ll see options to adjust root motion, chain mappings, etc.
- Chain mapping: Ensure all the chains from the mannequin map to the corresponding MetaHuman chains. UE5 might auto-map if bone names are similar. For example, the “Spine” chain on Manny should map to “Spine” chain on MetaHuman, “LeftArm” to “LeftArm”, etc. Verify things like IK bones or finger chains if present.
- Use IK goals if needed: The IK retargeter can use IK goal targets (like feet placement). If your animations rely on IK (for instance, foot placement on uneven ground), make sure to set up corresponding IK goals in the rigs. In UE5’s third-person animations, foot IK is often used for the idle pose. MetaHumans have foot IK bones too, so these should map.
- Adjust translation retargeting: You might want to tweak settings like root translation scaling (if the characters have different heights). Usually, the retargeter handles this automatically by matching hip positions, but the editor gives you fine control.
- Preview and iterate: The IK Retargeter interface lets you scrub through animations and see the MetaHuman follow. If you notice any limb offset (e.g., feet slightly off the ground or arms too bent), you can adjust the retarget pose or add offsets in the chain settings. For example, if the MetaHuman’s knees are always bent weirdly, you might adjust the thigh bone rotation in the retarget pose or ensure the IK joint targets are correctly positioned.
- Export the animations once satisfied.
Using the IK Retargeter is powerful because it accounts for bone length differences. If the MetaHuman is taller or differently proportioned than the mannequin, IK retargeting will adjust footsteps and reach so that, say, foot contact and hand positions still look natural. The retargeter also allows retargeting animations in bulk, and you can save the Retargeter asset for reuse with other animations. Once set up, the IK Retargeter asset can be used to retarget any future animations to your MetaHuman quickly.
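The auto-mapping step mentioned above can be pictured as a simple name-matching pass over the two rigs' chain lists. This is a hedged sketch of the idea, not the editor's actual matcher (which is fuzzier and also considers bone hierarchy); the chain names are illustrative.

```python
def auto_map_chains(source_chains, target_chains):
    """Pair retarget chains whose names match case-insensitively.

    Returns (mapping, unmapped): unmapped chains are the ones you would
    map by hand in the IK Retargeter's chain-mapping panel.
    """
    target_by_key = {name.lower(): name for name in target_chains}
    mapping, unmapped = {}, []
    for src in source_chains:
        dst = target_by_key.get(src.lower())
        if dst is not None:
            mapping[src] = dst
        else:
            unmapped.append(src)
    return mapping, unmapped
```

For example, `auto_map_chains(["Spine", "LeftArm", "LeftLeg"], ["spine", "leftarm", "Tail"])` pairs Spine and LeftArm automatically and reports LeftLeg as needing a manual mapping, which mirrors what you check in the Retargeter UI.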

How do I fix pose and rig issues after replacing the mannequin?
Common issues post-retargeting and their fixes include:
- T-Pose/No Animation: Ensure the retargeted AnimBP is assigned to the MetaHuman’s mesh and the character blueprint is possessed by the player controller.
- A-Pose vs. T-Pose Offset: Adjust the MetaHuman’s Retarget Pose in the IK Retargeter to match the mannequin’s rest pose (e.g., rotate arms from T-pose to A-pose).
- Bent Knees/Incorrect Leg IK: Create virtual bones in the MetaHuman skeleton (e.g., thigh_l to foot_l) to match the mannequin’s, enabling correct IK foot placement in the AnimBP, and ensure “Enable IK” is active.
- Components Not Moving Together: Use a Master Pose Component or EnableMasterPose function in the MetaHuman blueprint to link face and hair meshes to the body’s animation, preventing floating or detached parts.
- Clipping/Mesh Offset: Adjust the Capsule Half-Height and camera position in the character blueprint to align with the MetaHuman’s height for proper collision and eye-level camera placement.
- Finger Animation Issues: Verify finger chains are included in retargeting and tweak hand poses if misaligned (e.g., open vs. closed hands).
- Facial Rig Static: Add basic facial animations (e.g., blink) or leave neutral, checking the MetaHuman’s post-process anim blueprint for facial pose handling to fix issues like an open mouth.
Iterative adjustments using the Retargeter and proper component attachment resolve most issues for seamless MetaHuman performance.
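For the capsule and camera fix above, the arithmetic is simple: the capsule half-height is half the character's total height, and the camera sits at eye level expressed relative to the capsule's center. The 0.93 eye-level ratio below is a rough anatomical assumption for illustration, not an engine constant; tune it against your MetaHuman.

```python
def capsule_and_eye_height(character_height_cm):
    """Derive a Capsule Half-Height and a camera Z offset from total
    character height (Unreal units are centimeters).

    The 0.93 eye-level ratio is an assumed approximation.
    """
    half_height = character_height_cm / 2.0     # capsule spans the body
    eye_height = character_height_cm * 0.93     # approximate eye level
    # The camera offset is expressed relative to the capsule's center:
    camera_offset_z = eye_height - half_height
    return half_height, camera_offset_z
```

For a 180 cm MetaHuman this gives a half-height of 90 and a camera roughly 77 units above the capsule center, a reasonable starting point before eyeballing the final placement in the viewport.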
Can I use MetaHuman with the ThirdPersonCharacter Blueprint in UE5?
Yes. MetaHumans can integrate with the ThirdPersonCharacter Blueprint from the Third Person Template, leveraging its movement logic and input bindings:
- Method 1: Replace Mesh: In BP_ThirdPersonCharacter (or a duplicate), replace the mannequin’s mesh (CharacterMesh0) with the MetaHuman’s body mesh (e.g., SKM_MyMetaHuman_Body) and assign the retargeted AnimBP. Add face/hair meshes separately, as the blueprint expects one mesh.
- Method 2: Merge Blueprints: Duplicate ThirdPersonCharacter (e.g., BP_MetaHumanCharacter), copy MetaHuman components (Body, Face, Hair) into it, set the Body as the main Mesh, attach others with Master Pose setup, and combine with ThirdPerson logic.
- Game Mode: Update the Game Mode to use the new blueprint if modified, or use the edited original BP_ThirdPersonCharacter for immediate results.
- Benefits: Retains input events, camera boom, and spring arm functionality, ensuring the MetaHuman responds like the mannequin.
This approach ensures MetaHumans function correctly within the blueprint structure for gameplay scenarios.

How do I retarget animations from the mannequin to a MetaHuman?
Animation retargeting is the process of taking animations designed for one character (the mannequin) and adapting them to another character’s skeleton (the MetaHuman). In UE5, retargeting is typically done with the IK Rig and IK Retargeter system. Here’s an overview of how to retarget the third-person mannequin animations to your MetaHuman:
- Set up IK Rigs: Unreal should have imported an IK Rig for the MetaHuman (often named IK_MetaHuman) and you already have an IK Rig for the mannequin. If not, you can create an IK Rig asset for each skeleton (mannequin and MetaHuman) via Create > IK Rig and add all the bones. In our case, check the MetaHumans/Common folder for IK_MetaHuman asset.
- Configure preview meshes: Open the IK_MetaHuman asset and assign a preview mesh (your MetaHuman’s body) so you can see the skeleton. Do the same for the mannequin’s IK Rig if needed.
- Create an IK Retargeter: This asset defines how to map animations from Source to Target. UE5 templates often include one for UE4 → UE5 Manny. We will create one for Manny → MetaHuman. In Content Browser, find the existing retargeter (e.g., RTG_UE4Manny_UE5Manny). Duplicate it and name it something like RTG_MannequinToMetaHuman.
- Set the target IK Rig to MetaHuman: Open the new Retargeter asset. By default, it may show mannequin to mannequin. On the right-side panel, change the Target IK Rig Asset to IK_MetaHuman (select the MetaHuman rig). You should see the MetaHuman skeleton appear, possibly overlapping the mannequin.
- Adjust poses if needed: MetaHumans and mannequins might have different rest poses (e.g., A-pose vs T-pose). In the Retargeter, you can use the Edit Pose mode to adjust the target (MetaHuman) pose to better match the source’s pose. This prevents things like arms or legs twisting incorrectly. For example, ensure the MetaHuman’s arms are in a similar angle as the mannequin’s arms.
- Preview and export animations: In the Retargeter window, you can select an animation from the Asset Browser (like ThirdPersonIdle, Walk, Jump from the template) and preview it. The mannequin (source) and MetaHuman (target) should both play it. If everything looks good, click Export Selected Animations. This will create new Animation assets for the MetaHuman skeleton. Typically, you’d export all the base animations used by the ThirdPerson AnimBP (idle, walk, run, jump, etc.).
- Retarget the Animation Blueprint: UE5 also allows retargeting an entire Animation Blueprint in one go. You can right-click the mannequin’s Animation Blueprint (e.g., ABP_ThirdPersonCharacter) and choose Retarget. The engine will prompt for a target skeleton (choose your MetaHuman’s skeleton) and create a duplicated AnimBP plus all needed animations. This is a quick way to get a working AnimBP for the MetaHuman, using the animations you just exported.
- Assign the new AnimBP to your MetaHuman character. Open the MetaHuman’s blueprint (or the ThirdPersonCharacter if you replaced the mesh there) and set the Animation Class to the new AnimBP you got in the previous step.
After retargeting, your MetaHuman has its own set of animations identical to the mannequin’s, just adapted to its skeleton. Now the MetaHuman can perform the same actions. Epic’s official documentation confirms that any animation asset using the UE4 mannequin skeleton can be retargeted to any MetaHuman using this method. In practice, this means not only the template animations, but also any marketplace animations built on the Epic mannequin can be transferred (more on that later).
Once retargeting is done, hitting play should show your MetaHuman animated correctly. If you see a T-pose or broken animation, double-check that the AnimBP is assigned and that the retargeter had all bones mapped correctly. We’ll cover pose fixes next.
How do I set up facial animation after replacing the mannequin with MetaHuman?
When you replace the mannequin with a MetaHuman, you gain a sophisticated facial rig – but by default, the mannequin’s animations do not include facial movements, so your MetaHuman’s face will remain neutral unless you animate it. Setting up facial animation depends on your project’s needs:
- For gameplay characters (in-game): Typically, you might not need complex facial animation for the player character at all times. However, you could set up some basic facial idles (blinking, subtle head movement) to make them feel alive. One way is to use the MetaHuman facial rig’s Control Rig in an AnimBP or Anim Layer to add random eye blinks or idle expressions. Another way is using the Facial Pose Library that comes with MetaHumans – a set of premade facial poses (happy, angry, etc.) that you can trigger via code or animation blueprint if certain events happen.
- For cinematics or cutscenes: You’ll want full control of the face. UE5 provides a MetaHuman Facial Control Rig which you can use in Sequencer. In the Sequencer, you can add your MetaHuman and activate its facial control rig, giving you dozens of controls (jaw, brows, eyes, etc.) to pose and animate the face keyframe by keyframe. Epic has tutorials on using the MetaHuman Facial Rig for animation. This is great for pre-rendered sequences or any scripted moments where the character speaks or emotes.
- Live facial capture: A powerful feature is using Live Link Face (an iPhone/ARKit based app) to drive the MetaHuman’s face in real time. If you have an iPhone with Face ID, you can use Epic’s Live Link Face app to capture your facial performance and stream it onto the MetaHuman. The MetaHuman’s facial rig is compatible with ARKit blendshapes, so it will reproduce your expressions and mouth movements. Animating MetaHumans with Live Link is covered in documentation, showing how an iPhone’s TrueDepth camera can capture and apply facial animation in real time. This is ideal for real-time puppeteering or recording dialogue.
- MetaHuman Animator: In newer releases, Epic introduced MetaHuman Animator (which uses video or an iPhone capture to produce a high-quality facial animation asset). If you have UE 5.2+ and the MetaHuman Plugin, you can record an actor’s face and use MetaHuman Animator to generate an animation that you can play on your MetaHuman’s face. This is a more advanced workflow but yields very realistic facial movement quickly.
- Blendspace or Morph Targets via Blueprint: For gameplay, you might not go the mocap route. You can still trigger facial expressions using morph target curves. For example, you could make your character smile when accomplishing something by driving the “smile” blendshape on the MetaHuman. This requires knowing the names of the facial morph targets (which follow a standard naming from the MetaHuman Facial rig).
To set up facial animation after the swap, decide on your approach and ensure the necessary asset is in place. If you want idle facial animation continuously, you might create an AnimBP that blends in a facial animation sequence (like a looping blink animation). If cinematic, you’d animate in Sequencer. If using Live Link, follow Epic’s setup guide (ensure the LiveLink plugin and Apple ARKit plugins are enabled, and connect the app to the editor).
Remember, the mannequin had no face, so none of the default content covers facial movement. It’s on you to add this layer. The good news is the MetaHuman facial rig is production-quality – you can make it talk and emote at the fidelity of digital humans in films, if you invest the time.
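A minimal version of the "idle blink" idea is a timer-driven curve value that you would feed into the eye-blink morph target (for example from an Anim Blueprint). The sketch below is engine-agnostic Python illustrating only the curve shape; the interval and duration values are assumptions to tune.

```python
import math

def blink_weight(time_s, interval_s=4.0, blink_duration_s=0.2):
    """Return a 0..1 curve value for an eye-blink morph target.

    A blink starts every `interval_s` seconds and lasts
    `blink_duration_s`, closing and reopening along a sine arc.
    """
    t = time_s % interval_s
    if t >= blink_duration_s:
        return 0.0                       # eyes open between blinks
    phase = t / blink_duration_s         # 0..1 across the blink
    return math.sin(phase * math.pi)     # 0 -> 1 -> 0
```

Sampling this each frame with game time and writing the result to the blink curve produces a periodic blink without any authored animation; randomizing the interval per cycle makes it look less mechanical.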

Is it possible to replace the UE5 mannequin in a first-person project with MetaHuman?
Yes, it’s possible, though the workflow differs slightly from the third-person case. The first-person template in UE5 uses a pair of arms (and no full body by default) for the player character, whereas a MetaHuman is a full body with head. Here are considerations and steps:
- Decide on Full Body vs. Arms Only: Do you want a full-body MetaHuman that can be seen in mirrors or when looking down, or just arms for the first-person view?
- Full Body: You can use the full MetaHuman and position the camera at head height. Open the FirstPersonCharacter blueprint and replace the mesh with the MetaHuman’s body, similar to third-person. You’ll need to attach the camera to the head socket or position it appropriately. Also hide the MetaHuman’s head or upper body for the owning client to avoid the camera clipping into the face (this can be done by setting owner no-see on the head/upperbody material, or using a separate mesh for first-person). The legs and shadow will be visible when looking down, which can increase immersion.
- Arms Only: If you prefer the classic arms-only approach (to avoid clipping and improve performance), you could try to use the MetaHuman’s arms. MetaHumans don’t ship with an arms-only mesh, but you can create one by exporting the MetaHuman skeletal mesh and isolating the arms in Blender/Maya. A simpler hack: use the full MetaHuman, but hide all bones except the arms. For instance, set the visibility of the torso, head, etc. to hidden (if the MetaHuman blueprint allows that by separating meshes).
- Animation retargeting for arms: The first-person template animations (like arms swinging, shooting, etc.) would need retargeting to the MetaHuman skeleton (which includes arms). The process is similar, but you’re focusing on arm animations. If you use the full body MetaHuman, you can still retarget the full-body animations (even though first-person only shows arms, the full body can play the animation for consistency).
- Camera setup: In the FirstPersonCharacter blueprint, after swapping to MetaHuman, attach the CameraComponent to the MetaHuman’s head bone (for full body) at eyes level. Adjust the position so the view looks correct. If using arms only, you’d keep the camera where it was and just ensure the arms are positioned in front of it appropriately.
- Mesh visibility: For a full-body approach, set the MetaHuman’s head mesh to Owner No See = true, so the local player doesn’t see their own face. Optionally do the same for the torso if needed, so only arms are seen. Other players in multiplayer will see the whole body normally.
- Testing: Run the game in first-person view and check for any clipping (if you see inside the head, you missed hiding something). Check that animations like jumping or crouching work – you might need to adjust the capsule or camera relative locations after replacing the mesh.
Replacing the mannequin in first-person is definitely doable but tends to be a bit more involved than third-person because of the first-person specific considerations. Many developers opt to use a full-body MetaHuman for things like cutscenes or third-person views, and a separate simpler mesh for first-person view for performance and simplicity. Nonetheless, with the above approach, your first-person character can indeed be a MetaHuman, bringing arms, legs, and even a shadow that looks human.
How do I optimize performance after adding a MetaHuman to my game project?
Here are tips to keep performance in check:
- Use LODs (Levels of Detail): MetaHumans ship with multiple LODs, and the LOD Sync component automatically reduces detail at distance. Adjust LOD switch distances in the MetaHuman’s assets, make sure hair LODs cover all levels so hair doesn’t pop or vanish, and force lower LODs on distant MetaHumans to save performance.
- Optimize hair: Strand-based grooms (hair, beards) are performance-heavy. Swap grooms for hair cards during gameplay, reduce hair strand density in the groom settings, configure hair LODs so hair doesn’t disappear at distance, and reserve forced high-detail LODs for cinematic close-ups.
- Texture streaming and resolution: MetaHumans use high-resolution textures (skin, clothing). Downscale 8K textures to 2K where the camera stays distant, adjust the texture streaming pool in Project Settings, and monitor texture memory to avoid overruns.
- Skeleton complexity: MetaHumans have many more bones, especially facial bones. Disable unneeded morph targets for background characters, turn off facial animation where it’s unused, and keep an eye on total bone count when several MetaHumans are on screen.
- Animation Blueprint overhead: The MetaHuman AnimBP (especially with IK and a facial control rig) can be heavier than the mannequin’s. Remove unused features such as hand IK, profile the AnimGraph to find bottlenecks, and avoid full-body IK solvers you don’t need.
- Master Pose vs. individual meshes: The MetaHuman blueprint separates body parts into multiple meshes. Using a Master Pose component keeps them synchronized cheaply and reduces animation cost; merging the meshes can cut draw calls further but limits customization, so weigh performance against flexibility.
- Test on target hardware: Always profile your MetaHuman in-game on the target platform (PC, console). Use medium-quality MetaHumans for less demanding scenes and simplified proxy characters for distant crowds.
- Physics and cloth: Chaos Cloth on clothing adds CPU cost. Simplify or disable cloth simulation on minor garments, or replace it with animated joints, and tune the physics settings for performance.
In summary, a single MetaHuman is manageable on modern hardware, but multiple MetaHumans or lower-end targets require careful LOD management and optimization to maintain performance.
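The LOD-by-distance behavior described above boils down to picking an index from a list of distance thresholds. This Python sketch illustrates the selection logic only; the threshold distances are illustrative assumptions (Unreal units are centimeters), and in practice the LOD Sync component and screen-size settings do this for you.

```python
def pick_lod(distance_cm, thresholds=(300.0, 1000.0, 3000.0)):
    """Return a LOD index from camera distance: 0 is full detail.

    Thresholds are assumed example values -- tune per project in the
    MetaHuman's LOD settings rather than hardcoding distances.
    """
    for lod, limit in enumerate(thresholds):
        if distance_cm < limit:
            return lod
    return len(thresholds)              # furthest: cheapest LOD
```

A character 5 meters away gets LOD 0 (full grooms and facial rig), while one 50 meters away drops to the cheapest LOD, which is exactly the behavior you want to verify when profiling crowds of MetaHumans.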

How do I handle clothing and accessories when using a MetaHuman instead of the mannequin?
Here’s how to manage clothing and gear:
- Using MetaHuman default clothing: The MetaHuman you downloaded comes with a default outfit from MetaHuman Creator that is ready to use. You can mix and match by exporting your MetaHuman again with different outfit parts; this requires minimal setup and is ideal for quick integration.
- Attaching accessories: Accessories you used on the mannequin (say a helmet or backpack from the Unreal Marketplace) can still be attached to the MetaHuman. Attach them to the equivalent MetaHuman bones (e.g., the head bone), adjust their position for the MetaHuman’s different head and body proportions, and use AttachComponentToComponent in Blueprints as before.
- Custom clothes from the Marketplace or elsewhere: A skeletal-mesh outfit made for the mannequin won’t automatically fit the MetaHuman. Re-skin it to the MetaHuman skeleton in a DCC tool (Blender/Maya), use UE 5.3’s Skeletal Mesh Editor to adjust the fit, or use third-party tools such as MetaTailor that auto-fit clothing.
- Hiding the body under clothes: When you put custom clothing on a MetaHuman, hide the body mesh underneath to prevent clipping — for example, via the body material’s masking attributes that disable torso skin under a shirt. This mirrors how mannequin outfits are typically handled.
- Accessories and physics: For a coat, jewelry, or other dangling items, attach them to the appropriate bones (using MetaHuman bone names such as spine_04) and give them a Physics Asset for dynamic movement so they follow animation convincingly.
- Facial accessories: MetaHuman faces animate, so attach glasses or goggles to the head (or a face) bone that moves correctly with expressions, and adjust their position so they sit naturally without breaking facial animation.
In summary, MetaHuman clothing can be managed using default outfits, retargeted Marketplace assets, or third-party tools, with careful rigging to ensure compatibility and visual quality.
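Re-attaching mannequin accessories mostly comes down to knowing which MetaHuman bone corresponds to the socket you used before. As a rough illustration (plain Python, not Unreal's API; the mapping below lists only a few common bone names and is an assumption you should verify against your actual skeleton assets), a lookup table like this can drive a batch re-attach pass:

```python
# Hypothetical mannequin-socket -> MetaHuman-bone lookup for re-attaching
# accessories. Bone names follow common UE conventions; verify them against
# your skeleton assets before relying on this table.
MANNEQUIN_TO_METAHUMAN = {
    "head": "head",
    "hand_r": "hand_r",
    "hand_l": "hand_l",
    "spine_03": "spine_04",  # MetaHuman has extra spine joints
}

def target_bone(mannequin_socket: str) -> str:
    """Return the MetaHuman bone to attach to, or raise if unmapped."""
    try:
        return MANNEQUIN_TO_METAHUMAN[mannequin_socket]
    except KeyError:
        raise KeyError(f"No MetaHuman mapping for '{mannequin_socket}'; "
                       "add it after inspecting the skeleton.")

print(target_bone("spine_03"))  # -> spine_04
```

Unmapped sockets fail loudly rather than silently attaching to the wrong bone, which is usually what you want when porting a pile of accessories.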
Can I still use Marketplace animations after switching to Metahuman?
Yes. One of the benefits of retargeting is that you can continue to use any animations designed for the Epic mannequin on your MetaHuman. Marketplace animation packs almost universally target either the UE4 mannequin skeleton or the newer UE5 mannequin. Since you have already retargeted the base poses to your MetaHuman, you can retarget additional animations in the same way:
- If you have a pack of dances, fights, or other moves for the mannequin, run those animations through the IK Retargeter asset you already set up. The retargeter saves your mapping configuration, so you can load a batch of animations, convert them, and export the results for the MetaHuman in one pass.
- If the animations target the UE4 mannequin and you started with UE5, note that UE5's "Manny" skeleton is slightly different from UE4's (though they share many similarities). Either retarget the UE4 animations to UE5 Manny first using the built-in UE4-to-UE5 retargeter and then apply your MetaHuman retargeting, or set up a retargeter that converts UE4 animations directly to the MetaHuman.
- Animation Blueprint logic: If a marketplace asset comes with its own AnimBP (say for locomotion or combat), retarget that AnimBP to the MetaHuman as well (using the retarget AnimBP function) and integrate the new AnimBP into your character setup. This works even for complex systems like climbing.
- Consider root motion and scale: MetaHumans are generally human-scale but can differ slightly from the mannequin. Retargeting preserves root motion, but test the results for stride-length or foot-sliding issues and adjust the retargeter settings until the motion looks natural.
- Facial animations from marketplace: If any marketplace content provides facial animation (unlikely, as packs are usually body-only), it won't directly apply, since the mannequin had no face. Facial animation must be handled separately through the MetaHuman facial rig.
The takeaway is that all your investment in animations is preserved, and Marketplace animations can be reliably retargeted to MetaHumans with proper setup and testing.
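Batch-converting a marketplace pack is mostly bookkeeping: feed each source animation through the same retargeter and save the result under a new name. A small sketch of that bookkeeping (plain Python; the `_MH` suffix convention is just an assumption, and in the editor the actual conversion happens via the IK Retargeter's batch export, not this script):

```python
def plan_retarget(source_anims, existing_assets, suffix="_MH"):
    """Decide which animations still need retargeting and what to call them.

    Returns (to_convert, skipped): (source, target_name) pairs for animations
    not yet converted, plus the list of sources already done.
    """
    to_convert, skipped = [], []
    for anim in source_anims:
        target = anim + suffix
        if target in existing_assets:
            skipped.append(anim)          # already retargeted earlier
        else:
            to_convert.append((anim, target))
    return to_convert, skipped

pack = ["Anim_Dance_01", "Anim_Punch", "Anim_Roll"]
already_done = {"Anim_Punch_MH"}
todo, skipped = plan_retarget(pack, already_done)
print(todo)     # [('Anim_Dance_01', 'Anim_Dance_01_MH'), ('Anim_Roll', 'Anim_Roll_MH')]
print(skipped)  # ['Anim_Punch']
```

Tracking what has already been converted matters once packs grow to dozens of clips, since re-exporting duplicates clutters the content folder.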

How do I replace the mannequin in a cinematic sequence with a Metahuman?
Here’s how to replace the mannequin in a cinematic sequence with a MetaHuman:
- Add the MetaHuman to the level: First, make sure the MetaHuman is in your level: drag BP_MetaHuman in, or add it as a spawnable in Sequencer. This establishes the character in the scene.
- Attach to the existing sequence: If your sequence was animating a mannequin Skeletal Mesh Actor, replace that actor's binding with the MetaHuman and copy the tracks and keys across. This transfers the existing animation while keeping the sequence intact.
- Retargeted animation asset: If the sequence used an animation asset (from the third-person template or elsewhere) on the mannequin, retarget that animation to the MetaHuman skeleton as discussed earlier, then assign the new asset to the MetaHuman's track so it plays back correctly.
- Use Control Rig for refinement: One advantage of the swap is that you can now leverage the MetaHuman's Control Rig in Sequencer to tweak the animation: add a Control Rig track for fine adjustments, use the FK or IK rigs for body tweaks, and animate expressions with the Facial Control Rig.
- Lighting and materials: Ensure the MetaHuman's shaders have compiled and look good under your cinematic lighting. Adjust material settings for close-up quality and consider forcing the highest LOD for hero shots.
- Camera framing: MetaHuman proportions may be slightly different from the mannequin's, so check your camera framing after the replacement and adjust for the new head height and body scale.
- Test playback: Run the sequence and verify the MetaHuman performs every action correctly. Check for intersections with other characters or props, and adjust timing where differences in body shape cause problems.
- Facial animation: If your cinematic has dialogue, you'll now want to animate the face: use the facial pose library for expressions and keyframe the control rig for lip-sync. This adds realistic facial movement and emotional impact.
In summary, replacing the mannequin in Sequencer is straightforward, involving actor swapping and animation retargeting, allowing MetaHumans to enhance cinematic realism with minimal adjustments.
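Conceptually, the Sequencer swap is a re-bind: the tracks and keys stay put, only the bound actor changes. A toy model of that idea (plain Python dictionaries standing in for Sequencer bindings; in the editor you would use Sequencer's "Assign Actor" or copy-paste the tracks instead of running a script like this):

```python
def rebind(sequence: dict, old_actor: str, new_actor: str) -> dict:
    """Move every track bound to old_actor onto new_actor, keys intact."""
    bindings = dict(sequence["bindings"])
    if old_actor not in bindings:
        raise KeyError(f"{old_actor!r} is not bound in this sequence")
    # The track list travels unchanged: keyframes are untouched by a re-bind.
    bindings[new_actor] = bindings.pop(old_actor)
    return {**sequence, "bindings": bindings}

seq = {"bindings": {"Mannequin": ["AnimTrack", "TransformTrack"]}}
seq = rebind(seq, "Mannequin", "BP_MetaHuman")
print(seq["bindings"])  # {'BP_MetaHuman': ['AnimTrack', 'TransformTrack']}
```

The point of the model: because only the binding changes, anything keyed on the old actor (transforms, animation sections, attach tracks) survives the swap, which is why the retargeted animation asset is the only piece you have to replace by hand.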
What are common problems when replacing the mannequin with a Metahuman?
Here are the most common problems and quick solutions:
- MetaHuman is T-posing in game: This means the animations aren't being applied. Make sure the correct AnimBP is assigned to the MetaHuman's mesh, the right character Blueprint is in use, and the retargeting actually completed.
- Animations look twisted or off: For instance, arms raised too high or feet sliding. Adjust the retarget rest pose so the skeletons align, verify the IK chain mappings in the retargeter, and check the root motion and IK settings.
- MetaHuman's hair/cloth disappears at distance: The lower LODs don't include that asset. Use grooms with full LOD support, select an alternative hair asset if needed, and avoid forcing a high LOD everywhere, which hurts performance.
- Performance drops/stutters: A big performance hit after switching is often shaders compiling or simply the MetaHuman's complexity. Wait for shader compilation to finish, consider medium-quality MetaHumans, and apply LOD management to stabilize performance.
- Crashes when assigning skeleton: Using "Assign Skeleton" to force the mannequin skeleton onto the MetaHuman mesh can crash or throw errors. Use retargeting instead, as Epic's workflow recommends, and avoid direct skeleton reassignment.
- Face is static: The MetaHuman has an uncanny blank stare. Add idle facial animation, for example by looping the MetaHuman's blink animation asset, to keep the face lifelike.
- Parts of the body not moving: If the body animates but the eyes or another sub-component don't follow, it's usually a Master Pose issue. Make sure the sub-meshes follow the body via the Master Pose setup and are properly attached in the Blueprint.
- Incorrect scaling: Retargeting should handle animation scale, but if your MetaHuman is much taller or shorter than the mannequin, you may still need to adjust the capsule size, camera height, and similar settings in the character blueprint.
- Finger IK or grip not perfect: Minor skeleton differences can keep fingers from gripping a weapon perfectly (for example, a gun set up for the mannequin). Adjust socket positions, fine-tune the IK settings, or use Control Rig to tweak the hand pose.
- Lighting or materials look different: MetaHumans use skin shaders that react to light differently than the plain material on the mannequin. Adjust your scene lighting and tweak material parameters such as roughness until the look is consistent.
Most issues can be resolved with careful retargeting and blueprint setup, with Epic’s MetaHuman FAQ and community forums providing solutions for common problems.
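If you debug these symptoms repeatedly, it can help to encode the checklist so the first diagnostic step is never forgotten. A trivial symptom-to-first-check table (plain Python; the symptom keys are invented labels, and the entries simply restate the list above):

```python
# First thing to check for each common symptom, condensed from the list above.
FIRST_CHECK = {
    "t_pose": "Is a valid Animation Blueprint assigned to the mesh?",
    "twisted_limbs": "Does the retarget rest pose align both skeletons?",
    "bald_at_distance": "Do the hair groom's LODs include the asset?",
    "static_face": "Is an idle facial animation (e.g. blinks) playing?",
    "frozen_eyes": "Are sub-meshes driven by the body via Master Pose?",
}

def triage(symptom: str) -> str:
    return FIRST_CHECK.get(symptom, "Unknown symptom; check Epic's MetaHuman FAQ.")

print(triage("t_pose"))
```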

Can I use custom Metahumans to replace the default mannequin?
Yes. “Custom MetaHumans” (i.e., ones you create in MetaHuman Creator with your own unique face, body, hair, and clothing) are fully compatible with this process. The steps to replace the mannequin do not depend on a specific MetaHuman; whether you use one of Epic’s premade presets (like the sample characters) or a MetaHuman you personalized, it works the same way.
Important points regarding custom MetaHumans:
- Same skeleton: All MetaHumans, regardless of facial appearance, share the same base skeletal structure for a given body type. Your retarget setup therefore applies to every MetaHuman of the same body type without changes, which makes animation reuse simple.
- Download and import each character: If you want to use multiple custom MetaHumans (say, for different characters in your game), import each via Bridge. Each arrives with its own assets under Content/MetaHumans, and your retarget assets apply to all matching skeletons.
- Customization after import: If you realize you want to tweak the MetaHuman's look (a different eye color or face shape), you have to go back to MetaHuman Creator, adjust, and re-download. Minor material tweaks are possible directly in Unreal, but it's best to finalize the design in Creator beforehand.
- Mixing custom and default: You can absolutely have one character be a MetaHuman while others remain mannequins or other characters, for example a MetaHuman protagonist with simpler NPCs for performance. Nothing forces you to convert every character.
Custom MetaHumans shine in cinematics and as main characters, following the same retargeting pipeline as default MetaHumans, with minor animation tweaks for unique proportions.
How can PixelHair be used to customize the Metahuman after replacing the mannequin?
Here’s how PixelHair can come into play:
- What is PixelHair?: It's essentially a library of premade 3D hair grooms: realistic braids, dreadlocks, and other styles designed for Blender and Unreal compatibility, exportable as grooms or hair cards. It greatly expands the hairstyle options beyond MetaHuman Creator's defaults.
- Using PixelHair with MetaHuman: The typical workflow is to fit the PixelHair asset to the MetaHuman head in Blender, export it as an Alembic groom (or FBX hair cards), import it into Unreal as a Groom asset, and attach it to the MetaHuman's head bone in the Blueprint.
- Advantages of PixelHair: You get a unique look, and the hair models are often optimized with full LODs (unlike some Creator grooms), so the hair stays visible at a distance. The hair-card versions also offer good performance.
- Performance: Some PixelHair assets are hair cards, which can be faster than strand-based grooms. Choose lower-poly hairstyles where performance matters, and test the hair in motion for smoothness to balance quality and efficiency.
- Example: Say your MetaHuman replaced the mannequin as the main character, but you want a specific braided hairstyle to fit your game's aesthetic. Select the PixelHair braid asset, fit and export it using the provided tools, and attach it to the MetaHuman in Unreal for a distinctive appearance.
In essence, PixelHair and similar tools expand what you can do after the basic swap, helping create unique MetaHuman appearances with optimized, customizable hair assets.

What is the difference between mannequin and Metahuman animation workflows?
While the mannequin and MetaHuman ultimately use the same Unreal Engine animation system, there are a few workflow differences to be aware of:
- Skeleton complexity: The mannequin's skeleton is simple (around 70 bones for the UE4 mannequin), while a MetaHuman has 342+ body bones plus 875 facial bones for far finer control. Retargeting manages the extra bones and standard animations drive the shared core bones, but the larger skeleton demands more careful handling.
- Animation Blueprint: The default ThirdPerson AnimBP for the mannequin drives body movement with a simple state machine (Idle/Walk/Run, Jump, etc.). MetaHumans may use separate body and facial AnimBPs, with the facial blueprint handling blinks and expressions, so integration takes some additional setup.
- Retargeting step: With the mannequin, you often use animations directly (marketplace animations are plug-and-play). MetaHumans require retargeting each animation asset through the IK Retargeter, which adds an upfront time investment in exchange for compatibility.
- Live retarget vs. baked animation: MetaHumans offer runtime retargeting (copying the pose from a hidden mannequin via the "Retarget Pose from Mesh" node), whereas the mannequin simply plays authored animations. You can choose between the live and baked workflows depending on your needs.
- Facial animation: The mannequin workflow had no facial animation at all. MetaHumans support facial mocap and control rig animation, such as keyframing expressions in Sequencer for dialogue, which adds a significant but rewarding workflow layer.
- Tools: The same animation tools (Sequencer, Control Rig, etc.) apply to both, but MetaHumans add body and facial Control Rigs for finer adjustments, giving you much more animation flexibility than the mannequin offers.
- Physics: The mannequin's physics asset is simple capsules, while MetaHumans ship with more detailed physics for cloth and hair that you can tune for realistic secondary motion. This is extra setup the mannequin never needed.
In summary, MetaHuman workflows involve more setup with retargeting and facial animation but offer higher fidelity, while mannequin workflows are simpler and more rigid.
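The skeleton-complexity gap is easy to quantify, since raw per-frame pose data scales linearly with bone count. A back-of-the-envelope comparison (plain Python; 48 bytes per bone assumes an uncompressed 12-float transform, which is an assumption for illustration only, as engines compress animation heavily in practice):

```python
BYTES_PER_BONE = 12 * 4  # 12 floats (translation+rotation+scale) at 4 bytes each

def pose_bytes(bones: int) -> int:
    """Raw, uncompressed size of one pose for a skeleton with `bones` bones."""
    return bones * BYTES_PER_BONE

mannequin = pose_bytes(70)         # UE4 mannequin, roughly 70 bones
metahuman = pose_bytes(342 + 875)  # MetaHuman body + facial counts cited above

print(mannequin, metahuman, round(metahuman / mannequin, 1))
# A MetaHuman pose carries roughly 17x the raw data of a mannequin pose.
```

The absolute numbers are not what the engine actually stores, but the ratio explains why evaluating and blending MetaHuman poses costs noticeably more than mannequin poses.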
Where can I find tutorials on replacing the mannequin with a Metahuman in Unreal Engine 5?
There are numerous resources available to help you with this process, including official documentation and community tutorials. Here are some reputable sources and tutorials:
- Epic Games Official Documentation: The Epic Developer Community has a dedicated guide on Retargeting Mannequin Animations to MetaHumans and a step-by-step on using a MetaHuman in a third-person game. These guides cover IK Rig setup and gameplay integration, and as Epic's own documentation they are the most reliable starting point for beginners.
- Epic Official Video Tutorials: Epic's Unreal Engine YouTube channel regularly posts tutorials and livestreams. Videos like "Using the MetaHuman Facial Rig" and streams on MetaHuman integration provide practical, accessible setup guidance straight from the source.
- Community Video Tutorials (YouTube): Several Unreal Engine content creators have covered this exact topic. Mizzo's "How To Replace The Mannequin With A Metahuman" demonstrates live retargeting, Jobutsu's retargeting tutorials apply directly to MetaHumans, and many others cover specific parts of the setup with quick, practical guidance.
- Unreal Engine Forums: The forums are a goldmine for Q&A. Threads like "Retargeting into MetaHuman on UE5" troubleshoot specific issues, and searching for "MetaHuman ThirdPerson tutorial" turns up step-by-step community advice and real-world solutions.
- Unreal Engine AnswerHub/Reddit: On r/unrealengine and the old AnswerHub you'll find questions from others doing the same swap, including threads clarifying skeleton differences and technical Q&A on common issues, which complement the official resources.
- Blender & MetaHuman Clothing Tutorials: If you're interested in custom clothing or advanced customization, check out community tutorials like ORTyOW's "Custom Clothes for MetaHuman", which teaches Blender clothing rigging and simplifies custom outfit workflows for post-replacement polish.
- PixelHair Guides: Yelzkizi (the creator of PixelHair) provides documentation and video guides on using their hair with MetaHumans, which makes PixelHair integration and unique character styling easier.
- "MetaHuman as Player" Tutorials: Search for that phrasing. Tutorials titled along the lines of "Making MetaHuman player character" cover retargeting and overlap heavily with the mannequin-replacement steps, with a focus on gameplay integration.
- Learning Courses: Beyond the free content, platforms like Udemy or ArtStation Learning may offer structured courses on MetaHuman integration. The free resources are usually sufficient, but paid courses can add depth for advanced users.
In summary, leverage Epic’s official documentation, video tutorials, and community resources like forums and YouTube for comprehensive guidance on replacing the mannequin with a MetaHuman in Unreal Engine 5.
In the Sources and Citations below, we’ve listed some of these resources with links, so you can easily navigate to them for more detailed instructions and examples.

FAQ: Replacing the Mannequin with a Metahuman
- Is MetaHuman free to use in Unreal Engine 5?
Yes – MetaHumans are free for all Unreal Engine users. You just need an Epic Games account. You create or download MetaHumans via the free MetaHuman Creator and Quixel Bridge. There’s no additional cost or royalty to use them in your Unreal projects.
- Do I need to know how to animate to use a MetaHuman as my character?
Not necessarily. You can use existing animations (from the Third Person template or marketplace) and retarget them to your MetaHuman. You don’t have to manually animate the character. However, if you want facial animations or very custom movements, you might need to animate those or use motion capture. For basic movement (running, jumping, etc.), no extra animation skills are required beyond using the retarget tool.
- Why is my MetaHuman stuck in T-pose after I replaced the mannequin?
This usually means the MetaHuman isn’t using an animation blueprint or the retargeting wasn’t applied. Make sure you have assigned a valid Animation Blueprint to the MetaHuman’s Skeletal Mesh component. If you retargeted the ThirdPerson AnimBP, use that. If you’re using live retargeting, ensure the setup (Retarget Pose node, hidden source mesh) is correct. Basically, a T-pose MetaHuman means “no animation data is driving it.”
- What is the IK Retargeter, and do I really need to use it?
The IK Retargeter is UE5’s tool for transferring animations between skeletons. You’ll use it to convert mannequin animations to MetaHuman animations. It’s not strictly “mandatory” if you use the live-retarget method (which applies animations on the fly), but under the hood even that uses an IK retargeting asset. So in practice, yes – you’ll interact with the IK Retargeter either directly (exporting animations) or indirectly (live retarget node) to get your MetaHuman moving.
- Will using a MetaHuman slow down my game?
It can, as MetaHumans are more detailed. A MetaHuman has more polygons, more bones, and complex materials (especially hair and skin). If you have one MetaHuman as the player, most modern PCs or consoles handle it fine. But if you fill a scene with many MetaHumans, you could see a performance drop. You can mitigate this with Level of Detail settings, simplifying hair, and other optimizations (see the performance section above). Always profile your game – a single MetaHuman typically runs well on decent hardware, but it’s heavier than the simple mannequin.
- Can I use a MetaHuman in a first-person game?
Yes, you can. You might use the full MetaHuman character and attach the camera to their head for a body-aware first-person view (hiding the head mesh to prevent clipping). Or you can isolate the arms and use those for a traditional FPS arms-only view. It requires some setup (camera placement, possibly hiding parts of the mesh), but it’s definitely possible and many have done it for immersive first-person projects.
- Do I need to change any code or blueprint logic when switching to a MetaHuman?
Not fundamentally. The character movement and input code can remain the same. You’ll only adjust the character Blueprint to use the new mesh and AnimBP. If your game logic references specific skeleton sockets or bone names, make sure those exist on the MetaHuman (for example, “hand_r” bone exists on MetaHuman so weapon attachment logic will still work). But generally, no major code changes – it’s mostly asset swapping and retargeting.
- Can I use multiple MetaHumans in the same project (e.g., different MetaHuman for different characters)?
Absolutely. You might have your main character as one MetaHuman and a handful of others as NPCs or extras. Just import each MetaHuman you need. You can retarget the animations to each if they use different skeletons (male vs female MetaHuman skeletons might be slightly different). Manage performance accordingly if many high-quality characters are on screen. But UE can handle multiple MetaHumans; they’re just skeletal meshes and assets in the content folder.
- How can I add facial expressions or lip-sync to my MetaHuman during gameplay?
Facial animation can be done through several methods. For lip-sync, you could animate the face in Sequencer for cutscenes, or use a Live Link Face app to drive it in real-time for dialogue. There’s also a MetaHuman facial pose library you can tap into for preset expressions (happy, angry, etc.) and trigger those via Animation Montages or blueprints. If you want automated lip-sync from dialogue text or audio, that’s more complex – you’d need to drive the ARKit blendshapes via a lip-sync plugin or tool (third-party solutions exist). But for most purposes, either live mocap or hand animation via control rig are used for facial movements on MetaHumans.
- Does this process work in Unreal Engine 4 as well?
MetaHumans were introduced in UE4 (around 4.26) with a different workflow (using the Retarget Manager). The general idea of replacing the mannequin with a MetaHuman is possible in UE4, but UE5 greatly streamlined it with the IK Retargeter. If you’re on UE4, you would use the Animation Retarget Manager to map bones between the UE4 mannequin and MetaHuman, then retarget animations. It’s a bit more manual. So yes, it’s possible in UE4, but the steps differ. UE5 is recommended for the best MetaHuman experience (plus many MetaHumans now target UE5 skeletons by default).
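The LOD mitigation mentioned in the performance answer amounts to a distance-to-detail mapping. A sketch of the idea (plain Python; the thresholds are made-up values, and in Unreal you would tune the Screen Size setting per LOD on the skeletal mesh rather than hardcode distances):

```python
# Hypothetical distance thresholds (metres) -> LOD index, nearest first.
LOD_THRESHOLDS = [(5.0, 0), (15.0, 1), (30.0, 2)]

def pick_lod(distance_m: float, coarsest: int = 3) -> int:
    """Choose a LOD index: nearer characters get more detail."""
    for max_dist, lod in LOD_THRESHOLDS:
        if distance_m <= max_dist:
            return lod
    return coarsest  # everything beyond the last threshold uses the cheapest LOD

print([pick_lod(d) for d in (2, 10, 20, 100)])  # [0, 1, 2, 3]
```

The same idea is why a crowd of MetaHumans can still be affordable: only the one or two closest to the camera pay the full skeleton and material cost.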

Conclusion
Replacing the UE5 mannequin with a MetaHuman opens up a world of realism for your project. We’ve covered the step-by-step process: from importing your MetaHuman and retargeting animations, to integrating it in the ThirdPersonCharacter blueprint, handling first-person setups, and adding the polish of facial animation, custom hair, and clothing. The key takeaways are:
- Preparation and Retargeting: Import your MetaHuman via Quixel Bridge and use UE5’s IK Rig and Retargeter to transfer the mannequin’s animations. This is the cornerstone of making the MetaHuman move correctly.
- Blueprint Integration: You can slot the MetaHuman into the existing character blueprint, preserving all your game logic. Minor adjustments (like Master Pose for body parts, or camera height) ensure a seamless swap.
- Beyond the Basics: Leverage MetaHuman’s capabilities – add facial animations for expressiveness, use high-quality hair like PixelHair for customization, and consider performance by utilizing LODs and optimizations.
- Troubleshooting: Many common issues have straightforward fixes (usually involving proper retarget pose or blueprint setup). Don’t be discouraged by initial hiccups like a T-posing MetaHuman or a bald LOD – those can be resolved with the tips we provided.
- Resources: There’s a rich ecosystem of tutorials and official docs to guide you (we cited several). When in doubt, refer to Epic’s documentation or community wisdom for answers.
By following this guide, game developers and creators of all levels can confidently upgrade their default character to a stunning MetaHuman. The result is a more engaging experience – whether in a game, an architectural walkthrough, or a cinematic short – where your characters look as realistic and dynamic as the worlds you build around them.
With Unreal Engine 5’s tools, the process is quite accessible. So go ahead and swap out that mannequin – your project will literally put on a new face. Happy developing with your new MetaHuman character!
Sources and Citations
- Epic Games Documentation – Retargeting Mannequin Animations to MetaHuman: Official step-by-step guide on using IK Rig and Retargeter to transfer UE4 mannequin animations to a MetaHuman (dev.epicgames.com).
- Epic Games Documentation – Using a MetaHuman as a Third-Person Character: Explains how to set up a MetaHuman in the third-person template, including blueprint merging and using live retargeting (dev.epicgames.com).
- Epic Games Documentation – MetaHuman FAQ: Troubleshooting common MetaHuman issues (LOD hair disappearing, etc.) (dev.epicgames.com).
- Epic Games Documentation – Animating MetaHumans (Live Link, Facial Rig): Resources on animating MetaHuman faces, including using the Live Link Face app for facial capture (dev.epicgames.com).
- Unreal Engine Forums – Community Q&A and Tutorials:
- Retargeting into MetaHuman on UE5 – Discussion where users share tips on retargeting and mention not needing to manually retarget when using live retargeting (forums.unrealengine.com).
- MetaHuman to ThirdPerson Character forum thread – A user outlines their process of duplicating the MetaHuman blueprint and re-parenting to Character, as well as issues encountered (like bent knees) (forums.unrealengine.com).
- Custom Clothes for MetaHuman – Tutorial by ORTyOW – Guide on using Blender to fit any clothing model to a MetaHuman skeleton (forums.unrealengine.com).
- Changing MetaHuman skeleton to Manny – Discussion that highlights why reassigning the skeleton is problematic (errors and loss of the facial rig) (forums.unrealengine.com).
- Reddit – MetaHuman Skeleton vs Mannequin: Community answer noting the differences in bone count and structure (MetaHuman has more spine joints, corrective bones, etc.).
- PixelHair Documentation (Blender Market page) – Confirms PixelHair hairstyles can be exported to UE5 and used with any MetaHuman (blendermarket.com).
- YouTube – Mizzo’s UE5 MetaHuman Tutorial: “How to Replace the Mannequin with a MetaHuman” (covers the live retarget method in UE 5.3). Another tutorial by “Unreal Engine Mania” shows a similar process in earlier UE5 (youtube.com).