How to Create Your Own MetaHuman in Unreal Engine 5: Complete Step-by-Step Guide

Creating a MetaHuman in Unreal Engine 5 (UE5) unlocks the ability to have a fully rigged, realistic or stylized digital character ready for animation. This comprehensive guide will walk you through how to create your own MetaHuman in UE5 from start to finish. We’ll cover everything from the tools you need and accessing the MetaHuman Creator, to advanced customization (facial features, custom hair, clothing), importing into UE5, animation (including control rig and motion capture), and using MetaHumans in cinematics or games. Whether you’re a solo developer, 3D artist, or part of a studio, this step-by-step guide will help you leverage MetaHuman for both realistic and stylized characters in your UE5 projects. Let’s dive in!

What tools do I need to make a custom MetaHuman in UE5?

To get started with creating a custom MetaHuman, make sure you have the following tools and prerequisites in place:

  • Epic Games Account: You’ll need a free Epic Games account to use MetaHuman Creator. If you don’t have one, sign up on the Unreal Engine website before proceeding.
  • Unreal Engine 5: Install Unreal Engine 5 on your computer. While the MetaHuman creation process happens in a browser via the cloud, UE5 is required to download, edit, and use your MetaHuman in projects.
  • Supported Web Browser (Internet Connection): MetaHuman Creator is a cloud-based app, so a stable internet connection and a compatible web browser (Chrome, Edge, or Firefox on a capable PC/Mac) are required. The tool uses cloud streaming (Pixel Streaming), so ensure your system meets the minimum requirements for WebGL. There’s no offline mode – you must be online to create MetaHumans.
  • Quixel Bridge (Plugin or App): UE5 comes with Quixel Bridge integration. Bridge is used to download your MetaHuman into Unreal Engine. Ensure you have Bridge available (in UE5, it’s under the Add Content menu by default).
  • MetaHuman Plugin (for advanced workflows): To use Mesh to MetaHuman or MetaHuman Animator features inside UE5, enable the MetaHuman Plugin (download it from Epic’s Fab marketplace, then enable it in UE5). This plugin allows creating MetaHumans from custom meshes or video footage.
  • 3D Modeling Software (Optional): If you plan to create custom meshes (e.g. a head sculpt or clothing) or further tweak assets, software like Blender, Maya, or ZBrush can be very useful. For example, Blender with add-ons (like KeenTools FaceBuilder) can help convert a photo to a 3D head model for MetaHuman, as we’ll discuss later.
  • Motion Capture Tools (Optional): If you intend to animate via mocap, you might need an iPhone with TrueDepth camera (for MetaHuman Animator facial capture) or a body motion capture suit. These are not required for creation, but for advanced animation they are common tools.

For quick reference, here’s a table of essential tools and their purpose:

Tool / Requirement | Purpose in MetaHuman Workflow
Epic Games Account | Required to access MetaHuman Creator (cloud service).
Unreal Engine 5 | Needed to import, edit, and use MetaHumans in your project.
Web Browser + Internet | Used to run MetaHuman Creator (online app). No offline creation.
Quixel Bridge (in UE5) | Downloads the created MetaHuman into UE5, bringing in all assets.
MetaHuman UE5 Plugin | Enables Mesh to MetaHuman and MetaHuman Animator inside UE5.
3D Modeling Software (optional) | Create or edit custom meshes (head scans, clothes, etc.) for MetaHuman.
Motion Capture gear (optional) | iPhone (TrueDepth) for facial capture, or mocap suit for body animation.

With these tools ready, you’ll be equipped to create and customize your MetaHuman character.

How do you create your own MetaHuman in Unreal Engine 5?

Creating your own MetaHuman in UE5 is a straightforward process thanks to the MetaHuman Creator app provided by Epic Games. In summary, the workflow involves designing your character in the MetaHuman Creator (a cloud app) and then exporting it to Unreal Engine 5. Here’s a step-by-step overview:

  1. Launch MetaHuman Creator: Go to the MetaHuman Creator website and log in with your Epic Games account. You can also launch it via the Epic Games Launcher or from Quixel Bridge in UE5 (which opens the cloud app in your browser). If the service is busy, you might enter a short queue before the app opens (wait times are usually a few minutes).
  2. Create a New MetaHuman: In MetaHuman Creator, click the Create button to start a new character. You’ll be presented with a gallery of preset faces and body types. Select a preset as a starting point – choose one that is closest to the general look you want, since you’ll customize it extensively. After selecting, click Create Selected to generate a new MetaHuman based on that preset. This new character will appear in your My MetaHumans library on the site.
  3. Customize the Character: Now the fun part – customize your MetaHuman’s appearance. You can morph facial features (eyes, nose, jaw, etc.) by using Blend Mode and Sculpt Mode tools (more on this in the next section), choose skin complexion, eye color, hairstyle, and even set body proportions and clothing. MetaHuman Creator provides intuitive controls to achieve the look you want, whether it’s a photorealistic human or a stylized character (within the tool’s realistic bounds).
  4. Save Your MetaHuman: The MetaHuman Creator saves changes in real-time to your cloud library. Once you’re happy with the face and body, simply exit the creator. The character is automatically saved to your account (under My MetaHumans).
  5. Download to Unreal Engine: Open your Unreal Engine 5 project. Using the Quixel Bridge panel (within UE5), sign in with the same Epic account. In Bridge, find the MetaHumans section and you should see your newly created character in the library. Download the MetaHuman assets (this will fetch all the 3D models, textures, rigs, etc. – it might be several gigabytes). After the download completes, click Add or Import to bring the MetaHuman into your project. The MetaHuman will be added to your Content Browser (typically under a MetaHumans folder, with a BP_[Name] MetaHuman Blueprint).
  6. Setup in UE5: Make sure the MetaHuman Plugin is enabled in UE5 (this is often enabled by default in recent versions). Drag the MetaHuman Blueprint into your scene – you should see your fully textured character mesh. At this point, the character is ready for animation or further editing in Unreal.

That’s the basic process. In just minutes, you can go from a preset to a fully rigged MetaHuman and have it inside Unreal Engine.

The following sections will delve into more details and common questions for each part of this workflow – such as using image references, advanced face customization, importing custom assets (meshes, hair, clothing), and how to animate your MetaHuman.

How do I access the MetaHuman Creator tool in Unreal Engine 5?

MetaHuman Creator is accessed through a cloud-based app – it’s not a program you install directly in UE5, but you launch it via Epic’s services. Here’s how to access it:

  • Via Web Browser: The primary way to use MetaHuman Creator is to visit the official MetaHuman Creator website. You can get there by clicking “Launch the App” on the Unreal Engine MetaHuman page or going to the URL metahuman.unrealengine.com. Log in with your Epic Games account, select your desired Unreal Engine version (for compatibility) from the dropdown, and click Launch Latest MetaHuman Creator. This will open the MetaHuman Creator in your browser (it uses streaming; your browser will display a 3D viewport for the character).
  • Through Unreal Engine (Bridge): In UE5, you can open Quixel Bridge (usually by clicking the Quixel icon or via Window > Quixel Bridge). In Bridge, ensure you’re logged in. Navigate to the MetaHumans section and there should be an option to “Launch MetaHuman Creator”. This essentially opens the same web app, but it may automatically handle project version selection. Some users prefer launching through Bridge as it links your project version and ensures compatibility when you download the character later.
  • System Requirements: Make sure you use a supported browser (Chrome, Edge, Firefox on Windows, or Safari on Mac). MetaHuman Creator requires WebGL 2.0 and a GPU with DX12 support on Windows. If you encounter a message like “device not supported”, check that your browser is up to date and that hardware acceleration is enabled. Also note that, as of now, MetaHuman Creator is not supported on mobile devices or tablets – use a desktop/laptop.
  • Epic Games Launcher: Occasionally, Epic integrates MetaHuman Creator access via the Launcher’s UE5 tab. Ensure UE5 and Bridge are updated. If there’s a “MetaHuman Creator” launch button in the launcher, that’s another entry point (it will still open a browser).

Once launched, you might see a queue message if the server is busy (during peak times). Typically, wait times are short (under 5 minutes). After that, the MetaHuman Creator UI will appear, and you can start creating or editing MetaHumans.

Tip: If you have trouble accessing the app (e.g., it hangs on loading), try switching to a different browser or clearing your cache. Also verify that you have agreed to any Terms of Service required for MetaHuman. The first time you run it, you might need to allow it through any firewall since it streams content.

Can I create a MetaHuman from a photo or image reference?

Many creators wonder if you can generate a MetaHuman directly from a photograph – for example, to make the character look exactly like a real person. Out-of-the-box, MetaHuman Creator doesn’t have a “photo upload” feature that automatically creates a face. However, there are workflows to achieve this using external tools and the Mesh to MetaHuman feature:

  • Use a Photo as Reference (manually): You can import a reference photo into MetaHuman Creator’s background or use a second monitor for side-by-side comparison. Selecting a preset close to the person’s likeness allows manual tweaking of facial features using sculpting controls. This method relies on the user’s artistic eye to approximate the photo’s appearance. It’s a practical, low-tech option for achieving reasonable resemblance without external tools.
  • FaceBuilder for Blender (Photo to 3D Mesh): FaceBuilder in Blender generates a 3D head mesh from one or more photos, producing MetaHuman-compatible topology and UVs. The workflow involves inputting photos, creating a realistic model, and using Mesh to MetaHuman to rig it in UE5. This method captures distinctive facial details, such as birthmarks or skin texture, for transfer to the MetaHuman. It offers a high-tech, precise solution for photo-based character creation.
  • 3D Scanning or Photogrammetry: Photogrammetry tools like RealityCapture or 3D scanning equipment create high-resolution head meshes from multiple photos or scans. After cleanup for neutral expressions and proper topology, the Mesh to MetaHuman plugin converts the mesh into a MetaHuman. This approach yields highly detailed results, ideal for users with access to advanced scanning technology. Epic’s tutorials guide the process for accurate outcomes.

Limitations: Keep in mind that MetaHuman Creator enforces physically plausible human ranges. Extremely unique features might be hard to replicate exactly. Also, MetaHumans currently default to adults – creating a child’s face or very stylized/cartoon proportions is not directly supported by the Creator’s blending (more on stylization later). You might need to achieve those via a custom mesh approach.

Artists achieve convincing likenesses by combining photo-based modeling with MetaHuman’s rigging. These workflows ensure realistic MetaHumans while adhering to Creator’s realistic human constraints.

How do I customize facial features in MetaHuman Creator?

MetaHuman Creator provides powerful tools to customize facial features so you can craft a unique face. There are two primary modes for shaping the face, plus detailed sliders for textures and colors:

  • Blend Mode: Blend Mode enables users to interpolate between 3 to 6 MetaHuman preset faces to create a new facial shape. By selecting presets with desired features, such as a specific nose or jawline, users can blend them in varying percentages. This mode, accessed via the Sculpting Toolbar, is ideal for quickly establishing broad facial characteristics. It performs a weighted average of the selected faces’ shapes for a cohesive result.
  • Sculpt Mode: Sculpt Mode allows direct, real-time manipulation of facial regions like brows, eyes, nose, or jaw using on-screen handles. Users can click and drag to adjust shapes, such as widening the nose bridge or altering eye slant, for precise refinements. This mode leverages hundreds of morph targets, functioning as an intuitive morph target editor. It’s essential for fine-tuning details after setting the base shape in Blend Mode.
  • Move Mode: Move Mode repositions facial features without altering their shape, such as adjusting the spacing between eyes or shifting the nose’s height. Accessed through the Sculpting Toolbar, it ensures accurate facial proportions post-sculpting. This tool is critical for balancing feature placement, like eye or mouth distances. It complements Blend and Sculpt Modes for a polished facial layout.
  • Facial Feature Details:
    • Skin: Skin controls offer sliders for tone, aging textures, pore detail contrast, and roughness to create varied appearances. Users can add freckles, adjusting their density, opacity, and color for realistic or stylized effects. Redness in areas like cheeks or nose can be enhanced for a flushed look. These options, found in Skin/Accent/Makeup categories, add depth and personality to the character’s skin.
    • Eyes: Eye customization includes iris color presets, size adjustments, and sclera redness for realistic or expressive effects. Users can blend colors for heterochromia or apply detailed iris patterns from one preset to another. These controls create lifelike or unique eye appearances tailored to the character. They ensure the eyes convey emotion effectively in animations.
    • Teeth: Teeth options allow adjustments for color, from whiter to stained, and dental patterns like gaps for added realism. Gum exposure can be tweaked via smile settings in Sculpt Mode. These controls enhance close-up expressions, ensuring natural dental appearances. They contribute to the character’s overall authenticity and expressiveness.
    • Makeup and Beard: Makeup presets for eyeshadow or lipstick allow customization of intensity and hue for varied looks. Beard and mustache styles can be selected, with adjustable colors under Skin/Accent/Makeup categories. These options add distinct personality traits to the MetaHuman. They integrate seamlessly with other facial customizations for cohesive designs.
    • Hair and Eyebrows: Hair customization includes color, highlights, and root darkness sliders, paired with preset hairstyles for diverse looks. Eyebrow style and color are adjustable in the Face panel, complementing the hair design. These controls enhance the character’s visual identity and distinctiveness. They ensure harmony with the overall facial aesthetic.
  • Using Reference Poses: Reference poses allow users to toggle expressions like smiling or frowning via the Animation Toolbar to test the face’s animation quality. A standard Face ROM animation evaluates the rig’s expression range without altering the sculpt. This tool helps identify and fix deformation issues early in the design process. It ensures the neutral face performs well across various expressions.

Customizing facial features in MetaHuman Creator is intuitive, starting with Blend Mode for base shapes, refining with Sculpt Mode, and finalizing with texture sliders. Iterative tweaks and expression testing create unique, animatable characters, editable anytime in the online library.
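Conceptually, Blend Mode performs a weighted average of the selected preset faces’ shapes. The sketch below is a simplified illustration of that idea using made-up two-vertex “faces” (plain Python, not MetaHuman’s actual implementation):

```python
def blend_faces(presets, weights):
    """Weighted average of preset face shapes (lists of (x, y, z) vertices).

    A toy model of Blend Mode: each output vertex is the weighted mix of the
    corresponding vertex across the 3-6 selected presets. Weights sum to 1.
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "blend weights must sum to 1"
    vertex_count = len(presets[0])
    blended = []
    for v in range(vertex_count):
        blended.append(tuple(
            sum(w * preset[v][axis] for w, preset in zip(weights, presets))
            for axis in range(3)
        ))
    return blended

# Three hypothetical two-vertex "faces", blended 50/30/20:
preset_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
preset_b = [(0.0, 1.0, 0.0), (1.0, 1.0, 0.0)]
preset_c = [(0.0, 0.0, 1.0), (1.0, 0.0, 1.0)]
result = blend_faces([preset_a, preset_b, preset_c], [0.5, 0.3, 0.2])
```

Sculpt Mode then refines this blended base per-region, which is why the recommended order is blend first, sculpt second.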


Can I use my own 3D mesh or scan with MetaHuman in UE5?

Yes – Unreal Engine 5’s MetaHuman framework provides a feature called Mesh to MetaHuman that allows you to use your own 3D character mesh (such as a head model from a 3D scan or sculpt) and convert it into a MetaHuman. This is incredibly powerful for creating custom faces that aren’t achievable with the default Creator alone. Here’s how it works and what you need to know:

  • MetaHuman Plugin (Mesh to MetaHuman): The MetaHuman Plugin, available from Epic’s Fab marketplace, must be installed and enabled in UE5 to access Mesh to MetaHuman functionality. This plugin integrates seamlessly into the editor, enabling the conversion process. It’s a critical first step for custom mesh workflows. Without it, the MetaHuman Identity asset cannot be created.
  • Preparing Your Mesh: A 3D head mesh in OBJ or FBX format, with a neutral expression, closed mouth, and forward-facing eyes, is required. The mesh should include neck and ears, be free of extreme asymmetries or artifacts, and may need decimation if high-poly. No rigging or blendshapes are necessary, as the plugin handles rigging. This preparation ensures compatibility and smooth conversion to a MetaHuman.
  • MetaHuman Identity Asset: In UE5, create a MetaHuman Identity asset and import the custom mesh as a static mesh. Add a Frame, select the mesh, and initiate the Mesh to MetaHuman process within the Identity asset. This asset acts as a container for aligning and processing the custom mesh. It streamlines the workflow for rig fitting.
  • Fitting the Mesh: Manually place facial feature markers (eyes, nose, mouth) to align the MetaHuman rig with the custom mesh during the fitting process. The plugin tracks these features to ensure accurate rigging and animation compatibility. A Promoted Frame snapshot finalizes the tracking for cloud processing. This step is crucial for preserving the mesh’s unique shape and functionality.
  • Create MetaHuman: After alignment, submit the mesh to Epic’s cloud servers for processing, generating a new MetaHuman in your online library. The MetaHuman can be edited in MetaHuman Creator or downloaded via Bridge for UE5 projects. This cloud-based step automates rig integration and topology conversion. It delivers a fully functional MetaHuman based on the custom mesh.
  • What gets transferred: The resulting MetaHuman retains the custom mesh’s shape but adopts MetaHuman’s topology, rig, and default materials. If the mesh has compatible UVs, unique textures like birthmarks can transfer to the MetaHuman’s skin. The facial rig and animation capabilities remain fully intact. This balances custom design with MetaHuman’s standardized features.
  • Custom Body or Full Character Mesh: Mesh to MetaHuman primarily processes heads, with bodies limited to MetaHuman’s 18 standard types. Full-body custom meshes require advanced rigging to attach to MetaHuman heads, typically using only the head portion from scans. This restricts full-body customization but supports head-focused designs. Advanced users can merge custom bodies with MetaHuman heads via skeletal mesh adjustments.

The Mesh to MetaHuman workflow, though requiring mesh preparation and cloud processing, produces unique, rigged MetaHumans for precise or stylized character creation.
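At its core, the marker-fitting step above is an alignment problem: the template rig’s landmarks are transformed to match the landmarks you placed on your mesh. Here is a heavily simplified sketch of that idea with hypothetical landmark coordinates (translation and uniform scale only; the real solver also handles rotation and non-rigid deformation):

```python
def align_landmarks(template_pts, target_pts):
    """Translate and uniformly scale template landmarks onto target landmarks.

    A toy stand-in for rig fitting: match the centroids of the two point sets,
    then match their overall spread. Rotation and per-feature warping omitted.
    """
    def centroid(pts):
        n = len(pts)
        return tuple(sum(p[axis] for p in pts) / n for axis in range(3))

    def spread(pts, c):
        return sum(sum((p[axis] - c[axis]) ** 2 for axis in range(3))
                   for p in pts) ** 0.5

    c_template, c_target = centroid(template_pts), centroid(target_pts)
    scale = spread(target_pts, c_target) / spread(template_pts, c_template)
    return [tuple(c_target[axis] + scale * (p[axis] - c_template[axis])
                  for axis in range(3))
            for p in template_pts]

# Hypothetical eye-corner landmarks: the template is half the target's size.
template = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
target = [(1.0, 1.0, 1.0), (5.0, 1.0, 1.0)]
aligned = align_landmarks(template, target)
```

This is why accurate marker placement matters: the solver can only fit the rig as well as the landmarks you give it.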

How do I import a MetaHuman into an Unreal Engine 5 project?

After creating a MetaHuman (using the Creator or Mesh to MetaHuman), importing it into your UE5 project is a crucial step. Thankfully, Epic has streamlined this via Quixel Bridge:

  • Open Quixel Bridge in UE5: Launch UE5, open your project, and access Quixel Bridge through the Content Browser’s Add dropdown menu. Log in with your Epic account to connect to your MetaHuman library. In UE5.1+, Bridge is integrated; older versions may require standalone access. This step establishes the connection to your created MetaHumans for import.
  • Find Your MetaHuman: In Bridge’s MetaHumans section, locate your created characters listed under My MetaHumans, identifiable by their thumbnails and names. This ensures you select the correct MetaHuman for your project. The interface organizes assets for easy navigation. Browsing this section confirms your MetaHuman is available for download.
  • Download the MetaHuman Assets: Click Download on your chosen MetaHuman, selecting the highest quality with all LODs for maximum detail. The large file includes meshes, textures, hair grooms, and material data, requiring ample disk space. Downloads are stored locally, typically in Documents\Megascans Library\MetaHumans. Verify the library path in Bridge’s settings to avoid download interruptions.
  • Add to Project: Once downloaded, click Add in Bridge to import all MetaHuman assets into your UE5 project’s Content Browser. This includes a Blueprint (BP_[Name]), skeletal meshes, materials, textures, and a DNA asset for the facial rig. The import process is automated, organizing assets efficiently. This step makes the MetaHuman accessible for scene use.
  • Verify Content: Check the Content Browser for a MetaHumans folder containing your character’s folder, with the Blueprint and subfolders for Face, Body, and Materials. Opening the Blueprint reveals the assembled character with all components. This confirms the import was successful and assets are correctly structured. It prepares the MetaHuman for immediate use in the project.
  • Drag into Scene: Drag the BP_[Name] from the Content Browser into the level viewport to place the MetaHuman in your scene. The character appears fully textured in a neutral idle pose, ready for interaction. Press Play to ensure visibility and check for shader compilation issues. This integrates the MetaHuman into your project environment for further setup.
  • Troubleshooting Import: If the import fails, verify that the MetaHuman plugin is enabled in the project and the engine version is compatible. Ensure the project matches or exceeds the Creator’s target version to avoid version conflicts. Check disk space and library path accessibility to resolve download issues. Restarting the editor may be necessary to apply plugin changes.
  • Multiple MetaHumans: Import multiple MetaHumans, each stored in its own folder within the MetaHumans directory, sharing common assets like skeleton hierarchies. This optimizes project size, as additional MetaHumans don’t linearly increase storage demands. It supports projects requiring diverse character casts. Asset sharing enhances efficiency for large-scale scenes.

Imported MetaHumans are high-fidelity assets optimized for animation or scene integration, with LOD adjustments ensuring performance across various project types.
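If you want to sanity-check an import from outside the editor, the on-disk layout described above can be verified with a short script. The folder and file names here are assumptions based on the typical import layout (Content/MetaHumans/[Name] with a BP_[Name] Blueprint plus Face and Body subfolders) and may differ by engine version:

```python
import os

def verify_metahuman_import(content_dir, name):
    """Report whether the expected MetaHuman files/folders exist on disk.

    Checks the layout described above. Paths are assumptions based on the
    typical import structure; adjust them for your own project.
    """
    root = os.path.join(content_dir, "MetaHumans", name)
    expected = {
        "blueprint": os.path.join(root, "BP_" + name + ".uasset"),
        "face_folder": os.path.join(root, "Face"),
        "body_folder": os.path.join(root, "Body"),
    }
    return {label: os.path.exists(path) for label, path in expected.items()}
```

A report with any False entries suggests the download or Add step didn’t complete, which is a cue to re-check Bridge’s library path and disk space.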

How do I apply clothing and accessories to a MetaHuman character?

MetaHumans come with a set of preset clothing options in the Creator, but you might want to change or add custom clothing and accessories to better fit your project’s character design. There are a few approaches to outfitting your MetaHuman:

  • Preset Clothing in MetaHuman Creator: In MetaHuman Creator’s Clothing section, under Body controls, users can choose from a variety of tops, bottoms, and shoes. Options include shirts, jackets, pants, and footwear with customizable texture prints and colors. This limited library focuses on modern casual styles, suitable for quick setups. Accessories like glasses or hats are not available in the Creator.
  • Custom Clothing via DCC (Digital Content Creation) Tools:
    • Export MetaHuman Body: Export the MetaHuman’s body skeletal mesh as an FBX file from UE5 to serve as a modeling reference in Blender or Maya. Including the nude body ensures accurate proportions for clothing design. This step provides the exact body shape for fitting custom garments. It’s essential for creating tailored outfits that move naturally with the character.
    • Model or adjust the clothing: Import or create clothing meshes in Blender or Maya, aligning them to the exported MetaHuman body’s proportions and pose (A-pose or T-pose). Rig the clothing to the MetaHuman skeleton with proper weight painting to ensure smooth deformation during movement. Models can be sourced from marketplaces or designed in tools like Marvelous Designer. This ensures the clothing integrates seamlessly with the character’s animations.
    • Import to UE5: Import the rigged clothing mesh into UE5, selecting the MetaHuman skeleton in the import options to maintain compatibility. Add the clothing as a Skeletal Mesh component to the MetaHuman Blueprint, attaching it to the root pelvis or torso bone. This integrates the custom garment into the character’s structure. The clothing will follow the MetaHuman’s movements accurately.
    • Hide Body Underneath: Prevent poke-through by hiding body parts covered by clothing, using material slot toggles in the Blueprint to disable visibility of torso or legs. Alternatively, apply an opacity mask in the skeletal mesh editor or use an invisible “nude suit” for covered areas. This ensures clean visuals without mesh intersections. It’s a critical step for professional-looking character designs.
    • Weight & Physics: For dynamic clothing like coat tails or dresses, apply UE5’s Chaos Cloth or rigid body physics by painting cloth weights in the skeletal mesh editor. Adjust physics settings to control stiffness and movement for realistic draping effects. This enhances cinematic quality and visual realism in animations. Static clothing may not require physics, simplifying the setup.
  • In-Editor MetaHuman Clothing Customization (UE5.3+): UE5.3+ introduces tools in the Skeletal Mesh Editor and Modeling mode for direct in-engine clothing modifications, such as sculpting sleeve lengths or necklines. Users can clone MetaHuman clothing assets and reshape them directly in UE5, saving time for minor adjustments. This emerging workflow streamlines customization without external software. It’s ideal for quick edits to existing outfits.
  • MetaTailor and other plugins: Third-party plugins like MetaTailor automate the process of fitting clothing meshes to various MetaHuman body variants. These tools resize and adapt outfits efficiently, reducing manual adjustments. They’re particularly useful for projects requiring multiple custom outfits. Exploring these plugins can enhance workflow efficiency for large-scale character customization.
  • Accessories: Attach accessories like glasses, helmets, or jewelry as Static or Skeletal Mesh components to the MetaHuman’s head or face bone. Position and rotate them precisely to fit the character’s face or head proportions. This adds detailed personalization to the character’s appearance. Accessories move naturally with the head, enhancing visual coherence.

Custom clothing requires treating the MetaHuman like a standard character: design or acquire meshes, rig them to the skeleton, and import them. With proper integration, MetaHumans can wear anything from fantasy armor to period costumes, offering unlimited creative possibilities.
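The weight-painting step above can be bootstrapped programmatically: a common starting point is to copy each clothing vertex’s skin weights from the nearest body vertex, then clean up problem areas by hand. A minimal sketch with made-up positions and bone names:

```python
def transfer_weights(body_verts, body_weights, cloth_verts):
    """Give each clothing vertex the skin weights of its nearest body vertex.

    body_verts:   list of (x, y, z) positions on the body mesh
    body_weights: per-vertex dicts mapping bone name -> influence weight
    cloth_verts:  list of (x, y, z) positions on the clothing mesh
    A crude stand-in for the weight-painting step; real DCC tools also
    smooth and normalize the result.
    """
    def dist_sq(a, b):
        return sum((a[axis] - b[axis]) ** 2 for axis in range(3))

    transferred = []
    for cloth_v in cloth_verts:
        nearest = min(range(len(body_verts)),
                      key=lambda i: dist_sq(cloth_v, body_verts[i]))
        transferred.append(dict(body_weights[nearest]))
    return transferred

# Hypothetical data: one vertex near the hips, one near the chest.
body = [(0.0, 90.0, 0.0), (0.0, 140.0, 0.0)]
weights = [{"pelvis": 1.0}, {"spine_03": 1.0}]
cloth = [(2.0, 95.0, 1.0), (1.0, 138.0, 0.0)]
result = transfer_weights(body, weights, cloth)
```

Blender’s “Transfer Weights” and Maya’s “Copy Skin Weights” implement more robust versions of this same idea, which is why exporting the nude MetaHuman body as a reference is so useful.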


How do I edit MetaHuman hair, eyes, and skin textures?

Customizing the hair, eyes, and skin beyond the default options can really define your character’s style. Here’s how you can edit and fine-tune these aspects:

  • Hair Styles and Color:
    • Switch to a different preset groom: MetaHuman Creator offers preset hairstyles, including short, medium, long, curly, or braided options, to diversify character appearances. Switching to an uncommon hairstyle can significantly refresh the character’s look when paired with a unique face. This simple customization leverages the existing library for quick results. It’s an accessible way to enhance visual variety without external tools.
    • Modify hair color and roughness: The Creator’s hair controls allow adjustments to hair color via a color picker and melanin slider for realistic tints. Users can tweak softness, roughness, highlights, and root darkness to simulate dyed or ombre effects. These sliders provide intuitive control for creating distinct hair aesthetics. They add depth and personality to the character’s hairstyle.
    • Custom Hair (see next section): For unique or stylized hairstyles, users can create or import custom hair grooms using tools like Blender or XGen, or purchase assets like PixelHair. This approach, detailed in the custom hair section, supports elaborate or non-standard designs. It requires external modeling and import into UE5. It’s ideal for bespoke character looks.
  • Eyes (Iris and Eye Appearance):
    • Base Color and Detail Color: In MetaHuman Creator, users select eye color presets and fine-tune base and detail colors for complex iris patterns, including heterochromia. Blending two colors creates intricate designs, enhancing expressiveness. These controls ensure eyes convey emotion and character depth. They’re accessible in the Face customization panel for easy adjustments.
    • Bloodshot amount: Adjust the sclera’s vein intensity to create clear, healthy eyes or tired, realistic ones by increasing redness. This parameter adds subtle realism or narrative cues, like fatigue, to the character. Sliders offer precise control over the effect’s intensity. It enhances the authenticity of the MetaHuman’s gaze.
    • Eye Occlusion and Shadow: MetaHuman eye materials include an occlusion mesh simulating wetness and shadow under the eyelid for depth. In UE5, users can adjust the shader’s reflectance properties if needed, though defaults are typically sufficient. This maintains the eyes’ lifelike, three-dimensional appearance. It ensures realistic light interaction in various scenes.
    • Custom Eye Textures: For stylized looks like anime or glowing sci-fi eyes, edit the eye material instance in UE5 by overriding the iris texture or color. Replacing textures with flat emissive ones creates glowing effects while preserving the parallax effect for depth. Custom iris patterns, like logos, can be imported with matching UVs. This supports unique, non-photorealistic eye designs for creative projects.
  • Skin Textures and Materials:
    • Material Instances: In UE5, locate the MetaHuman’s skin material instances for head and body to tweak subsurface scattering, hue, or normal map strength. These adjustments refine skin appearance to match lighting conditions or artistic goals. Material instances allow non-destructive edits without altering base assets. They provide flexibility for scene-specific skin rendering.
    • Replacing skin texture: Export the skin’s albedo map to Photoshop or Substance Painter to add tattoos, scars, or unique makeup not available in Creator. Edited textures are imported as material instance overrides, applied to the character’s skin atlas. This process supports highly personalized skin details. It requires texture editing expertise for seamless integration.
    • Wrinkles and pores: Adjust the normal map slider in Creator to increase or decrease wrinkles and pores, aging or youthifying the face. For stylized smooth skin, override normal maps with flattened ones in UE5 for a cartoon-like effect. This controls the level of skin detail visible in renders. It balances realism with artistic stylization as needed.
    • Makeup/Accent maps: Customize makeup by editing the makeup mask or creating new designs using Creator’s textures as references. Import edited masks via material instances for unique eyeshadow or lipstick looks. This enhances character personality with bespoke cosmetic styles. It’s a powerful way to differentiate characters visually.
  • Other Materials (Teeth, Tongue, etc.): Teeth materials can be recolored or adjusted for shine, supporting effects like vampire fangs with mesh and texture edits. Tongue materials are similarly tweakable for specific visual effects, though less commonly customized. These adjustments add subtle character details in close-up scenes. They integrate with the MetaHuman’s material system for cohesive results.
  • LOD Consideration: Custom hair and eye textures must remain consistent across LODs to avoid visual discrepancies at lower detail levels. Bright hair colors or unique eye textures require adjusted LOD-specific textures to match the high-detail versions. Check lower LODs in UE5 to ensure seamless transitions. This prevents jarring shifts in appearance during distance rendering.

MetaHuman’s high-quality baseline assets can be minimally tweaked for realism or extensively retextured for stylization, with UE5 edits enabling precise, project-specific enhancements.
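The non-destructive material-instance workflow above can be modeled as a layered parameter lookup: an instance stores only its overrides and falls back to the parent material for everything else. Here is a minimal, generic Python sketch of that idea (this is not the Unreal API; the class and parameter names are illustrative):

```python
class Material:
    """Base material with default scalar parameters."""
    def __init__(self, params):
        self.params = dict(params)

    def get(self, name):
        return self.params[name]


class MaterialInstance:
    """Stores only overridden parameters; falls back to the parent.

    Mirrors how a UE material instance overrides e.g. subsurface
    scattering or normal-map strength without touching the base asset.
    """
    def __init__(self, parent):
        self.parent = parent
        self.overrides = {}

    def set_override(self, name, value):
        self.overrides[name] = value

    def get(self, name):
        return self.overrides.get(name, self.parent.get(name))


skin_base = Material({"SubsurfaceScale": 1.0, "NormalStrength": 1.0})
skin_close_up = MaterialInstance(skin_base)
skin_close_up.set_override("NormalStrength", 0.6)  # soften pores for stylized skin

print(skin_close_up.get("NormalStrength"))   # overridden -> 0.6
print(skin_close_up.get("SubsurfaceScale"))  # inherited  -> 1.0
print(skin_base.get("NormalStrength"))       # base asset untouched -> 1.0
```

Because the base material is never edited, the same parent can feed many scene-specific instances, which is exactly why material instances are the recommended way to tune MetaHuman skin per shot.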


How do I use PixelHair or custom hair assets with Metahuman?

PixelHair is a popular collection of premade 3D hair assets (often created in Blender) that are compatible with MetaHumans, but the question generalizes to using any custom hair on a MetaHuman. The default MetaHuman hairstyles might not cover every look (for example, elaborate braids, fantasy hairdos, or cartoonish hair). Using PixelHair or similar assets allows you to give your MetaHuman a unique hairstyle. Here’s a guide on how to do it:

  • What is PixelHair? PixelHair is a collection of realistic hair grooms designed for Blender and UE5 MetaHumans, available as Alembic files or Blender hair curves. It includes fitted scalp caps that conform to various head shapes, simplifying attachment. These assets are explicitly compatible with MetaHuman’s hair system. They offer styles like afros or braids, expanding creative options.
  • Exporting/Converting the Hair Asset: In Blender, configure the PixelHair asset on its provided scalp cap, ensuring proper shaping and alignment. Apply modifiers, such as shrinkwrap, to fit the cap to the head, then export the hair as an Alembic (.abc) file with the Export Hair option. Export the scalp cap mesh separately as a static or skeletal mesh. This prepares the hair and cap for seamless import into UE5.
  • Importing Hair to UE5: Enable the Groom Plugin in UE5 and import the Alembic file as a Groom Asset to represent the hair strands. Import the scalp cap mesh and create a Groom Binding Asset, linking the groom to the MetaHuman’s head skeletal mesh. This binding ensures the hair moves correctly with the character. The import process integrates the hair into the project’s asset structure.
  • Attach to MetaHuman: Open the MetaHuman’s Blueprint (BP_[Name]) and add a Groom Component, assigning the imported Groom and Binding Assets. Attach the component to the Head bone and disable or remove the default MetaHuman hair component. This replaces the original hair with the custom groom. The new hairstyle integrates fully with the character’s movements.
  • Adjust Materials: Assign the PixelHair-provided material, configured for Unreal’s Hair shading model, to the Groom Asset, incorporating texture maps for strand color variation. Tweak strand thickness, shadow settings, and shader properties to match the MetaHuman’s lighting environment. This ensures the hair blends visually with the character’s overall appearance. Material adjustments enhance realism and coherence.
  • Fitting and Scaling: If the hair doesn’t fit perfectly, import the MetaHuman’s head mesh into Blender and adjust the scalp cap using a shrinkwrap modifier targeting the head. Re-export the adjusted cap and hair for a snug fit across head shapes. PixelHair’s versatile caps accommodate minor variations with minimal tweaking. This step ensures precise alignment and eliminates gaps.
  • LOD and Performance: Configure LOD settings in the Groom Asset to reduce strand count at distance, optimizing performance for multiple grooms. Optionally generate hair cards for far LODs to replace strands, an advanced but effective technique. Monitor resource usage to maintain smooth rendering in complex scenes. This balances visual quality with project efficiency.
  • Physics: Enable Chaos physics in the Groom Asset for dynamic hair like ponytails, adjusting stiffness and damping for natural movement. Include colliders via the head’s Physics Asset to prevent hair intersecting the body. Disable physics for static styles like short cuts to save resources. This adds realism to animations where hair motion is key.

Custom hair like PixelHair enhances MetaHuman uniqueness, requiring careful import, binding, and optimization for high-quality, seamless results.
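The LOD behavior described above boils down to picking a LOD index from the groom's on-screen size and decimating strands per step. A simplified Python sketch of that logic (the threshold values and decimation factor are illustrative, not the Groom Asset defaults):

```python
def select_lod(screen_size, thresholds):
    """Pick a LOD index from a descending list of screen-size thresholds,
    the way a groom switches to sparser strand sets with distance."""
    for lod, threshold in enumerate(thresholds):
        if screen_size >= threshold:
            return lod
    return len(thresholds)  # smaller than every threshold: coarsest LOD


def strands_at_lod(base_strands, lod, decimation=0.5):
    """Halve (by default) the rendered strand count per LOD step."""
    return max(1, int(base_strands * decimation ** lod))


thresholds = [0.5, 0.2, 0.05]  # hypothetical screen-size cutoffs
for size in (0.8, 0.3, 0.1, 0.01):
    lod = select_lod(size, thresholds)
    print(size, lod, strands_at_lod(100_000, lod))
```

This is why checking lower LODs matters: a groom that looks dense at LOD 0 may render with a quarter of its strands two LOD steps away, so colors and coverage need to hold up at each level.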


Is it possible to use PixelHair or custom hair assets with Metahuman?

Absolutely – PixelHair and other custom hair assets can be used with MetaHumans, and this has become a common practice to extend MetaHuman customization. Epic’s system is flexible: the hair that comes with MetaHumans is essentially a groom asset attached to the MetaHuman skeletal mesh. You can swap that out for any other groom asset. To reiterate the key points (and address any doubts explicitly):

  • Compatibility: PixelHair assets, marketed for Blender and MetaHuman, are designed to work with Unreal’s Groom system, using Blender’s particle hair and fitting caps. They integrate seamlessly, requiring no modifications to MetaHuman’s hair framework. This compatibility is explicitly supported by PixelHair’s design specifications. It ensures straightforward adoption for custom hairstyles.
  • Procedure: Import the custom hair groom into UE5, remove the default MetaHuman hair, and attach the new groom to the head via the Blueprint’s Groom Component. This mirrors standard Unreal hair workflows, as demonstrated in Epic’s tutorials for Maya XGen or Blender hair imports. The process is well-documented and accessible. It allows easy swapping of hairstyles.
  • Multiple Custom Hairs: Import a library of PixelHair or custom hair assets and swap them in the MetaHuman Blueprint to test different styles. This flexibility supports rapid experimentation with various looks, such as braids or spiky anime hair. It enhances creative iteration during character design. Multiple hairs can coexist in a project for versatility.
  • Benefits: Custom hair enables unique or stylized looks, like elaborate fantasy hairdos or cartoonish spikes, breaking free from the Creator’s realistic defaults. It allows the face to remain high-fidelity while the hair matches the project’s art style. This customization makes MetaHumans stand out visually. It aligns characters with specific creative visions.

Custom hair is a recommended, supported approach for achieving distinctive MetaHuman hairstyles, leveraging tools like PixelHair for seamless integration.

Can I export Metahuman characters to other software like Blender or Maya?

You might want to export your MetaHuman out of Unreal for various reasons – maybe to do additional modeling in Blender, render in another software, or integrate into a different pipeline. Technically, you can export MetaHuman characters to DCC software like Blender or Maya, but there are important considerations:

  • Exporting from Unreal: In UE5, right-click the MetaHuman’s skeletal mesh assets (body and face) in the Content Browser and export as FBX, including morph targets for facial blendshapes. Export materials and textures from the Textures subfolder or individually as needed. This captures the mesh, skeleton, skin weights, and simplified facial expressions. It prepares assets for import into Blender or Maya without data loss.
  • In Blender or Maya: Import the FBX into Blender or Maya to access the mesh, skeleton, and approximately 50 ARKit-compatible blendshapes for facial animation. The full MetaHuman DNA rig, with hundreds of micro-shapes, does not transfer outside UE5 unless you use Epic’s Maya DNA plugin or community Blender tools; the FBX carries only the simplified blendshape set. Materials must be recreated in the target software for rendering. This enables editing but requires rig management for full functionality.
  • Use Cases for Export:
    • Further Modeling: Export MetaHumans to Blender for sculpting extreme body shapes or facial tweaks beyond Creator’s limits, then re-import via Mesh to MetaHuman. This supports stylized designs, like larger heads or unique proportions, for creative projects. Edited meshes require re-rigging for UE5 compatibility. It expands customization beyond Creator constraints.
    • Rendering: Render MetaHumans in Blender’s Cycles or Maya’s Arnold for linear content like images or videos, recreating shaders for accurate visuals. Epic’s license permits such non-interactive use, making it ideal for high-quality stills or films. This leverages external renderers’ strengths. It’s a common workflow for cinematic outputs.
    • Animation Workflow: Animate MetaHumans in Maya using custom rigs, then retarget animations to UE5, preserving skeleton names for body and using ARKit blendshapes for faces. This suits studios with established Maya pipelines, ensuring seamless animation transfer. Facial animations may lose some DNA-driven nuance. It integrates external animation into UE5 projects.
  • Limitations: Epic’s license prohibits using MetaHuman assets in competing engines like Unity for interactive applications, limiting exports to editing or rendering in Blender/Maya. Interactive projects must return to UE5 to comply with terms, restricting cross-engine use. Personal use for modeling or rendering is permitted. This enforces Unreal exclusivity for real-time applications.

Exporting MetaHumans supports advanced editing and rendering workflows, but careful management of rigs and licensing ensures compliance with Epic’s UE5-focused terms.
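When round-tripping a MetaHuman through FBX, it helps to verify that the exported morph target names still match the ARKit blendshape names the facial animation expects, since DCC tools sometimes rename shapes on export. A small Python sketch of such a check, using a handful of the 52 ARKit names for brevity:

```python
# A few of the 52 ARKit blendshape names (subset shown for brevity).
ARKIT_SUBSET = {
    "eyeBlinkLeft", "eyeBlinkRight", "jawOpen",
    "mouthSmileLeft", "mouthSmileRight", "browInnerUp",
}


def check_morph_targets(exported_names):
    """Report which expected blendshapes are missing from an export
    and which exported names are unrecognized (e.g. renamed by a DCC)."""
    exported = set(exported_names)
    missing = sorted(ARKIT_SUBSET - exported)
    unknown = sorted(exported - ARKIT_SUBSET)
    return missing, unknown


missing, unknown = check_morph_targets(
    ["eyeBlinkLeft", "eyeBlinkRight", "jawOpen", "mouthSmile_L"]
)
print(missing)  # ['browInnerUp', 'mouthSmileLeft', 'mouthSmileRight']
print(unknown)  # ['mouthSmile_L'] -- a renamed shape that would break retargeting
```

Running a check like this before re-importing saves debugging time: a silently renamed blendshape simply stops animating rather than raising an error.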


How do I use the Metahuman control rig inside Unreal Engine 5?

The MetaHuman Control Rig is a built-in rig that Epic provides to easily animate MetaHuman characters directly in Unreal Engine, especially in the Sequencer (UE’s cinematic timeline editor). Using it allows you to create animations (poses, movements, facial expressions, lip-sync) without needing to round-trip to an external program. Here’s how to use it:

  • What is Control Rig? Unreal’s Control Rig system animates MetaHumans using a pre-made rig tailored for their skeleton, covering both body and face. It includes IK effectors for hands, feet, and spine, plus facial sliders for expressions like smiles or frowns. This intuitive setup drives complex animations within UE5’s environment. It’s designed for cinematic workflows, offering Maya-like control in-engine.
  • Accessing Control Rig in Sequencer:
    • Add MetaHuman to Sequence: Place the MetaHuman in the level and add it to a Level Sequence via the Sequencer’s + Track > Actor option, selecting the Blueprint (BP_[Name]). This integrates the character into the cinematic timeline for animation setup. It ensures the MetaHuman is ready for rigging. It’s the foundational step for animation workflows.
    • Add Control Rig Track: Right-click the MetaHuman in Sequencer and add a Control Rig track, selecting MetaHuman_ControlRig or separate body/face rigs if available. This activates the rig’s controls, displaying handles like hand IK or facial gizmos in the viewport. The track enables keyframing and animation editing. It connects the rig to the Sequencer’s timeline for precise control.
  • Animating with Control Rig: Keyframe rig controls, such as hand IK, spine, or facial sliders, by posing the character at specific frames in Sequencer for movements like reaching or smiling. Body controls include IK limbs, finger curls, and spine adjustments; facial controls cover brows, lips, and eyes. This creates detailed, layered animations for cinematic scenes. Blend with premade animations for efficiency, using weighted layers or baking options.
  • Examples of use: For a dialogue cinematic, keyframe body gestures like head turns or hand waves using the rig, and animate facial lip-sync via the Face Control Board. Integrate Audio-to-Facial animation or mocap for realistic speech, refining with manual tweaks. This crafts expressive, story-driven performances. It supports complex scenes with natural character interactions.
  • Live Mode: Enable Live Mode to move rig controls during sequence playback, capturing real-time performance for iterative animation workflows. This advanced feature records live poses, ideal for experimenting with timing or expressions. It complements traditional keyframing for dynamic results. Typically, keyframing offers greater precision for final outputs.
  • Switching between FK/IK: Toggle limbs between FK (joint rotation) and IK (hand/foot placement) modes within the rig for flexible animation styles. IK is default for planting limbs, while FK suits rotational movements like arm swings. Controls allow seamless switching to match animation needs. This adaptability enhances creative control during posing.
  • Fine-tuning: Access detailed controls, like individual finger joints or facial features (tongue, cheeks, eye look-at), for nuanced expressions and subtle movements. Keyframe these for effects like a smirk or squinted eyes to convey emotion. This level of granularity mimics high-end rigging in external tools. It ensures lifelike, polished animations within UE5.
  • Using with Motion Capture: Apply body or facial mocap via Live Link or animation blueprints, then bake to the Control Rig for manual edits, like adjusting hand positions or exaggerating expressions. Facial mocap from MetaHuman Animator integrates seamlessly, with rig tweaks enhancing performance. This polishes captured data for cinematic quality. It combines automation with artistic refinement.

The MetaHuman Control Rig offers a robust, in-engine solution for animating MetaHumans, delivering professional-grade cinematics with real-time feedback and versatile animation blending.
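Under the hood, Sequencer evaluates each keyed Control Rig channel by interpolating between its keyframes. A simplified Python sketch of that evaluation, assuming linear tangents (the control name and key values are illustrative):

```python
def interpolate_keys(keys, frame):
    """Linearly interpolate a keyframed control value at a frame,
    the way Sequencer evaluates a channel between keys
    (linear tangents assumed; real curves support eased tangents)."""
    keys = sorted(keys)
    if frame <= keys[0][0]:
        return keys[0][1]   # hold first key before the range
    if frame >= keys[-1][0]:
        return keys[-1][1]  # hold last key after the range
    for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)


# Hypothetical "mouth smile" control keyed from neutral to a full
# smile at frame 30, relaxing to a half smile by frame 60.
smile_keys = [(0, 0.0), (30, 1.0), (60, 0.5)]
print(interpolate_keys(smile_keys, 15))  # 0.5
print(interpolate_keys(smile_keys, 45))  # 0.75
```

Keyframing sparse poses and letting the interpolation fill the in-betweens is exactly the workflow the Control Rig track encourages; you add keys only where the pose needs to change.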


Can I use Metahuman with motion capture in Unreal Engine 5?

Yes, MetaHumans are fully compatible with motion capture in UE5 and are designed to leverage mocap for realistic body and facial animation.

Motion Capture Options:

  • Body Motion Capture: Import body mocap data from systems like Vicon, OptiTrack, or Rokoko and retarget it to the MetaHuman’s Epic-based skeleton using Live Link for real-time streaming or IK Retargeter for recorded animations. The skeleton’s human-like proportions ensure natural motion transfer with minimal adjustments. For extreme proportion differences, tweak retargeting settings or use the MetaHuman IK Pose asset. This drives realistic body movements, from walks to complex gestures, in UE5 projects.
  • Facial Motion Capture:
    • Live Link Face (iPhone app): Use the Live Link Face app on an iPhone with TrueDepth to stream facial expressions to the MetaHuman in real time, leveraging the 52 ARKit blendshapes. Add the Live Link Face component to the MetaHuman Blueprint and link it via UE5’s Live Link plugin. This enables live performances, ideal for virtual production or interactive scenes. It delivers instant, responsive facial animation with minimal setup.
    • MetaHuman Animator (offline capture): Record a short video with depth data using an iPhone’s front camera, then process it in UE5’s MetaHuman Plugin to generate high-fidelity facial animation assets. This captures subtle nuances, like micro-expressions and tongue movements, for realistic performances. Apply the resulting animation to the MetaHuman’s face for cinematic-quality results. It’s a powerful offline solution for detailed facial capture.
    • Other Facial Mocap Systems: Use professional systems like Faceware or head-mounted cameras (HMC) with marker-based tracking, mapping data to MetaHuman’s 52 ARKit blendshapes or bone transforms. The MetaHuman Plugin supports stereo HMC footage for advanced capture setups. This integrates high-end facial mocap into UE5 workflows. It expands options for studios with specialized equipment.
  • Setup Requirements: Enable the Live Link and MetaHuman plugins in UE5 and follow documentation for calibration, including ARKit Face Blueprint setup for facial data. Ensure proper network connections for Live Link streaming and calibration frames for MetaHuman Animator. This ensures seamless data integration and accurate animation mapping. It’s critical for both live and offline mocap workflows.
  • Full Performance Capture: Combine body and face mocap, such as an actor in a mocap suit with an iPhone on a helmet rig, to capture both simultaneously in UE5. Apply body data via Live Link or recorded animations and face data via Live Link Face or MetaHuman Animator to the same MetaHuman. This creates a fully driven digital human, as seen in high-end demos like Hellblade II. It delivers comprehensive, lifelike performance capture for immersive cinematics.
  • Finger and Additional Mocap: If the mocap system includes finger tracking via gloves or suits, MetaHuman’s finger joints animate accordingly for detailed hand gestures. Without finger tracking, use default hand poses or keyframe fingers via the Control Rig for specific actions. This adds precision to hand animations, enhancing overall realism. It ensures hands contribute to the character’s expressiveness.

Mocap with MetaHumans produces natural, high-fidelity results, with Control Rig touch-ups for polishing animations, making it ideal for realistic character performances.
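Conceptually, applying a facial mocap frame means renaming the capture's channels to the rig's curve names and clamping weights to the valid morph range. A minimal Python sketch of that mapping step (the CTRL_* curve names are hypothetical stand-ins, and real pipelines handle many more channels):

```python
def apply_mocap_frame(frame_values, curve_map, scale=1.0):
    """Map a captured ARKit blendshape frame onto rig curve names,
    clamping each weight to [0, 1] as morph targets require.
    `curve_map` renames capture channels to the rig's curves."""
    out = {}
    for capture_name, value in frame_values.items():
        curve = curve_map.get(capture_name)
        if curve is None:
            continue  # drop channels the facial rig doesn't consume
        out[curve] = min(1.0, max(0.0, value * scale))
    return out


# Hypothetical curve names for illustration only.
curve_map = {"jawOpen": "CTRL_expressions_jawOpen",
             "eyeBlinkLeft": "CTRL_expressions_eyeBlinkL"}
frame = {"jawOpen": 0.8, "eyeBlinkLeft": 1.3, "headYaw": 0.1}
print(apply_mocap_frame(frame, curve_map))
# {'CTRL_expressions_jawOpen': 0.8, 'CTRL_expressions_eyeBlinkL': 1.0}
```

The `scale` parameter is where exaggeration happens: feeding a value above 1.0 amplifies a subtle captured performance before the clamp, which is the same idea as boosting an expression with the Control Rig after baking.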


How do I blend animations and control facial expressions in Metahuman?

Blending animations and controlling facial expressions is key to bringing a MetaHuman to life, especially in cinematic scenarios or complex interactions. This involves combining multiple animation sources and fine-tuning the face:

  • Blending Body Animations: If your MetaHuman has multiple animation sources (say, a base walk cycle and an upper-body wave, or transitioning from one motion to another), you’ll use Unreal’s animation blueprint or Sequencer blending features:
    • Animation Blueprint: In gameplay, you might have a state machine or layered blend. For example, you can play a run animation on the lower body and a shooting animation on the upper body by splitting the skeleton hierarchy in an animation blueprint.
    • Sequencer: In cinematics, you can simply place two animations on the timeline with overlap and use the built-in blending (by default, when one animation track overlaps another on the same character, UE will crossfade between them). If you have a Control Rig track, you can also blend between keyframe animation and an imported animation by using weighted layers or the “Apply Animation” control rig node.
    • IK Rig Retargeter: If blending between different sources, ensure all are retargeted properly to MetaHuman’s skeleton.
  • Facial Expressions and Lip-sync: There are a few ways to drive facial expressions:
    • Direct Keyframing (Control Rig): As discussed, you can keyframe facial controls for expressions (smile, frown, blink, etc.). If your character is speaking, you might keyframe the jaw open/close, lip shapes, etc., to match dialogue (or use a viseme-based approach).
    • Facial Animation Curves (Morph Targets): If you prefer, you can animate the 52 ARKit morph target curves directly in Sequencer or via an animation asset. Unreal allows you to key those curve values (like CTRL_eyeBlinkLeft, CTRL_mouthSmile, etc.) on an animation track.
    • Facial Pose Library: MetaHumans come with a facial pose library (a set of premade facial expressions like Joy, Anger, etc.). You can use those poses as starting points. In Sequencer, you can keyframe a pose asset to quickly get a certain expression, then refine.
    • Lipsync from Audio: There are plugins and tools (third-party, like SRanipal, FaceFX, etc.) that generate lip sync from audio. Also, MetaHuman Animator includes an audio-to-animation feature (Audio2Face style) where given dialogue, it can create facial animation. If not using those, manual animation or using something like Live Link Face recorded data is an option.
  • Blending Facial Animations: If you have multiple facial animation sources (e.g., a base talking animation and an additional expression), the MetaHuman facial rig (via Control Rig or animation blueprint) can blend them:
    • The face is typically driven by morph targets, which blend additively. If you smile while talking, the shapes add together (provided they don’t 100% conflict). Unreal’s animation system can handle additive blending for morph targets. For example, you could have a base talking animation (from lip sync) and then layer an “angry expression” animation on top with additive mode, so the mouth moves and the brows furrow concurrently.
    • In Sequencer, you might achieve this by using the Control Rig to set an expression, then letting the live capture drive the mouth. Or record two separate takes and use additive tracks.
  • Control Rig vs Animation Assets: You can mix these. A common approach:
    • Use mocap or keyframe for the body movement.
    • Use MetaHuman Animator or Live Link for initial facial performance (which gives natural movement).
    • Then use Control Rig to tweak or exaggerate certain facial expressions or to ensure the character’s emotion reads well. For instance, increase the eyebrow raise a bit more for a surprised look at a specific moment by keying that on top of the mocap.
  • Example – Cinematic Dialogue Scene: Suppose your MetaHuman is in a cutscene talking:
    • You recorded an actor’s face and got an animation for the dialogue.
    • However, you want to blend in a pre-made head nod animation and a hand gesture from an animation library.
    • In Sequencer, you put the body gesture animation on one track and the face mocap on another (the face might come in as an animation on the Face track if using the MetaHuman schema).
    • Ensure the face track only affects facial bones/morphs, and the body animation affects body bones. If needed, use Animation Blueprint with layer blend per bone (exclude head from body anim so that head from face mocap takes priority).
    • Then you realize you want the character to smile at the end – you could animate the mouth corners up via Control Rig starting at that moment, blending on top of the mocap (which might have them neutral).
    • You might also use a Look At control to have the eyes focus on another character at a certain time, blending away from the default eye motion.
  • Fine Control of Blends: UE5’s Control Rig in Sequencer allows weight blending. You could have two Control Rig tracks (one neutral, one with an expression) and fade weight between them. Or use an animation clip of a facial expression and adjust its influence.

Blending animations and controlling expressions is a broad topic, but the key takeaway is that MetaHumans support layered animation: you can separately animate body and face and combine them, and you can mix mocap with keyframes. The robust rig and the ARKit blendshape foundation ensure that even when blending animations, the results remain coherent (since they’re all using the same morph targets under the hood).
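The additive facial blending described above can be sketched in a few lines: per-morph-target weights from the base pose and the additive layer are summed and clamped. A minimal Python illustration (the shape names follow ARKit conventions; the values are illustrative):

```python
def blend_additive(base, additive, weight=1.0):
    """Layer an additive expression over a base facial pose:
    weights add per morph target, scaled by the layer weight,
    then clamp to the valid [0, 1] morph range."""
    out = dict(base)
    for shape, value in additive.items():
        out[shape] = min(1.0, max(0.0, out.get(shape, 0.0) + weight * value))
    return out


talking = {"jawOpen": 0.6, "mouthSmileLeft": 0.1}       # base lip-sync pose
angry = {"browDownLeft": 0.8, "mouthSmileLeft": -0.1}   # additive delta
print(blend_additive(talking, angry, weight=1.0))
# {'jawOpen': 0.6, 'mouthSmileLeft': 0.0, 'browDownLeft': 0.8}
```

Note how the negative delta pulls the smile back to neutral while the brows furrow on top of the untouched jaw motion: the mouth keeps talking while the emotion layer changes the face, which is the behavior you get when layering an expression over mocap in Sequencer.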


What is the best way to use Metahuman for cinematics in UE5?

MetaHumans truly shine in high-quality cinematics. To get the best results for a cinematic or film sequence in UE5, consider the following best practices and techniques:

  • Use Highest LOD for Close-ups: Force LOD 0 on MetaHuman components for close-ups to capture intricate details like skin pores, wrinkles, and hair strands. Disable LOD sync auto-adjustment or use Cinematic LOD settings to maintain maximum fidelity. This prevents detail loss or hair switching to lower-quality cards in critical shots. It ensures immersive, high-resolution visuals that enhance emotional impact.
  • Lighting and Materials: Enable ray tracing or Lumen for realistic skin subsurface scattering and eye reflections, enhancing MetaHuman’s lifelike appearance. Adjust skin material specular or oiliness to suit scene lighting, avoiding waxy looks or overly matte finishes. Position catchlights or add gentle eye-lights to create sparkling eye highlights. This crafts a cinematic look that rivals real actors under professional lighting setups.
  • Depth of Field: Utilize shallow depth of field (DOF) in UE5’s cinematic cameras to focus on the MetaHuman’s eyes during close-ups, emphasizing emotional beats. Implement smooth rack focus transitions to guide viewer attention, mimicking professional cinematography techniques. This creates visually compelling shots with a filmic aesthetic. It draws audiences into the character’s expressions and story.
  • Facial Animation and Emotion: Capture performances with MetaHuman Animator to record actors’ micro-expressions, tongue movements, and lip-sync for authentic facial animation. Refine these with Control Rig to amplify emotions, like sharpening a smile or adding a furrowed brow, ensuring readability. Incorporate subtle body language—breathing, shrugs, or weight shifts—via the rig to keep the character dynamic. This delivers emotionally resonant performances that connect with viewers.
  • Cinematography Considerations: Apply motion blur, bloom, and film tonemapping in cinematic cameras to achieve a polished, film-like quality. Add micro head movements and eye darts via Control Rig to avoid static, statue-like appearances, as humans naturally exhibit subtle motions. Ensure precise lip-sync alignment with audio to maintain immersion in close-ups. Use asymmetric expressions, like smirks, to enhance realism and character nuance.
  • Use Sequencer Features: Leverage Sequencer to animate material parameters, such as increasing skin redness for a blush or triggering tear particles for crying scenes. Accentuate eye tearline and waterline materials for emotional effects, adjusting opacity or reflectivity. These small details add depth and realism to dramatic moments. They elevate the cinematic experience with subtle, impactful touches.
  • Cinematic Hair and Cloth: Increase simulation quality for hair and clothing using Chaos Cloth with refined settings for natural draping or wind effects, viable for offline renders. Enable hair physics with higher substeps for accurate motion, like flowing ponytails in dynamic scenes. This enhances visual realism, especially in close-ups or action sequences. It ensures hair and cloth move convincingly, complementing the MetaHuman’s lifelike design.
  • Movie Render Queue (MRQ): Render cinematics with Movie Render Queue, using high anti-aliasing, motion blur, and multiple frame sampling for polished outputs. Opt for path tracing for ultimate quality, eliminating real-time shading approximations, if memory allows. MetaHumans in path-traced renders achieve near-photorealistic fidelity. This delivers professional-grade frames for cinematic presentations.
  • Performance vs Quality: For real-time cutscenes, balance quality with frame rate by using LOD 1 for medium shots or reducing hair strand density. For offline renders, maximize all settings, including full LOD 0 and high-resolution textures, to prioritize visual excellence. This optimizes cinematics for target platforms, from games to films. It ensures smooth performance or maximum fidelity as needed.
  • Audio and Sound: Pair MetaHumans with high-quality voice acting and perfectly synced audio to complement their visual realism. Weak audio or misaligned lip-sync undermines the cinematic impact, especially in close-ups. Ensure robust sound design to match the MetaHuman’s accurate lip-sync capabilities. This completes the immersive storytelling experience, making performances believable.

Treating MetaHumans like real actors with advanced UE5 tools results in cinematics where viewers forget they’re digital, achieving storytelling excellence.
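A rack focus like the one described can be expressed as an eased interpolation of the camera's focus distance across a frame range. A small Python sketch using a smoothstep curve (the distances and frame numbers are illustrative; in UE5 you would keyframe the Cine Camera's focus distance in Sequencer rather than compute it by hand):

```python
def smoothstep(t):
    """Classic ease-in/ease-out curve on [0, 1]."""
    t = min(1.0, max(0.0, t))
    return t * t * (3.0 - 2.0 * t)


def rack_focus(start_dist, end_dist, start_frame, end_frame, frame):
    """Ease the focus distance between two subjects over a frame range,
    mimicking a smooth rack focus pulled by a focus operator."""
    if end_frame == start_frame:
        return end_dist
    t = smoothstep((frame - start_frame) / (end_frame - start_frame))
    return start_dist + t * (end_dist - start_dist)


# Pull focus from 300 cm (face A) to 600 cm (face B) over frames 100-130.
for f in (100, 115, 130):
    print(f, rack_focus(300.0, 600.0, 100, 130, f))
# prints: 100 300.0 / 115 450.0 / 130 600.0
```

The smoothstep easing is what separates a pleasing rack focus from a linear one: focus velocity ramps up and settles gently, matching how Sequencer's eased tangents behave on a keyed focus channel.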


Can I use Metahuman with third-party characters or assets?

MetaHumans can certainly coexist and interact with other characters or third-party assets in your project. Depending on what you mean by “use with,” here are a few interpretations and answers:

  • Mixing MetaHumans with Other Characters in Scene: Place MetaHumans alongside custom or marketplace characters in UE5 scenes, adjusting for real-world scale differences (MetaHumans are ~180 cm tall). They interact via animations or physics, like dialogue or combat, without technical conflicts. This supports diverse casts, from realistic to stylized characters. MetaHumans function as standard assets, enabling flexible scene compositions.
  • Retargeting Animations between MetaHuman and others: If third-party characters use the Epic skeleton, animations can be directly shared with MetaHumans; otherwise, UE5’s IK Retargeter maps animations between different skeletons. For example, MetaHuman mocap can be retargeted to a Daz3D character, unifying animation workflows. This streamlines projects with mixed character types. It ensures consistent motion across assets for cohesive scenes.
  • Swapping Parts (Head/Body) with Third-Party:
    • MetaHuman Head on Another Body: Attach a MetaHuman head to a custom body, like a creature or android, by aligning or merging skeletons, requiring advanced rigging skills. Community tutorials demonstrate creating hybrid MetaHumans, such as a MetaHuman face on a CGI monster. This enables unique character designs for creative projects. It demands precise bone alignment for seamless animation.
    • Other Head on MetaHuman Body: Convert a third-party head into a MetaHuman-compatible head using Mesh to MetaHuman, integrating it with the MetaHuman body rig. Attaching arbitrary heads sacrifices the MetaHuman facial rig’s advanced blendshapes and DNA-driven animations. This approach is less common due to rigging complexities. It’s viable for specific design goals but requires careful integration.
  • Using Third-Party Assets ON MetaHumans: Attach third-party props, like swords or phones, to MetaHuman hand sockets as standard character attachments. Apply third-party animation packs, such as fighting moves, via retargeting to the MetaHuman skeleton. Custom clothing and hair, like PixelHair, integrate as discussed in prior sections. This enhances MetaHuman customization with marketplace or bespoke assets for project-specific needs.
  • Third-Party Animation Tools: Animate third-party characters in tools like Maya or Reallusion iClone, then retarget animations to MetaHumans via Live Link or FBX imports. iClone supports direct MetaHuman control through Live Link, bridging external toolchains. This leverages preferred animation workflows for MetaHuman projects. It ensures compatibility with studio pipelines for efficient production.
  • Limitations: Realistic MetaHumans may clash visually with stylized third-party characters, requiring material adjustments (e.g., simplified shaders) for coherence. Epic’s license prohibits MetaHuman use in non-UE5 interactive projects, like Unity games, limiting cross-engine applications. Editing or rendering in Blender/Maya is permitted, but interactive use must return to UE5. This enforces Unreal exclusivity while allowing flexible workflows within its ecosystem.
  • AI or Third-Party AI Drivers: AI-generated animation, such as audio- or text-driven facial sequences, can drive MetaHumans as long as the output is standard animation data: ARKit blendshape curves or skeletal transforms. Map the AI tool’s output onto the MetaHuman rig and it plays back like any other animation, so proper data formatting is the main requirement.
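The bone-mapping step at the heart of retargeting can be sketched outside the editor. The following is an illustrative Python sketch, not UE5 API code: the left-hand bone names assume a hypothetical Mixamo-style rig, while the right-hand names are standard Epic-skeleton bones. UE5’s IK Retargeter builds this correspondence in-editor via chain mapping; the dict below just shows the kind of mapping involved.

```python
# Assumed third-party bone names (Mixamo-style) mapped to real Epic
# skeleton bone names used by MetaHuman bodies.
BONE_MAP = {
    "mixamorig:Hips": "pelvis",
    "mixamorig:Spine": "spine_01",
    "mixamorig:Spine1": "spine_02",
    "mixamorig:Neck": "neck_01",
    "mixamorig:Head": "head",
    "mixamorig:LeftArm": "upperarm_l",
    "mixamorig:RightArm": "upperarm_r",
}

def retarget_track_names(track_names):
    """Rename animation track names to their Epic-skeleton equivalents,
    leaving unmapped bones untouched."""
    return [BONE_MAP.get(name, name) for name in track_names]
```

In practice you would let the IK Retargeter handle this, but thinking in terms of an explicit source-to-target bone table makes its chain-mapping UI much easier to reason about.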

MetaHumans function as versatile assets, integrating with third-party characters, props, and animations via retargeting and attachments, with licensing ensuring UE5 exclusivity for interactive projects.

What are common mistakes to avoid when creating a Metahuman in UE5?

When working with MetaHumans, especially for the first time, a few pitfalls can occur. Here are some common mistakes and how to avoid them:

  • Ignoring LOD Settings: Failing to use the MetaHuman_LODSync component can cause mismatched LODs, like hair switching to low detail while the face remains high, resulting in visual glitches. Always enable the LODSync component in the MetaHuman Blueprint to synchronize transitions across face, body, hair, and clothing. For cinematics, manually force LOD 0 to maintain consistent high detail in close-ups. This ensures seamless visual quality across all character components.
  • Overlooking Project Settings: Neglecting to enable ‘Support Compute Skin Cache’ in Project Settings > Engine > Rendering disrupts groom and cloth rendering, causing visual artifacts. Verify this setting is active, as it’s essential for MetaHuman hair and clothing to render correctly. Address any skin cache or groom warning messages promptly to avoid rendering issues. This foundational setup optimizes MetaHuman performance and appearance.
  • Not Using the Latest Version: Using a MetaHuman in an outdated UE5 version can lead to material, rig, or compatibility issues, breaking animations or visuals. Ensure the project matches or exceeds the MetaHuman Creator’s target engine version, selectable during creation. Stick to the latest UE5 release for optimal support and bug fixes. This prevents version-related disruptions in the workflow.
  • Facial Animation Missteps: Over-animating facial morphs creates unnatural, exaggerated expressions, while under-animating results in stiff, lifeless faces. Study real human expressions or use mocap to capture nuanced movements, avoiding extreme simultaneous morph values. Reference videos or MetaHuman Animator data to guide realistic animation. This balances subtlety and expressiveness for lifelike facial performances.
  • Mixing Materials Incorrectly: Replacing MetaHuman skin materials with simplistic ones strips away subtleties like peach fuzz or normal map details, diminishing realism. Always modify materials via instances, preserving the original shader structure, and verify normal maps after Mesh to MetaHuman to ensure forehead or cheek details align. Incorrect material swaps can make custom heads look off. This maintains the MetaHuman’s high-fidelity appearance during customization.
  • Clothing Weight Painting Issues: Poorly weighted custom clothing causes clipping or unnatural deformations during movement, especially if not rigged properly to the MetaHuman skeleton. Ensure accurate weight painting in external tools and match clothing LODs to the character’s for consistent detail levels. Hide underlying body mesh sections using material toggles or opacity masks to prevent poke-through. This delivers clean, professional-looking clothing animations.
  • Naming and Asset Management: Using non-ASCII characters (e.g., accents or symbols) in MetaHuman names or folder paths can break project packaging, causing build failures. Stick to standard alphanumeric names, like “JohnDoe,” when creating MetaHumans to avoid known issues. This simple naming convention ensures smooth asset management and deployment. It prevents obscure errors during project compilation.
  • Forgetting Physics Assets: Omitting physics configurations for long clothing or hair results in static or chaotic movement, like unmoving dresses or hair clipping through the body. Check the MetaHuman’s Physics Asset for ragdoll and cloth settings, and enable hair physics with head colliders in the Groom Asset. Test physics setups with animations to confirm natural dynamics. This adds realism to dynamic elements, enhancing visual polish.
  • Relying Only on MetaHuman Presets: Using unmodified presets for multiple characters risks “same-face syndrome,” where MetaHumans look too similar, reducing visual diversity. Customize by blending faces, tweaking noses, or adjusting skin hues to create distinct appearances. Leverage the Creator’s tools to push uniqueness, especially in projects with multiple characters. This ensures each MetaHuman feels individualized and memorable.
  • Not Testing on Target Platform: Designing MetaHumans solely on high-end PCs without testing on target platforms, like consoles or mobile via Pixel Streaming, can lead to performance issues. Test scenes with multiple MetaHumans on intended hardware to evaluate LODs, frame rates, and texture resolutions. Simplify materials or hair for background characters if needed. This ensures smooth performance tailored to the project’s deployment environment.
  • Missing Tutorials/Documentation: Overlooking Epic’s official documentation or community tutorials delays problem-solving and misses critical setup nuances. Regularly consult the MetaHuman Documentation, Unreal forums, or community posts for solutions to issues like rig detachment or packaging errors. Search for specific problems, as others likely encountered them. This accelerates learning and resolves workflow bottlenecks efficiently.
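The naming advice above can be enforced with a simple pre-flight check. This is a hedged sketch: the exact character set UE5 packaging tolerates varies by platform, so the pattern below is deliberately conservative, allowing only ASCII letters, digits, and underscores.

```python
import re

# Conservative pattern: ASCII letters, digits, and underscores only.
# Names like "JohnDoe" pass; accented or symbol-laden names do not.
VALID_NAME = re.compile(r"^[A-Za-z0-9_]+$")

def is_safe_metahuman_name(name: str) -> bool:
    """Return True if the name is unlikely to break project packaging."""
    return bool(VALID_NAME.fullmatch(name))
```

Running a check like this over asset and folder names before packaging catches the non-ASCII failures described above while they are still cheap to fix.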

Avoiding these common mistakes through diligent setup, testing, and resource utilization ensures a robust MetaHuman workflow, maximizing their potential for stunning results.

Where can I find tutorials to learn Metahuman creation in Unreal Engine 5?

Learning to fully leverage MetaHumans can take some practice, but thankfully there are plenty of resources and tutorials available:

  • Official Unreal Engine Documentation: Epic Games provides comprehensive MetaHuman Documentation on their developer site, covering creation, import, Mesh to MetaHuman, Animator, and troubleshooting. Step-by-step guides with images detail workflows, complemented by a robust FAQ section. This resource is the definitive starting point for beginners and advanced users. It ensures a thorough understanding of MetaHuman tools and best practices.
  • Epic’s Official Tutorials and Videos: The Unreal Engine YouTube channel hosts MetaHuman tutorials, including “MetaHuman Creator Quick Start” and “Animating MetaHumans with Control Rig.” The 2023 GDC presentation on MetaHuman Animator offers deep insights into facial capture techniques. Documentation pages often link to these videos, enhancing visual learning. These resources provide practical, expert-led guidance for key workflows.
  • Unreal Engine Learning Portal: Epic’s Unreal Online Learning platform offers free courses, potentially including MetaHuman-specific modules or broader UE5 content relevant to MetaHuman workflows. Check regularly for new additions tailored to MetaHuman creation or animation. This structured learning environment suits users seeking formal education. It’s ideal for building foundational skills alongside MetaHuman expertise.
  • Community Tutorials (Epic Dev Community): The community tutorial “Unlocking the Potential of MetaHuman Customization” provides advanced guidance on custom garments and materials, accessible with an Epic login. Unreal forums feature user-shared techniques, like hair customization or rig fixes, searchable by keywords like “MetaHuman clothing tutorial.” This taps into a wealth of collective knowledge. It’s invaluable for niche or troubleshooting queries.
  • YouTube (Community Creators):
    • JSFilmz: JSFilmz produces videos on MetaHuman creation, Mesh to MetaHuman workflows, and generating MetaHumans from photos, demonstrating practical steps. These tutorials cater to various skill levels, from beginner to intermediate. They’re widely accessible on YouTube, offering visual walkthroughs. They simplify complex processes with clear, hands-on examples.
    • MetaHuman Creator Tutorials: Videos like “Create your own MetaHuman in UE5” showcase live MetaHuman creation, guiding users through the Creator’s interface and tools. These are beginner-friendly, emphasizing practical application over theory. They help users visualize the end-to-end process. They’re ideal for those starting their MetaHuman journey.
    • Carlos Ruelle: An early MetaHuman adopter, Carlos Ruelle offers extensive customization tutorials, diving into advanced techniques like facial tweaks or material edits. These cater to users seeking in-depth workflows beyond basic creation. Search for his content on YouTube for detailed insights. It’s a go-to for technical, hands-on learners.
    • KeenTools Channel: KeenTools, creators of FaceBuilder, provides video demos on photo-to-MetaHuman workflows, focusing on Blender integration. These technical tutorials clarify the FaceBuilder-to-Mesh to MetaHuman pipeline for accurate likenesses. They’re perfect for users leveraging photos or scans. They bridge Blender and UE5 effectively.
  • Blender to MetaHuman Guides: Blender Artists forums and YouTube host guides on exporting models or using FaceBuilder for MetaHuman workflows, including a tutorial on generating MetaHumans from single images via a custom Blender add-on. These resources support hybrid Blender-UE5 pipelines for advanced customization. They’re essential for users integrating Blender into their MetaHuman process. They offer technical depth for modelers and animators.
  • Marketplace Assets Documentation: Purchased assets like PixelHair or MetaTailor include detailed instructions or video tutorials on MetaHuman integration, provided in readme files or accompanying documentation. These ensure proper setup for specific tools, like hair grooms or clothing fitters. Checking these resources avoids misuse and maximizes asset value. They’re critical for users leveraging third-party tools.
  • Examples and Sample Projects: Epic’s MetaHumans Sample Project, available for UE4.26 and updated for UE5, includes MetaHuman characters in a scene, demonstrating lighting, animation, and Blueprint setups. The “MetaHuman Playground” project showcases varied animations, serving as a hands-on learning tool. Dissecting these projects reveals practical implementation details. Download them from Epic’s site for interactive study.
  • Forums and Q&A: Unreal Engine Forums and the r/unrealengine subreddit are active hubs for MetaHuman discussions, addressing issues like hair physics or child-sized MetaHumans. Searching these platforms often reveals answered questions, saving time on common problems. They foster community-driven problem-solving and knowledge sharing. They’re a dynamic resource for real-time support.
  • Third-Party Courses: Platforms like Udemy or ArtStation Learning offer MetaHuman courses, though users should verify they’re updated for UE5 features like MetaHuman Animator. These provide structured, in-depth learning for comprehensive skill-building. Check course reviews for relevance and quality before enrolling. They suit users seeking formal, paid education.

Following tutorials while working on a simple MetaHuman project, like creating and animating a character, solidifies skills through hands-on practice. Stay updated for new resources as UE5 evolves, especially with major releases introducing features like MetaHuman Animator, ensuring access to the latest guidance.

FAQ questions and answers

Below are 10 frequently asked questions about creating and using MetaHumans in Unreal Engine 5, along with concise answers:

  1. Is MetaHuman Creator free to use?
    Yes. MetaHuman Creator is completely free for anyone with an Epic Games account. The MetaHuman assets and plugin are free to use in Unreal Engine projects. (You cannot use them in other engines, but there’s no cost to create or use MetaHumans in UE5.)
  2. Do I need Unreal Engine 5 installed to make a MetaHuman?
    You can create MetaHumans using just a web browser and an Epic account (no local install needed for the Creator itself). However, to download and use the character in a project, you’ll need Unreal Engine 5 (with Quixel Bridge) to import the MetaHuman. Essentially, creation is cloud-based, but implementation is in UE5.
  3. What hardware or specs do I need for MetaHuman Creator?
    MetaHuman Creator runs via cloud streaming, so the requirements are modest: a Windows or macOS computer that can handle 3D in a browser (WebGL2 support) and a stable internet connection. For the UE5 plugin (Mesh to MetaHuman or MetaHuman Animator), a Windows PC with a capable CPU/GPU is recommended (e.g., at least an 8-core CPU, 32 GB RAM, and an RTX 2070 or better). MetaHuman Animator currently requires Windows (not macOS) and an iPhone for capture.
  4. How many MetaHumans can I create, and are there any limits?
    There’s no hard publicized limit on how many MetaHumans you can create on your account. You can make dozens of variations if you want. Just keep track of them in your MetaHuman Creator library. The main limitation is each MetaHuman’s data size (high-quality assets can be gigabytes), which will use disk space when downloaded. Also, if you plan to use many MetaHumans simultaneously in a scene, performance can become a factor.
  5. Can I make a MetaHuman that looks like a specific real person (e.g., a celebrity or myself)?
    You can approximate real people by manually tweaking features or using a workflow like FaceBuilder + Mesh to MetaHuman to capture their likeness from photos. However, achieving a perfect replica can be challenging due to MetaHuman’s blending limitations. Also note, if you plan to use a celebrity’s likeness, ensure you have rights to do so. Technically it’s possible to get very close to a real face with enough skill and reference data.
  6. Why won’t my MetaHuman import into my project or appear in Unreal?
    Common reasons: not logged into Bridge with the same account, forgetting to enable the MetaHuman plugin, or trying to import into an unsupported engine version. Make sure you used the same Epic account in MetaHuman Creator and Bridge, enable MetaHuman Plugin in UE5, and use Bridge’s Add button after download. Also ensure your project is UE5 (MetaHumans won’t work in UE4 without back-compat steps).
  7. Can I change my MetaHuman’s body or height?
    To an extent. MetaHuman Creator offers 18 body types (combination of 3 heights and 3 builds for each gender). You can also adjust the head scale relative to the body a bit. If you need a drastically different body (like a child or a very muscular superhero), you might have to do a custom solution (attach MetaHuman head to a custom body, or scale the entire MetaHuman which can look off for kids). Epic hasn’t provided child proportions yet.
  8. Are MetaHumans suitable for game performance?
    MetaHumans are high-quality and can be heavy on performance, but they are game-optimized with LODs. For next-gen platforms and PC, you can use MetaHumans in games (e.g., some games have started using them). On lower-end or mobile, you might use lower LODs or fewer MetaHumans. Each MetaHuman has about 8 LOD levels and you can simplify materials if needed. For crowds, you might not use MetaHumans for every character due to the complexity.
  9. Can I use a MetaHuman in other engines or software (Unity, Blender)?
    The MetaHuman license requires using them in Unreal Engine for real-time projects. You can export a MetaHuman to Blender/Maya for non-real-time work (like rendering a film or 3D printing) – that’s fine. But you cannot take the MetaHuman assets and use them in Unity or other game engines or share them as assets. Always render interactive content in Unreal. For static renders or videos, you have more leeway to use other tools after export.
  10. How do I make my MetaHuman talk or lip-sync?
    There are multiple ways. You can manually animate the face using Control Rig or morph targets, but the easiest is to use MetaHuman Animator / Live Link Face with an iPhone to capture your own performance and drive the MetaHuman’s facial animation. Alternatively, you can use third-party lip-sync tools that generate animation curves from audio, applied to the MetaHuman’s face (since it uses standard ARKit blendshapes). Once you have the animation (via capture or keyframe), apply it in Sequencer or via an Animation Blueprint to see your MetaHuman talk.
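Because MetaHuman faces accept the standard ARKit blendshape curves, audio-driven lip sync reduces to producing per-frame curve values under ARKit names. The sketch below is illustrative only: the blendshape names (jawOpen, mouthFunnel, mouthPucker, mouthClose) are real ARKit shapes, but the viseme set and weight values are invented for this example; a real lip-sync tool would generate such curves from audio and feed them to the face via Live Link or an imported animation.

```python
# Hypothetical viseme-to-ARKit-blendshape weights (values are illustrative).
VISEME_TO_ARKIT = {
    "AA": {"jawOpen": 0.7, "mouthFunnel": 0.1},
    "OO": {"jawOpen": 0.3, "mouthFunnel": 0.8, "mouthPucker": 0.6},
    "MM": {"mouthClose": 0.9, "jawOpen": 0.0},
}

def visemes_to_curves(visemes):
    """Convert a per-frame viseme list into per-frame ARKit curve dicts;
    unknown visemes produce an empty (neutral) frame."""
    return [VISEME_TO_ARKIT.get(v, {}) for v in visemes]
```

Whatever tool generates the curves, the pipeline is the same: map its output to ARKit curve names, then play the result back in Sequencer or an Animation Blueprint.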

These FAQs address some of the top questions newcomers have. As you work with MetaHumans, you’ll naturally have more specific questions, but the above should give you a solid starting understanding.

Conclusion

MetaHumans in Unreal Engine 5 enable easy creation of realistic digital characters. This guide covered setting up tools, accessing MetaHuman Creator, customizing facial features, and using Mesh to MetaHuman for custom scans. It detailed importing characters into Unreal, adding custom clothes and hair, and preparing for animation.

Animation techniques included using Control Rig for keyframe animation, applying motion capture for body and face, and blending animations for nuanced performances. The guide addressed common pitfalls, offered best practices for cinematics, and emphasized lighting and directing MetaHumans for emotional authenticity.

In summary, UE5’s MetaHuman framework allows solo developers and small teams to create film-quality digital humans efficiently. Combining MetaHuman Creator’s ease with UE5’s customization and animation tools, you can craft realistic or stylized characters for games, films, and interactive media.

Harness the resources out there, keep an eye on updates (Epic is continually improving MetaHumans), and happy MetaHuman crafting! Your virtual characters are limited only by your imagination, and now you have the step-by-step knowledge to realize them in Unreal Engine 5.

Sources and Citation

  1. Epic Games – MetaHuman Official Page: “High-fidelity digital humans made easy” – Unreal Engine MetaHuman overview (unrealengine.com).
  2. Epic Games Documentation – Creating a MetaHuman (MetaHuman Creator): Steps to create and launch MetaHuman Creator (dev.epicgames.com).
  3. Epic Games Documentation – MetaHuman Plugin and Mesh to MetaHuman: Using custom meshes and facial capture for MetaHumans (dev.epicgames.com).
  4. Epic Games Documentation – MetaHuman Body and Proportions: Available body types, heights, and proportions in MetaHuman Creator (dev.epicgames.com).
  5. KeenTools (FaceBuilder) – Reddit Q&A: Workflow to transfer a real person’s likeness from photos to a MetaHuman using Blender FaceBuilder and Mesh to MetaHuman (reddit.com).
  6. Unreal Engine Forums – Importing MetaHumans: Using Quixel Bridge in UE5 to download and add MetaHumans to a project (forums.unrealengine.com).
  7. Blender Market (Yelzkizi) – PixelHair for MetaHumans: Description of PixelHair realistic hair assets and usage with MetaHumans (blendermarket.com).
  8. Unreal Engine Documentation – MetaHuman Face and Hair Customization: Adjusting skin, hair color, and other facial features in MetaHuman Creator (dev.epicgames.com).
  9. Epic Games – MetaHuman Animator Announcement: Using iPhone for facial animation capture – MetaHuman Animator capabilities (unrealengine.com).
  10. Unreal Engine Dev Community – Common MetaHuman Issues: LOD Sync and other troubleshooting tips for MetaHumans (forums.unrealengine.com, dev.epicgames.com).
