Creating a custom 3D character for Unreal Engine 5 (UE5) can be an exciting and rewarding process. Whether you’re aiming for a realistic human or a stylized cartoon character, the general workflow involves designing the character, modeling and texturing it in external software, then importing it into Unreal Engine for animation and use. In this comprehensive guide, we’ll answer common questions and break down the entire pipeline from choosing software and modeling your character to applying materials, rigging, animation, and optimization. By the end, you’ll know how to make your own character for Unreal Engine 5 using industry-best practices, including both traditional workflows and Epic’s powerful MetaHuman system. Let’s dive in!
How do I make my own character for Unreal Engine 5?
Making a character for UE5 involves several key stages. At a high level, you will go through concept/design, 3D modeling (or sculpting), retopology & UV mapping (for game-ready meshes), texturing, rigging, animating, and finally exporting and importing into Unreal Engine. Here’s a step-by-step overview:

- Design or Reference: Start with a concept or reference for your character’s appearance. Decide whether it’s realistic or stylized, human or creature, etc. Good concept art or reference images will guide the modeling process.
- Modeling/Sculpting: Create the 3D model using a digital content creation (DCC) tool. You can either sculpt high-detail geometry (common for organic characters) or model with polygons. Many artists sculpt a high-resolution model first, then derive a game-ready mesh from it (see step 3). This stage can be done in software like Blender, Maya, 3ds Max, or ZBrush.
- Retopology & UV Unwrapping: If you made a high-poly sculpt or model, create a low-poly version suitable for real-time. This process is called retopology – you rebuild the mesh with a cleaner, lower-resolution topology that captures the shape. Also unwrap the UVs of the mesh to prepare for texturing. A game-ready character typically uses mostly quad polygons for deformations and is optimized to a reasonable triangle count (e.g. tens of thousands of triangles, depending on target platform).
- Baking & Texturing: Bake detail maps from the high-poly model onto the low-poly (for example, normal maps to capture sculpted details). Then create textures for the character’s surfaces. This usually involves painting or generating PBR (Physically Based Rendering) textures such as Base Color, Roughness, Metallic, Normal, and possibly Ambient Occlusion. You can use tools like Substance 3D Painter or Blender’s texture painting. The goal is to make materials that will look good under Unreal’s lighting. (For instance, PBR materials in UE5 use Base Color, Roughness, Metallic, and Specular inputs by default.)
- Hair and Accessories: Model any hair, fur, or accessories (like armor, clothing pieces, etc.). Hair can be a complex topic on its own (see the section on PixelHair below for one approach). You might use hair cards with alpha textures for games, or UE5’s groom system with strand-based hair for higher fidelity. Accessories might be separate meshes attached to the character.
- Rigging: Set up a skeletal rig for your character. Rigging means creating an armature (bone hierarchy) and skinning the 3D mesh to those bones so it can deform. You can rig in tools like Blender (e.g. using Rigify addon), Maya (HumanIK or ART tools), or use autoriggers (Mixamo, AccuRIG, etc.). Ensure the rig supports all the motions you need (spine, limbs, fingers, facial bones or blendshapes, etc.). This stage is crucial for animation.
- Animation: Create or acquire animations for your character. You can animate by hand in the DCC or in Unreal’s Sequencer/Control Rig, or use motion capture data. Many developers also retarget existing animations (from UE5’s mannequin or marketplace, or Mixamo) onto their custom character to save time (more on retargeting later). At a minimum, test a T-pose/A-pose and some basic animations to ensure the rig deforms the character well.
- Export to Unreal: Once the character model is textured, rigged, and has animations (or at least a bind pose), export it from your DCC in a format Unreal Engine supports (commonly FBX for skeletal meshes). Ensure you export the skeleton, mesh, and any animations correctly scaled and oriented for UE5.
- Import into UE5: In Unreal Engine, import the FBX (or other format). UE5 will create a Skeletal Mesh asset for your character, a Skeleton asset (if it’s a new skeleton), and can also import animation sequences. You’ll then set up materials in UE by plugging in your texture maps to a Material asset. At this point, you should see your character rendered in UE5’s content browser and you can drag it into a level or use it in a Blueprint.
- Setup and Testing: Set up a Physics Asset for ragdoll or physical simulations (Unreal can generate this for you on import). If your character has cloth or hair, configure Chaos Cloth or the Groom system as needed. Then test the character with animations – for example, retarget some existing animations or play an imported animation to verify everything looks correct in-engine. Iterate on weights or correctives if you see deformation issues.
Throughout this process, remember that much of the work happens in external software “long before you even touch the Unreal Engine”: designing, modeling, surfacing (texturing), rigging, and animating the character are all significant tasks completed outside the editor. Unreal comes into play once you import the character for use in the engine.
By following these steps, you’ll create a character from scratch and get it running in UE5. In the sections below, we’ll dive deeper into each aspect of this pipeline, and also discuss alternatives like using MetaHuman Creator or ready-made character systems.

What is the best character creation workflow for Unreal Engine 5?
The “best” workflow can vary based on your goals (real-time game vs. film animation), your art style (realistic vs. stylized), and the tools you prefer or have available. Generally, however, an optimal Unreal Engine 5 character workflow will include all the stages mentioned above, organized in a logical sequence to maximize efficiency and quality. Here are some tips on crafting an ideal workflow:
- Use Specialized Tools for Each Stage: In professional pipelines, artists use different software for different tasks – e.g. ZBrush for high-poly sculpting, Maya or Blender for retopology and rigging, Substance Painter for texturing, etc. Choose tools that play well together and export reliably to Unreal. (We discuss software options in the next section.)
- High-Poly to Low-Poly Pipeline: For high-quality results, it’s common to sculpt a high-detail model and then create a low-poly version for the game. This way you capture realism (through normal maps and detailed textures) while keeping the in-game mesh efficient. For film or cinematics, you might use the high-poly directly or a subdivision surface, but for a game-ready character, low-poly + normal maps is the way to go.
- Early Rigging Tests: It’s wise to rig and test basic deformations before investing too much in detailed texturing. A rough rig on a proxy mesh can reveal proportion issues or topology problems early. You can refine the rig weights after finalizing the model, but early testing prevents nasty surprises where the character bends or animates poorly.
- Leverage MetaHuman for Realistic Humans: If your goal is a realistic human character and you don’t need a completely unique design, consider using MetaHuman Creator (Epic’s cloud-based character creator). It can significantly shorten the workflow by providing a high-quality rigged human with realistic textures and hair in minutes. The MetaHuman comes fully rigged, with facial blendshapes and eight levels of detail (LOD) for performance, which covers many tedious aspects of the workflow automatically. You can then customize or animate it in UE5. (We cover MetaHumans in detail later.)
- Plan for Animations and Engine Integration: Think about what animations the character will need (walking, running, talking, etc.) and whether you will create them or use existing ones. If you plan to use marketplace animations or UE5’s default mannequin animations, consider rigging your character to the UE5 mannequin skeleton (also called the Epic skeleton). This can save time via direct retargeting. If using a custom rig, be ready to use Unreal’s IK Retargeter to map animations from another skeleton onto yours.
- Keep an Eye on Performance: The workflow for a game-ready character might diverge from a film/cinematic character in terms of optimization. For a game, you’ll need to budget polygon count, texture resolution, and use LODs. For a cinematic in Unreal (using movie render queue, etc.), you can push higher detail (maybe use 8K textures or groom hair) since it’s offline rendering or targeting high-end hardware. Decide early and adjust your workflow. For instance, if it’s for a game, you’ll retopologize more aggressively and create multiple LOD meshes; if it’s for a cinematic, you might skip some optimization steps to preserve quality.
- Use a Structured Pipeline: A recommended high-level pipeline is: concept → modeling → high-poly sculpt (if needed) → low-poly retopo → UV unwrap → bake maps → texture painting → rigging → animation → engine import. Unreal Engine’s official documentation also outlines a similar flow: “Create your art assets (Skeletal Meshes) and animations using a 3rd party DCC (e.g. 3ds Max or Maya). Import your Skeletal Meshes and animations into Unreal Engine…”. Following such a structure ensures you don’t miss crucial steps.
Remember that each artist may tweak the workflow to their needs, but the fundamental idea is to use the right tools for each task and integrate with Unreal at the import stage. In summary, the best workflow is one that produces a character meeting your artistic vision and technical requirements, with as little rework as possible. Plan ahead, leverage available technology (like MetaHuman for humans, or auto-riggers for faster rigging), and test your asset in-engine as you go.

What software should I use to create a custom character for Unreal Engine 5?
Creating a character is a multi-step endeavor, and typically you will use several different software applications (often called DCC tools – Digital Content Creation tools) throughout the process. Unreal Engine 5 itself is the target environment, but you cannot create a full character entirely inside Unreal – you’ll need external programs for modeling, texturing, etc. In fact, one Unreal Engine expert plainly states: “You cannot create completely new characters inside of the Unreal Engine. To create a character you’ll need a 3D modeling program.” With that in mind, here are the categories of software you should consider and some popular choices for each:
- 3D Modeling & Sculpting: For building the character’s 3D geometry. You can use Blender (free and open source) which offers both polygon modeling and sculpting tools – “the free and open source 3D creation suite” is fully capable of character creation. Other options include Autodesk Maya or 3ds Max (industry-standard for modeling and animation, but expensive), ZBrush (industry-standard sculpting software for high-detail organic modeling), or Mudbox/3DCoat for sculpting as well. Many artists use a combination (e.g. Maya for base mesh and ZBrush for sculpting details).
- Retopology & UV Unwrapping: Often done in the modeling software itself (Blender, Maya, etc.). There are specialized tools like TopoGun or ZBrush ZRemesher for retopology. UV mapping can be done in Blender, Maya, RizomUV, etc. Ensure whatever tool you use can export an OBJ/FBX with the new topology and UVs.
- Texturing & Baking: The go-to choice for PBR texturing is Adobe Substance 3D Painter, which allows you to paint directly on the model and export Unreal-ready texture maps. Alternatives include Quixel Mixer, 3D Coat, or Blender’s texture paint mode. Baking high-to-low poly maps can be done in Substance Painter, Marmoset Toolbag, Blender, or xNormal. Substance Painter is particularly convenient as it can bake maps and let you see the PBR material under environment lighting similar to Unreal’s. Photoshop or GIMP can also be used to edit or create textures (e.g. hand-painting stylized textures).
- Rigging & Animation: If you use Maya, it has powerful rigging tools (and Epic provides an ART (Animation Rigging Toolkit) for Maya geared toward Unreal Engine skeletons). Blender has rigging tools and the Rigify addon for human rigs. Specialized rigging tools like Auto-Rig Pro (Blender addon) or Advanced Skeleton (Maya) can accelerate the process. For animation, Maya is widely used in industry for its animation tools, but Blender is also fully capable of character animation. There are also middleware solutions: Mixamo, a free online service by Adobe, can auto-rig a humanoid character and even provide some basic animations. Reallusion AccuRIG (part of ActorCore) is another free auto-rigger for humanoid models. These can save time if you’re not an expert rigger.
- Hair and Cloth Creation: If your character has hair and clothing, you might use additional software. Marvelous Designer is popular for designing realistic clothing with physics simulation (exporting the garment mesh for use in Unreal). For hair, you might model hair cards in Blender/Maya or use Blender’s particle hair system and convert to cards or use Alembic exports. Unreal Engine 5 also supports importing groom data (strand-based hair) via Alembic. We’ll discuss hair further under PixelHair and in the physics section.
- Specialized Character Creators: Aside from manual creation, there is software designed specifically for generating characters. For example, MetaHuman Creator (cloud app by Epic) for realistic humans, Reallusion Character Creator for highly customizable humans (and others) with an easy pipeline to Unreal, and Daz 3D for a large library of ready characters and morphs. These tools are covered in their own sections below, but keep in mind they combine many of the above steps (modeling, texturing, rigging) into one package at the cost of some creative flexibility.
In summary, you might use multiple programs: one for modeling (e.g. Blender), one for sculpting detail (ZBrush or Blender), one for texturing (Substance Painter), and one for animation (Blender or Maya). The good news is Unreal Engine imports standard formats from these tools (like FBX, PNG textures, etc.) fairly seamlessly. Epic even provides official addons to improve Blender→Unreal workflows – for instance, a “Send to Unreal” addon for one-click export, and a “UE to Rigify” addon to help animate Unreal skeletons in Blender.
Choose the software that fits your budget and skill – Blender is a fantastic all-in-one starting point for beginners (with no cost), whereas industry veterans might stick to Maya/ZBrush/Substance. What matters is that the tool can produce formats UE5 accepts, and there’s plenty of documentation for getting assets from that tool into Unreal.

What are the best tools for character modeling for Unreal Engine 5?
When it comes specifically to 3D modeling (creating the character’s mesh geometry), you have several top-tier tools to choose from. “Best” can depend on personal preference, but here are some of the leading options and their strengths, especially in the context of making characters for Unreal Engine 5:
- Blender: Blender is free but very powerful. It has a full suite of modeling tools. For character artists on a budget, Blender is often the best choice since it can handle the entire modeling workflow in one program. The Blender community also provides many addons that integrate well with Unreal Engine pipelines. Many indie and even studio projects now incorporate Blender for character work. If you’re just starting, Blender is an excellent “best tool” contender.
- ZBrush: ZBrush is the industry standard for high-detail sculpting, allowing artists to craft organic characters with millions of polygons, perfect for skin textures, facial details, or creature designs. Features like ZRemesher and Dynamesh enable rapid sculpting, but high-poly models require retopology in Blender or Maya for game optimization. Artists bake normal and displacement maps to transfer details to low-poly meshes. ZBrush’s steep learning curve is offset by its unmatched precision, making it essential for realistic or complex characters in UE5 projects, especially when paired with other tools.
- Autodesk Maya: Maya is robust for creating clean topology. Maya shines especially if you plan to animate in the same package, as its rigging and skinning tools are powerful. Also, Maya’s familiarity in industry means there are established workflows (and tutorials) for Maya to Unreal Engine, including Epic’s ART tools. However, Maya is expensive (subscription-based), so it’s usually best if you already have access or are collaborating with a studio. For purely modeling purposes, it doesn’t offer sculpting like Blender/ZBrush, but it’s great for making the final game-ready topology and doing complex hard-surface modeling.
- Autodesk 3ds Max: Similar to Maya, 3ds Max is used in some studios for modeling (historically more for game environments, but it can model characters too). It has a capable poly modeling toolset. Fewer new character artists gravitate to Max compared to Maya or Blender, but it’s still a powerful tool if that’s your preference. Like Maya, it’s paid software.
- Sculpting Alternatives: Apart from ZBrush, you have Mudbox (Autodesk’s sculpting tool) and newer tools like Nomad Sculpt (on tablets) or 3DCoat (which does sculpting and painting). These aren’t as common in an Unreal workflow but can be used to create the character’s high detail before retopo.
- Topogun / Maya / Blender (Retopo): For the specific task of retopology (creating the low-poly mesh), dedicated tools like Topogun have traditionally been used. However, nowadays both Blender and Maya have decent retopology tools (Blender’s PolyBuild and snapping, Maya’s Quad Draw). There’s also Wrap3 for converting topology from one model to another (helpful if you have a base mesh topology saved). The best tool here is whichever lets you efficiently draw a clean mesh over your high-poly model.
- MetaHuman (for base mesh): If your character is a human, one interesting approach is to export a MetaHuman’s mesh as a starting point for modeling. MetaHumans are very high-quality; you could take a MetaHuman face mesh and modify it in your modeling software to create a new character, then re-import via Mesh to MetaHuman or use your own rig. This is a hybrid approach that some artists explore to leverage the excellent topology and proportions of MetaHumans as a base.
The ideal tool aligns with your project’s goals: Blender for cost-effective versatility, ZBrush for intricate sculpting, or Maya for professional-grade animation workflows. All of them export to UE5 via FBX (or OBJ for static geometry), enabling flexible character creation for any game style.

Can I use Blender to make characters for Unreal Engine 5?
Yes – Blender is an excellent choice for making characters for Unreal Engine 5. In fact, Blender can cover nearly the entire character creation workflow, from modeling to rigging to even basic animation. Many indie game developers and hobbyists use Blender as their primary tool to create custom characters for UE5. Here’s how you can use Blender in the pipeline:
- Modeling and Sculpting: Blender’s modeling tools support box-modeling for precise geometry and sculpting for organic details, using brushes similar to ZBrush. Sculpt Mode’s Dyntopo (dynamic topology) enables freeform sculpting, while the Multiresolution modifier handles high-detail work. Though ZBrush excels for ultra-high-poly sculpts, Blender’s sculpting is sufficient for most UE5 characters, with recent updates improving brush performance and detail fidelity for professional results.
- Retopology: Blender has tools to retopologize, like the Shrinkwrap modifier and snapping tools that let you retopo manually over a high-poly model. You can also use addons like RetopoFlow for a more interactive retopo experience. This allows you to get a clean low-poly mesh from your Blender sculpt.
- UV Unwrapping: Blender’s UV editor will let you unwrap your character’s UVs. It has features for marking seams and unwrapping, packing islands, etc. This is important for texturing later.
- Texture Painting: With Blender’s Texture Paint mode, you can do some painting directly on the model (useful for hand-painted textures or quick edits). However, many artists export to Substance Painter for advanced painting. Still, Blender can generate normal maps (via baking) and even paint masks or simple colors if needed.
- Rigging: Blender’s armature system supports complex skeletons, with tools for weight painting and inverse kinematics (IK). The Rigify addon generates humanoid rigs with pre-built controls, compatible with UE5’s mannequin skeleton for easy animation retargeting. Epic’s UE to Rigify addon lets you drive Unreal skeletons with Rigify controls inside Blender, smoothing integration with Unreal’s animation systems and marketplace assets.
- Animation: You can animate your character in Blender’s Dope Sheet and Graph Editor. If you prefer animating in Blender (for example, doing a walk cycle or a unique action), you can export those animations to Unreal as FBX. Blender supports exporting baked animation on the rig, which UE5 can import as an animation asset.
- Export to Unreal: Blender’s FBX export is enhanced by addons like “Blender to Unreal” or Epic’s “Send to Unreal,” which automate scale, orientation, and asset transfer. Correct settings (e.g., -Z Forward, Y Up) ensure compatibility with UE5’s coordinate system. These tools minimize import issues, supporting iterative workflows where artists can quickly test and refine characters in-engine.
Blender’s comprehensive features, extensive tutorials, and free access make it a top choice for UE5 character creation. Its ability to handle stylized or realistic designs, combined with Unreal-specific integrations, ensures smooth pipelines and professional-quality results.

How do I export a character from Blender to Unreal Engine 5?
Exporting a character from Blender to Unreal Engine 5 is typically done using the FBX format, as it retains skeletal mesh and animation data. Here are the steps and important settings to ensure a smooth export-import process:
- Apply Transforms: In Blender, orient your character to face -Y to align with Unreal’s +X forward axis, ensuring consistent animation and physics behavior. Apply all transforms (Ctrl+A > Location, Rotation, Scale) to lock in the model’s position, rotation, and scale. This prevents issues like rotated bones or skewed meshes during UE5 import, ensuring the character appears as intended in-game.
- Unit Scale: Set Blender’s Unit Scale to 0.01 (1 unit = 1 cm) in Scene Properties to match Unreal’s centimeter-based units. During FBX export, enable “Apply Scalings” and select “FBX Unit Scale” to maintain consistent sizing. This avoids characters importing 100x too small or large, ensuring proper scale relative to UE5’s 180-unit-tall mannequin.
- Exporting Skeletal Mesh: Select your mesh and armature in Blender, then export as FBX, enabling options like Selected Objects, Mesh, and Armature in the export settings. Set -Z Forward, Y Up to align axes, disable “Add Leaf Bones” to avoid extra bones, and enable “Always Sample Animations” to ensure smooth animation curves. These settings create a clean skeletal mesh compatible with UE5’s import pipeline.
- Shape Keys (Morphs): If your character has shapekeys (blendshapes), you can export them too. In the FBX export, under Geometry, enable Shape Keys. Unreal will import them as Morph Targets on the skeletal mesh.
- Export and Import to UE5: Save the FBX file and import it into UE5’s Content Browser via drag-and-drop or the Import button. In the FBX Import Options, select Skeletal Mesh, enable “Import Mesh” and “Import Skeleton,” and include animations if present. Verify the skeleton’s root bone aligns with Blender’s armature to ensure proper rigging and animation functionality.
- Verify Scale and Orientation: Drag the imported character into a UE5 level to check its scale, which should match the 180-unit height of UE’s mannequin. Confirm orientation to ensure the character faces correctly. If misaligned, revisit Blender’s export settings (e.g., transforms or axes) and re-export. Correct scale and orientation prevent physics or camera issues in gameplay.
- Materials and Textures: FBX imports carry material slots; if you named materials in Blender (e.g. “Body”, “Eyes”), UE5 will have those slots. You can then import your texture PNGs and create UE5 Materials, then assign them to the mesh slots. Note that Blender’s material shader doesn’t transfer – you must recreate it in Unreal using the exported texture maps.
- A pro tip: Epic’s “Send to Unreal” addon automates many of these settings. It can export selected objects and send them directly to the Unreal project with the correct scale and orientation, reducing manual error.
- Common pitfalls to avoid:
  - Not applying rotation/scale (leads to odd scales or a rotated mesh in UE).
  - Wrong unit scale (leading to tiny or giant characters).
  - Forgetting to export the armature (resulting in a Static Mesh import rather than a Skeletal Mesh).
  - Having multiple actions in Blender’s NLA – better to bake or export one action at a time, or use the appropriate export settings.
With careful attention to FBX settings, scale, and orientation, Blender-to-UE5 exports become a reliable, repeatable process, ensuring your character integrates seamlessly with full rigging and animation support.
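The export settings described above can be collected into a reusable snippet. This is a minimal sketch, assuming Blender’s built-in FBX exporter (`bpy.ops.export_scene.fbx`); the keyword argument names follow recent Blender releases, so verify them against your Blender version, and the helper function name is our own.

```python
# Recommended FBX export settings for a UE5 skeletal mesh, expressed as the
# keyword arguments for Blender's built-in exporter (bpy.ops.export_scene.fbx).
# Parameter names follow recent Blender releases; verify against your version.

def ue5_skeletal_fbx_settings(filepath, export_shape_keys=True):
    """Return kwargs for bpy.ops.export_scene.fbx matching the advice above."""
    return {
        "filepath": filepath,
        "use_selection": True,                # export only the selected mesh + armature
        "object_types": {"ARMATURE", "MESH"},
        "axis_forward": "-Z",                 # the -Z Forward / Y Up axes noted above
        "axis_up": "Y",
        "apply_scale_options": "FBX_SCALE_UNITS",  # keeps the 0.01 unit scale consistent
        "add_leaf_bones": False,              # avoid extra end bones in the UE5 skeleton
        "bake_anim": True,
        "bake_anim_use_all_actions": False,   # export one action at a time (see pitfalls)
        "use_mesh_modifiers": not export_shape_keys,  # applying modifiers discards shape keys
        "mesh_smooth_type": "FACE",           # avoids UE5 smoothing-group import warnings
    }

# Inside Blender you would run:
#   import bpy
#   bpy.ops.export_scene.fbx(**ue5_skeletal_fbx_settings("//hero.fbx"))
```

Keeping these settings in one place makes the export repeatable, which matters once you start re-exporting after every weight-painting or animation fix.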

How do I rig and animate a custom character for UE5?
Rigging is the process of creating a skeleton for your character and binding the 3D mesh to that skeleton (so the mesh moves with the bones), and animation is creating movements by manipulating that rig over time. For a custom UE5 character, you have a few options to rig and animate:
- Rigging in DCC Software: In Blender or Maya, create a skeletal armature with bones for the torso, limbs, head, fingers, and optional accessory bones (e.g., for cloaks). Bind the mesh to the skeleton using automatic or manual weight painting, ensuring smooth deformations at joints. Include a single root bone (e.g., pelvis) at the hierarchy’s base for Unreal compatibility. Test bone rotations to verify weights, then export as FBX with armature and mesh for UE5 import.
- Using Auto-Riggers:
  - Mixamo: Upload your humanoid mesh to Mixamo’s online platform, marking key joints like knees, elbows, and hips for auto-rigging. Mixamo generates a game-ready rig in minutes, downloadable as FBX with a UE5-compatible skeletal structure. Access Mixamo’s vast animation library for instant motions, though skin weights may need refinement in Blender or Maya for optimal deformation quality.
  - AccuRIG by Reallusion: A free tool that similarly auto-rigs humanoid models with an easy interface. It’s relatively new but integrates with Reallusion’s ecosystem and exports FBX for Unreal nicely.
  - Rigify (Blender): Not exactly auto-rig (you still adjust a template), but Rigify generates a rig and controls that are human-friendly. You’d still need to export the deform bones to Unreal.
  - HumanIK (Maya): Maya’s Quick Rig or HumanIK can quickly rig a biped as well.
- Retargeting to Epic Skeleton: Match your rig’s bone hierarchy and naming to UE5’s mannequin skeleton to use marketplace animations without retargeting. Import the mannequin’s FBX into your DCC, align bones (e.g., spine, arms), and weight the mesh to match. Import into UE5 using the mannequin’s skeleton, enabling direct animation compatibility and simplifying workflows for marketplace assets.
- Animating the Character:
  - External Animation: In Blender or Maya, animate your rig with keyframes to create motions like walk cycles, jumps, or combat moves. Use the Graph Editor in either tool for precise curve control. Export animations as FBX, ensuring they use the same skeleton, and import into UE5 as animation assets for gameplay or cinematics.
  - Marketplace/Mixamo Animations: If you don’t want to animate from scratch, you can obtain animations from various sources. These will likely be on different skeletons, but you can retarget them in Unreal to your character’s skeleton. Unreal Engine 5’s IK Retargeter makes it easier to transfer animations from, say, the default mannequin skeleton to your custom skeleton by aligning poses and bone mappings.
  - Animating in Unreal: UE5’s Control Rig system allows in-engine animation via the Sequencer, using forward and inverse kinematics to pose characters. Create or tweak animations for gameplay-driven effects, like reacting to player inputs. This bypasses external DCCs, offering flexibility for procedural or dynamic character motions.
- Testing the Rig: Before finalizing, always test your rig with some animations or at least rotate the bones to see if deformations look right. Check for issues like collapsing elbows, twisting wrists, etc. It’s easier to fix in the DCC and re-export than to discover problems during gameplay.
Rigging can be manual for precision or automated for speed, while animations can be crafted externally or retargeted from libraries. Aligning with Epic’s skeleton or using UE5’s retargeting tools maximizes efficiency, delivering a dynamic, game-ready character.
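If you are matching the Epic skeleton as suggested above, a quick name check before export can catch mismatches early. The sketch below compares a rig’s bone names against a core subset of the UE5 mannequin naming convention; the bone list is an assumption to verify against your engine’s mannequin asset, and the helper function is hypothetical.

```python
# Sanity-check that a rig's deform bones follow the UE5 mannequin naming
# convention, so marketplace animations need minimal retargeting. The bone
# list is a core subset of the Epic skeleton (verify against your engine's
# mannequin asset); the helper name is our own.

EPIC_CORE_BONES = {
    "root", "pelvis", "spine_01", "spine_02", "spine_03",
    "neck_01", "head",
    "clavicle_l", "upperarm_l", "lowerarm_l", "hand_l",
    "clavicle_r", "upperarm_r", "lowerarm_r", "hand_r",
    "thigh_l", "calf_l", "foot_l", "ball_l",
    "thigh_r", "calf_r", "foot_r", "ball_r",
}

def missing_epic_bones(rig_bone_names):
    """Return the core mannequin bones absent from the rig, sorted for display."""
    return sorted(EPIC_CORE_BONES - set(rig_bone_names))

# Example: a rig that has a full spine and legs but no arm chain yet.
my_rig = ["root", "pelvis", "spine_01", "spine_02", "spine_03", "neck_01",
          "head", "thigh_l", "calf_l", "foot_l", "ball_l",
          "thigh_r", "calf_r", "foot_r", "ball_r"]
print(missing_epic_bones(my_rig))  # lists the missing clavicle/arm/hand bones
```

In Blender you could feed this function `[b.name for b in armature.data.bones]`; any names it reports are the ones the IK Retargeter would otherwise force you to map by hand.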

How do I texture my own character for use in Unreal Engine 5?
Texturing is a critical step in bringing your character to life, as it adds colors, material definition, and fine details. To texture your own character for UE5, follow these guidelines:
- UV Unwrap the Character: Your 3D model needs UV coordinates (a mapping from the 3D surface to a 2D image space). Ensure your character’s UVs are laid out efficiently – minimize stretching and place seams in unobtrusive spots. Common approach: separate UV islands for head, body, limbs, etc. For game characters, give more UV space to important areas (face, hands) and less to hidden areas.
- Decide on Art Style: Choose between realistic PBR texturing, which uses Base Color, Roughness, and Metallic maps for physically accurate materials, or stylized texturing, like hand-painted diffuse maps or cel-shaded effects. PBR aligns with UE5’s lighting for photorealism, while stylized textures emphasize artistic expression. Your choice influences tool selection, map complexity, and material setup in Unreal.
- Use a Texture Painting Tool: The easiest way to texture a character is to use a 3D painting program so you can paint directly on the model and see how textures wrap. Substance 3D Painter is one of the best tools for this. You import your mesh then paint on layers. You can fill layers with smart materials, paint masks to apply dirt or color variation, and so on. Substance Painter will let you export all the required maps for Unreal. As you paint, you ensure consistency in style and physically based values.
- Bake Supporting Maps: If you created a high-poly sculpt, bake its detail into maps for your low-poly. The Normal Map captures surface detail (muscle definition, wrinkles). Ambient Occlusion (AO) map helps ground contact shadows. You might also bake a Curvature map (for edge wear) and an ID map, Substance Painter can use these for smart texturing.
- Texture Map Types for Unreal:
  - Base Color: A flat color texture defining the character’s inherent colors, free of lighting or shadow information. This map drives the visual appearance, from skin tones to clothing patterns, and is essential for all materials. Export as PNG or TGA for high quality, ensuring compatibility with UE5’s PBR workflow.
  - Normal: A tangent-space map baked from high-poly details, adding depth and lighting cues like bumps or creases without extra geometry. Critical for realistic or stylized surfaces, it uses BC5 compression in UE5 for efficiency. Verify the green channel matches Unreal’s normal map convention to avoid shading errors.
  - Roughness: A grayscale map controlling surface smoothness, where 0 represents a glossy finish and 1 a matte surface. Often packed with Metallic and AO into an RMA texture, it defines how light interacts with materials, like shiny metal or rough fabric. Adjust values to match your art style in UE5.
  - Metallic: A grayscale map distinguishing metal surfaces (1) from non-metal surfaces (0), used for elements like armor, jewelry, or weapons. Packed in RMA textures to save memory, it ensures accurate PBR reflections and shading in Unreal’s lighting environment, enhancing material realism.
- Resolution and Texture Budget: Use 2K or 4K textures for main characters to balance detail and performance, dropping to 1K for NPCs or mobile platforms. Generate mipmaps to optimize rendering at varying distances, and adjust LOD bias to reduce aliasing. Limit texture sizes to conserve VRAM, aligning with your project’s performance targets in UE5.
- Apply and Adjust in Unreal: Import your texture maps (BaseColor, Normal, etc.) into Unreal. Create a Material for your character. Plug the maps into the respective inputs of the Material node. If you packed Roughness/Metallic/AO in one texture, you’ll mask out the channels appropriately. Apply the material to your character’s mesh. In UE5, you can use a Material Instance to tweak values quickly.
- Subsurface Scattering (SSS) for Skin: For realistic skin, enable UE5’s Subsurface shading model and paint a reddish Subsurface Color map to simulate light scattering through flesh. This enhances effects like backlit ears or soft skin glow, critical for photorealistic characters. Stylized characters may skip SSS, using simpler diffuse materials for performance.
- Hand-Painted vs Procedural: If your style is hand-painted (like a cartoon character), you might simply paint a diffuse map in Blender or Photoshop with all the details (and use little to no normal/roughness maps, or just simplistic ones). That’s fine – you’d then set the material in UE to unlit, or lit with a constant roughness value. Just ensure consistency in style. Tools like 3D Coat are also popular for hand-painted texturing.
- Using Quixel or Texture Libraries: For clothing or armor, use high-quality textures from libraries like Quixel Megascans (via Quixel Bridge in UE) or overlay scanned textures in Substance Painter for details (e.g., fabric weave, leather pores). Ensure proper UV mapping for correct texture appearance (adequate texel density). Evaluate the character in Unreal under various lighting conditions (bright outdoor, dark indoor) to confirm PBR textures look consistent. Adjust roughness or specular values if the material appears too shiny or flat.
Test your textured character in UE5 under various lighting conditions to ensure PBR accuracy or stylized coherence. Optimize by compressing textures, packing channels, and adjusting mipmaps, creating visually compelling characters that perform efficiently in real-time.
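As a rough illustration of the budgeting above, this sketch estimates GPU memory for texture sets. The per-pixel costs are assumptions for illustration (actual sizes vary by platform and compression settings):

```python
def texture_vram_bytes(size, bytes_per_pixel, mipmaps=True):
    """Approximate GPU memory for a square texture.

    A full mip chain adds roughly one third on top of the base level
    (1 + 1/4 + 1/16 + ... -> 4/3).
    """
    base = size * size * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

# Assumed per-pixel costs: BC1 ~0.5 B/px, BC5/BC7 ~1 B/px,
# uncompressed RGBA8 = 4 B/px.
BC1, BC7, RGBA8 = 0.5, 1.0, 4.0

# Three separate 2K grayscale masks vs one packed 2K RMA texture:
separate = 3 * texture_vram_bytes(2048, BC1)
packed = texture_vram_bytes(2048, BC7)
print(f"3 separate 2K masks: {separate / 2**20:.1f} MiB")  # ~8.0 MiB
print(f"1 packed 2K RMA:     {packed / 2**20:.1f} MiB")    # ~5.3 MiB
```

Numbers like these are why channel packing and mipmaps are standard practice: the packed texture also costs one sampler instead of three in the material.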

How do I make a game-ready character from scratch for UE5?
Making a game-ready character from scratch means creating a character model that not only looks good but is optimized to run efficiently in real-time within Unreal Engine 5. Here’s how to approach it:
- Start with a Concept: From scratch implies you’re designing the character. Begin with sketches or reference boards for the character’s look, attire, proportions, etc. Having a clear concept helps during modeling.
- High-poly Sculpt or Model: Sculpt the character in high detail (for organic shapes) or model precisely (for hard-surface elements). This is the creative stage where you focus on appearance and detail without worrying about polygon count. For instance, sculpt the musculature, face details, folds in clothing, etc. in ZBrush or Blender. If it’s a more stylized or low-detail character, you might skip high-poly and directly model a mid-poly that can be optimized.
- Retopology for Low-Poly: Create a clean low-poly mesh that captures the silhouette and major forms of the high-poly. Aim for efficient use of polygons: more where there’s deformation, fewer where the surface is flat or unimportant. A modern realistic hero character on PC/console might be anywhere from 20,000 to 100,000 triangles, depending on complexity and platform, while a mobile game character might need to be under 10k – the exact count depends on your needs. Also plan for lower LODs; Unreal can generate them automatically, but you may want to make a few decimated versions manually for better control. The goal is a topology that deforms well.
- Unwrap UVs: Lay out UVs for the low-poly. For game characters, you might use multiple UV sets or UDIMs if extremely high detail is needed, but generally one or two texture sets are used. Pack UVs to maximize usage of space while keeping logical separations.
- Bake Detail Maps: Bake your high-poly detail onto the low-poly UV. This produces the Normal map, Ambient Occlusion map, etc. This way, your low-poly in Unreal will appear to have the detail of the high-poly thanks to normal mapping, even though it’s much lighter. Use a cage if needed to avoid baking errors, and make sure your bake captures fine details.
- Texture Painting: Create your PBR textures as described earlier. Since it’s game-ready, pay attention to material separation – in games you want to minimize the number of material slots on the character. Ideally use one material for the whole character (two at most), for example by packing clothing, skin, etc. into one big texture atlas. Paint all necessary maps and test that they look good in a real-time viewer.
- Rigging the Low-Poly: Rig the optimized low-poly mesh (the high-poly is not rigged, it’s just for baking). Ensure the skin weights deform the low-poly smoothly. Sometimes you’ll discover topology changes needed when rigging – it’s an iterative process. Because it’s game-ready, also consider adding any needed helper bones for game animations. For UE, you’ll also later set up physics assets for ragdoll or cloth if needed.
- Animation and Testing: If you have animation data or use existing animations, apply them to see how the character moves. This is where you catch problems: clipping issues, weight paint issues, etc. Tweak accordingly.
- Optimize Mesh and Materials:
- Polycount: Use only as many tris as needed for silhouette and deformation. If the character is a hero viewed up close, you can afford more; if it’s a crowd NPC, use fewer. Use decimation or manual LOD creation to drop polycount when the character is far away. Unreal can auto-generate LODs and even has a LOD Sync feature for characters to swap skeletal mesh detail in step.
- Materials: Reduce the material count – each material typically adds its own draw call and set of textures. Don’t end up with 10 material slots for tiny pieces; game-ready usually means 2-3 at most. Epic’s documentation suggests merging parts and limiting material slots to improve performance.
- Textures: Use appropriate resolution and compress them. Perhaps use an RGB mask texture for metal/roughness/AO to cut down on number of texture files. Generate mipmaps to handle distances. For mobile, consider using compressed formats like ETC or ASTC and smaller size. Also, if multiple characters share texture sets, that could be an atlas approach (but that’s more for crowds).
- Import to Unreal and Finalize: Bring the skeletal mesh into UE5 via FBX. Assign the materials and import animations. Then set up things like Physics Asset for ragdoll collisions, Anim Blueprint for your character’s behavior. If it’s truly game-ready, you likely integrate it with gameplay.
Test LODs, animations, and performance on target hardware to ensure scalability across platforms. A game-ready character combines visual fidelity with optimization, delivering seamless performance in UE5’s real-time environment.
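The LOD planning above can be sketched numerically. A common rule of thumb (an assumption here, not an Epic requirement) is to roughly halve the triangle count per LOD level, with a floor so distant meshes don’t collapse entirely:

```python
def lod_triangle_targets(base_tris, num_lods=4, reduction=0.5, floor_tris=500):
    """Suggest triangle counts for an LOD chain by repeatedly applying
    a reduction factor (halving per level is a common starting point)."""
    targets = [base_tris]
    for _ in range(1, num_lods):
        targets.append(max(int(targets[-1] * reduction), floor_tris))
    return targets

# Hypothetical hero character at 60k triangles:
print(lod_triangle_targets(60_000))  # -> [60000, 30000, 15000, 7500]
```

These targets map directly onto Unreal’s per-LOD reduction settings if you let the engine generate LODs, or serve as goals for manual decimation.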

Can I use MetaHuman Creator to make my own character for Unreal Engine 5?
Yes, MetaHuman Creator is a fantastic (and often the easiest) way to create your own realistic human character for Unreal Engine 5. MetaHuman is a cloud-based app by Epic Games that enables you to build highly detailed digital humans that are rigged and ready for animation in UE5. Here’s what you need to know:
- MetaHuman Overview: MetaHuman is a character creator focusing on realism. You use a web interface (via browser streaming) to customize a base human model. You can adjust facial features, skin complexion, hair style, body proportions, and more. The creator keeps everything within the realm of realistic human anatomy (so you can’t make extreme cartoon proportions, but you can create a diverse range of ethnicities, ages, and appearances).
- Quality and Rigging: MetaHumans are extremely high-quality. The face rig is especially advanced, supporting detailed facial animations through a combination of bone and blendshape (morph target) controls. The body rig is based on Epic’s UE4 mannequin skeleton, so it works with many animations. The asset includes levels of detail (LOD) from LOD0 (highest, for cinematics) down to LOD7 or 8 (for distance). As Epic puts it, “MetaHumans come with eight levels of detail (LODs), enabling you to achieve real-time performance on your target platforms” (unrealengine.com). This means they are game-optimized despite the high detail.
- Using MetaHuman Creator: You need an Epic Games account and internet connection. You log into the MetaHuman Creator (via the Unreal Engine website). There, you can start from a preset human and use sliders or blend presets to shape your character’s face. You can blend between different presets. You select skin tone, hair style, eye color, teeth type, and clothing. It’s intuitive and quite fast. If you have a custom face model, you can use the Mesh to MetaHuman feature in UE5 to convert a sculpt into a MetaHuman by fitting it to the MetaHuman topology, though that requires your mesh to match the MetaHuman base topology fairly closely.
- Exporting to UE5: Once satisfied, you export the MetaHuman to Unreal Engine via Quixel Bridge. In Bridge (logged in with the same account), you’ll find your MetaHumans under the MetaHumans section. Download the assets (choose the detail level, usually highest), and Bridge will import the MetaHuman into your UE5 project. This brings in a MetaHumans folder with your character’s assets: the skeletal mesh, body and face materials, textures, the groom hair and eyebrows, the control rig assets, etc.
- Customization and Animation: Once in UE5, you can use the MetaHuman as you would any character. It’s already rigged with a face rig compatible with Epic’s Live Link Face app or the new MetaHuman Animator. You can animate the body with any standard skeleton animations. You can also open the MetaHuman facial rig in Control Rig and create keyframe animations, or use the pre-made facial poses. For the body, you can retarget UE5 mannequin animations to the MetaHuman.
- Advantages: The biggest advantage is speed and quality. Without needing any modeling, texturing, or rigging, you get a character that could easily take weeks to create by hand. And it’s highly realistic, down to the fingernails and eyelashes. It’s also consistent – since MetaHumans are standardized, you can generate multiple characters and they all work with the same rig and framework.
- Limitations: MetaHuman is limited to realistic humans; it doesn’t support non-human or heavily stylized characters. Hair is restricted to presets (though you can swap in custom grooms), and clothing is preview-only – create custom outfits in UE5 or externally for unique looks.
- Performance: While MetaHumans are optimized with LODs, at the highest quality they are still relatively heavy (the textures and hair particularly). For high-end PC or console, you can use them as main characters. On lower devices, you might use the lower LODs or simplify further. But the fact that Epic uses them in demos suggests they are game-ready for many scenarios.
In summary, MetaHuman Creator simplifies creating realistic human characters for UE5, skipping modeling, UV, texturing, and rigging for a high-quality, animatable asset in minutes. In UE5, customize materials (e.g., add tattoos or dirt) or adjust faces via Mesh to MetaHuman or DNA assets. It’s ideal for developers needing photorealistic humans quickly, leveraging UE5’s capabilities efficiently.
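Conceptually, UE5 swaps a MetaHuman’s LODs based on how large the character appears on screen. The sketch below illustrates the idea; the threshold values are assumptions for illustration, not Epic’s shipped settings:

```python
# Illustrative minimum screen-size thresholds for LOD0..LOD7
# (assumed values; UE5 exposes the real ones per skeletal mesh).
METAHUMAN_LODS = [0.85, 0.6, 0.4, 0.25, 0.15, 0.08, 0.04, 0.02]

def pick_lod(screen_size, thresholds=METAHUMAN_LODS):
    """Return the first LOD whose screen-size threshold is met;
    smaller on-screen size -> higher LOD index (coarser mesh)."""
    for lod, min_size in enumerate(thresholds):
        if screen_size >= min_size:
            return lod
    return len(thresholds) - 1  # very distant: stay on the coarsest LOD

print(pick_lod(0.9))   # close-up -> 0
print(pick_lod(0.05))  # distant  -> 6
```

This is why MetaHumans stay affordable in gameplay: most of the time the renderer is drawing one of the cheaper LODs, not the cinematic LOD0.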

How do I import a custom character into Unreal Engine 5?
Importing a custom character into UE5 involves bringing in the 3D model (skeletal mesh), skeleton, and potentially animations and morph targets, then setting up materials and any needed adjustments. Here’s a step-by-step guide:
- Export from DCC: Ensure you have exported your character from your 3D software in a format UE5 supports. FBX is the most common for skeletal meshes. GLTF or USD can also be used (UE has plugins for them), but FBX is tried-and-true. Your export should include the mesh, the skeleton, and any animations or morphs if you want to bring those in one go.
- Create or Open an Unreal Project: Launch Unreal Engine 5 and open your project. It’s recommended to have a folder structure in your Content Browser for characters (e.g., Content/Characters/MyCharacter).
- Import the FBX: In the Content Browser, click the Import button or simply drag-and-drop the FBX file into the content window. The FBX Import Options dialog will appear. Here’s what to configure:
- Make sure Skeletal Mesh is checked (since this is a character with a skeleton).
- If this is the first time importing this character (new skeleton), leave Skeleton set to None (Unreal will create a new Skeleton asset).
- If you have a Skeleton asset already (say you are importing another animation for the same character or a mesh to an existing rig), you could select the existing Skeleton to use.
- Check Import Mesh.
- Import Animations: If your FBX contains animations (beyond the bind pose), check this to import them as Animation Sequence assets. If it’s just the character in bind pose, you can leave it off.
- Import Morph Targets: If you have blendshapes (morph targets) in the FBX (e.g., facial blendshapes), enable this so Unreal will import those onto the Skeletal Mesh.
- Material Import: You can let Unreal create Materials and import textures if they were embedded, but typically you’ll import textures separately. It’s often fine to let it create placeholder materials which you can later edit or replace.
- Process the Import: Click Import. Unreal will import the skeletal mesh. You should see in your designated folder at least:
- A Skeletal Mesh asset (often with a skeletal mesh icon) named after your FBX or mesh.
- A Skeleton asset (showing a skeleton icon). This Skeleton asset is essentially the list of bones and their hierarchy.
- If animations were imported, you’ll see Animation Sequence assets (filmstrip icon).
- If morphs were imported, they are embedded in the Skeletal Mesh asset (you can see them in the mesh viewer under the Morph Targets list).
- Materials and textures if imported (globe icon for materials, texture icons for images). Often you’ll end up with a default material using the FBX’s material name, which you can edit.
- Assign Materials: Double-click the Skeletal Mesh asset to open the Mesh Editor. The mesh will appear, hopefully with correct geometry. On the right, you’ll see Materials slots (Element 0, Element 1, etc.). If your character had multiple materials (for different mesh parts), they’ll be listed. Now, create or assign proper Materials to each slot:
- If Unreal auto-created materials, you can edit those (double-click the material, then hook up your textures to the material graph).
- Alternatively, import your texture files (PNG/TGA) into Unreal, create new Material assets, set them up with BaseColor, Normal, etc., then assign those materials to the mesh slots.
- After assigning, the character in the Mesh Editor will update with the material applied. Check that things like transparency (for hair cards) or two-sided materials (for thin cloth, if needed) are set correctly in the material properties.
- Skeleton and Physics Asset: Verify the auto-generated Physics Asset for ragdoll collisions, adjusting capsules, spheres, or boxes to fit the mesh’s contours (e.g., aligning with limbs). Remove unnecessary collision bodies to optimize performance and set joint constraints for realistic limits. Simulate in the Physics Asset editor to test ragdoll behavior, ensuring natural physics in-game.
- Animation Blueprint or Retargeting: Preview imported animations in the Mesh Editor to verify functionality, or retarget marketplace animations using UE5’s IK Retargeter. Map bone chains to align motions, adjusting for proportion differences. Create an Animation Blueprint to drive gameplay animations, integrating with state machines for dynamic character behavior in UE5.
- Testing in Engine: Drag the skeletal mesh into a UE5 level to confirm scale (approximately 180 units tall, matching UE’s mannequin) and orientation (facing +X). If misaligned, adjust export settings in your DCC and re-import. Test animations in a level or Animation Blueprint to catch rigging or deformation issues, ensuring the character functions as intended.
- Fixing Issues:
- Materials look wrong: Perhaps the normal map’s green channel needs inversion – Unreal expects DirectX-style (-Y) normal maps, so maps baked with the OpenGL (+Y) convention (e.g., from Blender) may need the Flip Green Channel option enabled in the texture settings. Or the skin looks too shiny – adjust roughness.
- Skeleton bone hierarchy warnings: If your skeleton has non-deforming bones or an unusual hierarchy, Unreal might warn about bones with no influence. Usually this isn’t a problem – you can ignore the warning, or remove the unused bones in your DCC before export if you want a clean hierarchy.
- Scale issues: If your character appears extremely small or large compared to UE’s mannequin, you may need to scale in DCC and reimport (or adjust import uniform scale). Ideally, one Unreal Unit = 1 cm. So a 180 cm tall character should be ~180 units tall in engine.
- Smoothing errors: Export normals and tangents in the FBX from your DCC, or recompute them in UE5’s Mesh Editor to fix shading artifacts like dark patches. Ensure consistent normals and smoothing groups in your DCC, checking for N-gons or thin triangles that cause rendering issues.
With proper preparation, UE5’s robust importer ensures quick integration of your custom character, creating functional assets for gameplay, cinematics, or interactive experiences.
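The scale rule above (1 Unreal unit = 1 cm) is a frequent source of import surprises. This small sketch, using a hypothetical `check_character_scale` helper, shows the arithmetic worth sanity-checking before export:

```python
def check_character_scale(height_in_dcc, dcc_unit_cm=1.0, target_cm=180.0, tol=0.15):
    """Sanity-check a character's height before FBX export.

    Unreal uses 1 unit = 1 cm, so an adult human should land near
    180 units. `dcc_unit_cm` is how many cm one unit in your DCC
    represents (a default Blender scene in meters -> 100.0).
    Returns (height in Unreal units, within-tolerance flag).
    """
    height_uu = height_in_dcc * dcc_unit_cm
    ok = abs(height_uu - target_cm) / target_cm <= tol
    return height_uu, ok

# Blender scene in meters: a 1.8-unit-tall mesh exports correctly.
print(check_character_scale(1.8, dcc_unit_cm=100.0))  # -> (180.0, True)
# Same mesh exported assuming cm: it arrives tiny in Unreal.
print(check_character_scale(1.8, dcc_unit_cm=1.0))    # -> (1.8, False)
```

If the check fails, either fix the scene units in your DCC or compensate with the Import Uniform Scale value in the FBX import dialog.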

What is PixelHair and how does it help with 3D hair for Unreal Engine 5?
PixelHair is a solution for creating and using realistic 3D hair, particularly integrating Blender and Unreal Engine 5 (including MetaHumans). Here’s more detail:
- PixelHair Overview: PixelHair is a specialized solution offering pre-made, realistic hairstyles as Blender particle systems, complete with a hair cap mesh and detailed strands for styles like braids, afros, ponytails, or flowing locks. Designed for seamless UE5 integration, these assets are optimized for visual fidelity and performance, making them ideal for MetaHumans or custom characters in games or cinematics.
- How PixelHair Works: Each PixelHair asset includes a Blender file with a pre-built hairstyle and a hair cap mesh. The cap is shrinkwrapped to your character’s head for a precise fit, ensuring no gaps or clipping. The particle system generates hair strands, which can be adjusted for length, density, or shape, providing a flexible starting point for UE5 hair workflows.
- Customization: In Blender, tweak the particle hair system to customize strand density, curl patterns, thickness, or color gradients (e.g., root-to-tip fades). PixelHair exposes comprehensive settings, allowing artists to adapt hairstyles to match character designs, from subtle refinements to dramatic redesigns. This ensures unique, tailored looks for diverse UE5 projects.
- Exporting to Unreal:
- Alembic Groom: Unreal Engine supports importing hair groom data (ABC files). Blender can convert its particle hair to curves and export as Alembic. PixelHair tutorials show how to do this. Once imported, UE5 can render the hair as strands (using the Groom component) and even simulate it.
- Hair Cards: Some PixelHair assets might also include card-based hair (planes with textures). Those can be exported as a standard mesh with a material. However, the primary PixelHair method seems to focus on using Blender’s hair particle (strand) data.
- PixelHair documentation indicates it’s designed to be exported: “Pixel Hair can be exported to Unreal Engine and used with any MetaHuman of your choice.” So it explicitly supports attaching to MetaHuman heads in Unreal. For MetaHumans, one usually binds the groom to the MetaHuman skeleton or attaches it to the head via the Groom component.
- MetaHuman Integration: PixelHair hairstyles integrate with MetaHuman characters via UE5’s Groom component. Import the Alembic groom and bind it to the head socket in the MetaHuman Blueprint, expanding the limited default hair options. This enables unique styles like intricate braids or voluminous afros, enhancing MetaHuman customization.
- Benefits:
- Time-saving: making realistic hair from scratch is notoriously difficult and time-consuming. PixelHair gives you a high-quality starting point (the hairstyles are crafted by an artist who focused on realism).
- Game-ready: The PixelHair assets are built to have a reasonable poly count via hair cards or optimized strands, and they come with a hair cap that helps performance (since hair often uses a translucent material, the cap covers the scalp so not too many layers of transparency overlap).
- Consistent workflow: Artists can follow the PixelHair tutorial to apply any hairstyle to their character, whether it’s a MetaHuman or a unique character, and get a reliable result in Unreal Engine.
- Realism: The volume and shading of PixelHair are crafted to look realistic, using techniques like a dense hair cap mesh (about 18,000 polys) that the hair strands emit from, ensuring no bald spots and proper head coverage (flippednormals.com). It also likely comes with well-tuned hair shaders (root-to-tip color variation, roughness, etc.).
Using PixelHair Step-by-Step (Simplified):
- In Blender, append or open the PixelHair hairstyle.
- Fit the hair cap to your character’s head (using shrinkwrap modifier as provided).
- Adjust the hair particles if necessary.
- Export the hair. If using Alembic: convert particle to curves and export as .abc (remember to set scale to cm for UE).
- In Unreal, import the Alembic groom (enable Groom plugin). Apply a hair material (PixelHair usually provides a hair material/texture for the strands).
- Attach the groom to your character’s head (if using a MetaHuman, attach via Groom component on the face skeleton; if a custom character, attach to a socket or as a child of head bone).
- Set up physics if desired (Unreal’s Groom can use the Niagara or Chaos physics for hair simulation).
- Tweak shading or density in engine if needed.
PixelHair streamlines the creation of realistic, customizable 3D hair, bridging Blender and UE5 to deliver high-quality hairstyles that elevate character visuals with minimal effort.
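Groom density is the main performance lever when exporting strand hair. As a conceptual illustration (not part of PixelHair itself), this sketch keeps an evenly spaced subset of strands – the same idea as lowering particle counts in Blender before the Alembic export:

```python
def decimate_strands(strands, keep_fraction):
    """Keep an evenly spaced subset of hair strands, a simple way to
    trade groom density for performance before Alembic export."""
    if not 0 < keep_fraction <= 1:
        raise ValueError("keep_fraction must be in (0, 1]")
    step = max(1, round(1 / keep_fraction))
    return strands[::step]

strands = list(range(10_000))  # stand-ins for strand curves
print(len(decimate_strands(strands, 0.25)))  # -> 2500
```

In practice you would do this with Blender’s particle count or child-hair settings rather than raw curve lists, but the trade-off is the same: fewer strands imported means a cheaper Groom component in UE5.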

How do I create facial expressions and blendshapes for UE5 characters?
Creating expressive facial animations for your character often involves using blendshapes (morph targets) and/or bone-driven controls for the face. Here’s how you can create facial expressions and set up blendshapes for use in Unreal Engine 5:
- Modeling Blendshapes (Morph Targets): In Blender or Maya, sculpt alternate mesh shapes for facial expressions like smiles, frowns, squints, or phonemes (e.g., “A,” “O”). Use Blender’s Shape Keys to store morphs or Maya’s duplicated meshes for each expression. Ensure shapes are distinct yet blend smoothly when combined, testing in your DCC to avoid conflicts. Export these as morph targets for UE5’s animation system.
- Following Standards (Optional): Adopt Apple ARKit’s 52 blendshape standard (e.g., “jawOpen,” “eyeBlinkLeft,” “mouthPucker”) for compatibility with motion capture tools like Live Link Face or MetaHuman Animator. These standardized shapes align with MetaHuman rigs, enabling seamless integration with UE5’s facial animation pipelines. Even partial adoption (e.g., 10-20 key shapes) simplifies animation workflows for expressive characters.
- Bone vs Blendshape: Blendshapes excel for detailed lipsync and subtle expressions like smirks or blinks, offering fine control over mesh deformations. Bones are better for broader movements, like jaw rotation or eye tracking, and are lighter for performance in crowd scenes. Hybrid rigs combine both, using bones for major motions and blendshapes for micro-expressions, balancing quality and efficiency in UE5.
- Implementing in the DCC: Create the blendshapes in your DCC and add them to the mesh. In Blender these would appear as Shape Keys on the mesh data. Name them clearly (e.g., “Smile”, “Frown”, or “ARKit_JawOpen” etc.). Test combining them to ensure they work together reasonably (some shapes might need corrective shapes if used together, but that’s advanced).
- Export to Unreal: Enable Shape Keys in Blender’s FBX export settings under Geometry, or include blendshapes in Maya’s FBX export. During UE5’s FBX import, check Import Morph Targets to include them in the Skeletal Mesh. Open the Mesh Editor’s Morph Targets tab to verify all shapes imported correctly, re-exporting from your DCC if any are missing or misnamed.
- Animating Facial Expressions:
- Using Animation Data: In UE5’s Sequencer, create animation sequences and keyframe morph target weights to drive expressions (e.g., ramping “Smile” from 0 to 1). Add audio tracks for lipsync, keying phoneme shapes like “jawOpen” or “mouthWide” to match dialogue. This manual approach offers precise control for cinematic or gameplay animations.
- Using Live Capture / Driving by Blueprint: If you have face tracking (like Live Link Face app), you can drive the morph targets in real-time via LiveLink. That requires mapping the incoming blendshape names to your morph names, often done in a LiveLink anim blueprint. There’s also the new MetaHuman Animator which produces an animation asset you can apply to MetaHuman or even retarget to your custom character if you use same ARKit names.
- Using Control Rig: In UE5, an alternative is to create a Control Rig for your face which directly manipulates morph targets. For example, you can have a control for “Smile” that when moved sets the Smile morph weight. Control Rig allows creating IK-like behavior with morphs too (though morphs are linear, you could e.g. drive a cheek puff morph when jaw bone moves, etc., via Control Rig logic).
- Bone-based Animation: If you have facial bones (like jaw, eyeballs, tongue, maybe bones for eyebrows or lips), those will come in with the skeleton. You can animate them like any bones via animation keyframes or Control Rig. Often, bone and morph are combined (e.g., jaw open might be a bone rotation plus a morph for lip spread).
- Testing Expressions: Make an Animation Montage or Sequence that sets some expressions and play it on your character to ensure everything works. Check that morph targets don’t cause unintended mesh issues (like inner mouth poking through face on extreme shapes – if so, you may need to sculpt a better morph shape or add a corrective morph).
- Performance Consideration: Limit morph targets to 20-40 for main characters to reduce CPU load, using bones for distant LODs or crowd characters. Bake complex animations into sequences for heavy scenes to optimize performance. This balances expressive animations with UE5’s real-time rendering demands.
- Example Workflow: Say you want a basic talking character. You might create blendshapes for JawOpen, MouthSmile, MouthFrown, EyeBlink (L/R), BrowUp, BrowDown. Import them. In Unreal, use Sequencer to animate a dialog: create keys where JawOpen goes to 1 for an “ahh” sound, then back, blink morph triggers occasionally, etc. The result is your character’s face animates. Or for gameplay, use an Animation Blueprint to set morph values based on AI or player input (e.g., press a key to smile, or health low triggers a sad frown expression by setting that morph).
- Metahuman vs Custom: Note that MetaHumans have a very elaborate facial rig (with many morphs following ARKit). For your own character, you can choose how elaborate to get. If it’s important to have a wide range of expression, investing time in creating those blendshapes is worth it. Tools like FaceIt (Blender addon) can generate ARKit blendshapes for your model semi-automatically, which might help if doing it manually is too time-consuming.
Blendshapes provide precise control for lifelike facial expressions, driven by keyframes, motion capture, or Control Rig, enhancing your character’s emotional depth and interactivity in UE5.
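Under the hood, a morph target animation is just a weight curve per shape. This sketch evaluates such a curve with linear interpolation, the way a keyframed Sequencer track drives a morph (the "jawOpen" name follows the ARKit convention mentioned above; the key values are made up for illustration):

```python
def morph_weight(keys, t):
    """Evaluate a morph target's weight at time t from (time, weight)
    keyframes using linear interpolation, clamped to [0, 1] as standard
    morph target curves expect."""
    keys = sorted(keys)
    if t <= keys[0][0]:
        w = keys[0][1]
    elif t >= keys[-1][0]:
        w = keys[-1][1]
    else:
        for (t0, w0), (t1, w1) in zip(keys, keys[1:]):
            if t0 <= t <= t1:
                w = w0 + (w1 - w0) * (t - t0) / (t1 - t0)
                break
    return min(max(w, 0.0), 1.0)

# Hypothetical "jawOpen" curve for an "ahh" sound: open, hold, close.
jaw_open = [(0.0, 0.0), (0.2, 1.0), (0.5, 1.0), (0.7, 0.0)]
print(morph_weight(jaw_open, 0.1))  # -> 0.5 (halfway through opening)
```

Whether the weights come from Sequencer keys, Live Link capture, or Control Rig logic, the engine resolves them to a per-frame value like this before deforming the mesh.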

Can I use Character Creator or Daz 3D for UE5 character creation?
Yes, you can absolutely use Character Creator (by Reallusion) or Daz 3D to create characters and bring them into Unreal Engine 5. These are dedicated character creation tools that can significantly speed up the workflow by providing premade models, morphs, rigs, and even clothing. Here’s how they fit in:
- Character Creator (CC): Reallusion’s Character Creator 4 (CC4) is a powerful tool for crafting detailed characters, offering morphing sliders for body and face shapes, high-resolution skin texturing, and a library of clothing and accessories. Its Auto Setup plugin for Unreal automates FBX exports, generating UE5-compatible materials, Digital Human Shaders, and skeletal meshes. CC supports ZBrush integration via GoZ for custom sculpts, making it ideal for both rapid prototyping and professional-grade character pipelines in UE5.
- Daz 3D (Daz Studio): Daz Studio is a free platform with Genesis 8 and 9 models, providing customizable morphs for body types, facial features, and expressions, plus extensive texture and clothing libraries. The Daz to Unreal Bridge automates exports, creating rigged characters with materials and morphs for UE5. It’s great for quickly building realistic humans or prototyping, though careful customization (e.g., unique morphs, textures) avoids the generic “Daz look” often seen in default models.
- Workflow Considerations:
- Rigging: Both CC and Daz characters come fully rigged, meaning you don’t need to rig them yourself. They typically use standard human bone structures (CC offers an Unreal preset that matches the Epic skeleton well; Daz uses its Genesis skeleton, which is different but retargetable).
- Importing Animations: If you use CC, you can choose to use its preset that matches the Unreal mannequin skeleton, making retargeting trivial (or directly using Unreal’s skeleton). If using Daz, you will likely retarget animations from the Unreal mannequin to the Daz skeleton using Unreal’s retargeter. The Bridges often handle basic retarget setup but you might need to adjust bone mappings.
- Materials: The Auto-Setup for CC will create materials using a special Digital Human Shader for skin in Unreal, which looks quite good (subsurface scattering etc.). Daz’s bridge will set up materials too, but you might need to tweak them (e.g., adjusting skin translucency or converting Daz’s shader maps to a proper UE skin material).
- Morphs: Both CC and Daz can bring in a bunch of morph targets (for facial expressions, body morphs). If you enable them on export, these will be available in Unreal, which is great for character customization in-game or for facial animation. For example, you could drive Daz facial morphs in UE to do lip sync, or allow the player to adjust their character’s appearance via morph sliders if you plan a character customization screen.
- Performance: CC and Daz characters can be on the heavier side polycount-wise (especially Daz, which often caters to rendering). Usually, you might want to decimate or create LODs for them. CC4 has LOD tools and can create LODs on export. Daz doesn’t automatically LOD, so you might rely on Unreal’s auto LOD. Also, hair from these sources is often high poly (or lots of cards); consider optimizing if needed.
- Combining with MetaHumans: Interestingly, some workflows use CC or Daz to create a base and then convert to MetaHuman using Mesh to MetaHuman (for the head) to get the best of both worlds (the unique look from CC/Daz plus the MetaHuman rig). This is advanced but shows you can mix pipelines.
Character Creator and Daz 3D accelerate UE5 character creation with pre-rigged, textured assets, saving significant time. Verify licensing terms for commercial use to ensure compliance in your game projects.
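The morph targets mentioned above are, under the hood, per-vertex offsets scaled by a slider weight. A minimal sketch of that math (plain Python with hypothetical two-vertex data; engines evaluate this on the GPU):

```python
# Minimal sketch of morph-target (blend-shape) math: each morph stores
# per-vertex offsets from the base mesh, scaled by a 0-1 slider weight.
# Vertex data here is hypothetical; UE5 applies the same idea per frame.

def apply_morphs(base_verts, morphs, weights):
    """Blend one or more morph targets onto the base vertex positions."""
    result = [list(v) for v in base_verts]
    for name, offsets in morphs.items():
        w = weights.get(name, 0.0)
        if w == 0.0:
            continue
        for i, (dx, dy, dz) in enumerate(offsets):
            result[i][0] += w * dx
            result[i][1] += w * dy
            result[i][2] += w * dz
    return [tuple(v) for v in result]

base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
smile = {"smile": [(0.0, 0.0, 0.0), (0.0, 0.2, 0.0)]}  # lifts a mouth corner
posed = apply_morphs(base, smile, {"smile": 0.5})
# posed[1] is (1.0, 0.1, 0.0): the offset applied at half strength
```

Driving a Daz or CC morph in Unreal (for lip sync or a customization slider) is this same weighted sum, with the weight animated over time or exposed to the player.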

How do I retarget animations to a custom character in Unreal Engine 5?
Here’s how you retarget animations to your custom character:
- Ensure Both Skeletons are Compatible: Verify that the source skeleton (e.g., UE4 Mannequin) and your custom target skeleton share similar humanoid hierarchies, with comparable bone counts, naming conventions, and joint placements. Aligning your rig with UE’s mannequin skeleton enables direct use of marketplace animations without retargeting. For differing skeletons, detailed bone mapping in UE5’s IK Retargeter ensures accurate motion transfer.
- Use the same skeleton asset as the source (in which case no retarget needed – if you rigged your custom character to the UE mannequin skeleton, you can directly use the animations).
- Or have a unique skeleton but with similar structure.
- Set Up Retargeting Assets (IK Retargeter in UE5): In UE5, you’d create an IK Retargeter asset. Here’s the procedure:
- Go to Content Browser, click Add New -> Animation -> IK Retargeter.
- It will prompt you to pick a source IK Rig. You must first have IK Rig assets set up for both the source and target skeletons, so:
- Create an IK Rig asset for the source skeleton (e.g., UE4 Mannequin IK Rig). Unreal might provide one for the Mannequin in the engine content, or you can create it and add goals for hands and feet.
- Create an IK Rig for your custom character’s skeleton.
- In each IK Rig, define the Retarget Root (usually the pelvis or root bone) and add IK goals if needed (like hand_l, hand_r, foot_l, foot_r). Even if you don’t use IK, these rigs serve as a definition of bone chains.
- Now create the IK Retargeter asset using the source IK Rig (say UE4 mannequin IK Rig). In the Retargeter editor, choose your target IK Rig (your character).
- The editor will show both skeletons side by side. You then map bone chains: e.g., map “spine” chain of source to “spine” of target, “leftArm” to “leftArm”, etc. If you named bones similarly, it auto-maps a lot. If not, you assign them. Also set up root translation retargeting (like how to handle root motion scaling).
- You can also adjust settings like scale compensation if characters have different proportions (for example, if target is much taller, you might enable root retargeting to adjust stride).
- After mapping, you can preview any source animation on the target in that IK Retargeter window by selecting an asset from the dropdown. Adjust settings until the motion looks right (you may need to tweak things like pelvis height or IK offsets to make feet plant correctly, etc.).
- Export Retargeted Animations: In the IK Retargeter, you can select multiple animations from the source and click “Export Selected Animations”. It will generate new animation assets for your target character. These new animations are now tied to your custom character’s Skeleton asset.
- Alternate Method (Legacy Retarget Manager): If your engine version still includes the UE4-era Retarget Manager, it offers simpler retargeting, mapping humanoid rigs by selecting source and target skeletons and then right-clicking animations in the Content Browser to retarget them directly. It’s less precise than the IK Retargeter but effective for basic setups or quick tests, though it may require more manual cleanup.
- Post-Retarget Cleanup: Refine retargeted animations to address issues like finger misplacements, foot sliding, or unnatural limb rotations. Adjust bone mappings or IK settings in the IK Retargeter, or edit keyframes in UE5’s Animation Editor for precision. Thorough cleanup ensures animations look natural and polished on your custom character.
- Using the Retargeted Animations: Apply retargeted animations in UE5’s Animation Blueprints for gameplay-driven motion or Animation Montages for cinematic sequences. These animations function like native assets, adapting to your character’s proportions and enabling dynamic behaviors, such as state-driven transitions in combat or exploration.
- Example: If you have a bunch of marketplace animations for a standard mannequin (running, jumping, etc.) and you made a custom character in Blender with a different rig, you import your character and then use IK Retargeter mapping. After retarget, your character can play all those animations as if they were made for it. This saves a huge amount of animation work.
- Retargeting Between Different Species: If the skeletons are very different (e.g., a wolf quadruped vs. a human), retargeting is more complex and might not yield perfect results since the motions are fundamentally different. Retargeting is best for similar structure characters (human to human, maybe human to MetaHuman, etc.). For creatures, you often need bespoke animations.
Retargeting with UE5’s IK Retargeter streamlines the process of adapting animations across humanoid skeletons, saving animation creation time while ensuring smooth, compatible motions for your custom character in Unreal Engine 5.
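The bone-chain auto-mapping step above can be sketched in plain Python (chain names are illustrative; the real IK Retargeter editor performs a similar name match when source and target chains line up, leaving the rest for you to assign by hand):

```python
# Sketch of name-based chain auto-mapping, as the IK Retargeter does when
# you map bone chains: normalize names, pair matches, and report leftovers
# that need manual assignment. Chain names here are illustrative only.

def normalize(name):
    return name.lower().replace("_", "").replace("-", "")

def auto_map_chains(source_chains, target_chains):
    targets = {normalize(t): t for t in target_chains}
    mapped, unmatched = {}, []
    for s in source_chains:
        t = targets.get(normalize(s))
        if t:
            mapped[s] = t
        else:
            unmatched.append(s)  # assign these manually in the editor
    return mapped, unmatched

src = ["Spine", "LeftArm", "RightArm", "LeftLeg", "RightLeg"]
tgt = ["spine", "left_arm", "right_arm", "left_leg", "tail"]
mapped, unmatched = auto_map_chains(src, tgt)
# "RightLeg" has no counterpart, so it lands in unmatched
```

This is why consistent bone and chain naming across your DCC rig and the mannequin pays off: the closer the names, the less manual mapping the retargeter needs.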

What file formats work best when sending characters to UE5?
When transferring characters (and their associated data like meshes, rigs, animations) from external software to Unreal Engine 5, certain file formats are more suitable. Here are the most common and “best” formats to use:
- FBX (Filmbox): FBX is the workhorse format for getting skeletal meshes and animations into Unreal. It’s widely supported by all major 3D software (Blender, Maya, Max, etc.). Use FBX for skeletal meshes (characters with rigs), static meshes, and animations. It supports skinning (bone weights), morph targets, and animation keyframes. Unreal’s FBX import pipeline is robust and well-documented. Typically:
- glTF (GL Transmission Format): glTF is a lightweight format supporting meshes, PBR materials, textures, and basic rigs, ideal for static or simple characters. Enable UE5’s glTF importer plugin (under Edit > Plugins) to use it, but note its limited support for complex skeletal animations compared to FBX. glTF suits cross-platform workflows or lightweight assets, though FBX remains more robust for full character pipelines.
- OBJ (Wavefront OBJ): OBJ is a simple format for static meshes, such as clothing, accessories, or non-animated props, but it lacks support for rigging, animations, or morph targets. Use OBJ for character components that don’t require deformation, combining with FBX for rigged assets. Its universal compatibility makes it a reliable choice for basic geometry imports into UE5.
- Alembic (.abc): Alembic is specialized for dynamic assets like hair grooms, cloth simulations, or vertex animations (e.g., facial caches). Export Blender’s particle hair as Alembic curves for UE5’s Groom component, enabling strand-based rendering with physics. Alembic is not suited for standard skeletal animations but excels for specific dynamic elements, complementing FBX in UE5 workflows.
- USD (Universal Scene Description): USD supports complex scenes with meshes, materials, animations, and hierarchies, making it ideal for virtual production or collaborative pipelines. Use UE5’s USD importer for advanced workflows, such as importing entire character setups with environments. For game-ready characters, FBX is simpler and more practical, as USD’s complexity can be overkill for standard asset imports.
- Reallusion and Daz Bridges: While not exactly general-purpose formats, it’s worth mentioning that Character Creator and Daz have their own pipeline formats (essentially FBX under the hood, but automated). They might use JSON configs along with FBX. If you’re using those, you’d follow their specific instructions rather than manually dealing with formats.
- Materials and Textures: No matter what 3D format you use, materials usually don’t transfer 1:1. You will typically import texture images (PNG, TGA) separately. FBX does carry basic material info and will create material slots in Unreal, but you often need to manually hook up textures. glTF can carry PBR material definitions (baseColor, normal, etc.), and Unreal’s glTF importer might use that to set up a material automatically. But it’s always good to know you’ll do some touch-up on materials in engine.
- Animations: For animations, FBX is again the best format (each animation can be an FBX). Alternatively, Unreal can import BVH motion capture files via third-party plugins or convert them to FBX first. But most animation data exchange is via FBX.
- Rigging Data: Note that rig controls (IK handles, constraints, drivers) from Maya or Blender don’t export to FBX; the format retains only the skeleton and baked animation. If you built a complex control rig, bake the animation down to bones before export. Formats like USD may one day carry rig logic, but for now, plan on baking.
FBX is the most versatile and reliable format for UE5 characters, with Alembic for hair grooms and OBJ for static components, ensuring seamless integration into Unreal’s real-time rendering pipeline.
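To make the FBX recommendation concrete, here is a sketch of typical Blender-side export options for a rigged character, expressed as the keyword arguments for `bpy.ops.export_scene.fbx` (the option names are Blender’s FBX exporter options; the values are common starting points for a UE5 pipeline, not the only valid choices):

```python
# Typical Blender -> UE5 FBX export settings for a skeletal mesh, written
# as the kwargs you would pass to bpy.ops.export_scene.fbx(...) inside
# Blender. Values are common starting points, not hard requirements.

UE5_CHARACTER_FBX_SETTINGS = {
    "use_selection": True,           # export only the selected mesh + armature
    "object_types": {"ARMATURE", "MESH"},
    "global_scale": 1.0,             # with scene Unit Scale set to 0.01
    "apply_unit_scale": True,
    "use_mesh_modifiers": True,
    "mesh_smooth_type": "FACE",      # export smoothing so UE5 doesn't warn
    "add_leaf_bones": False,         # avoid extra "_end" bones on import
    "primary_bone_axis": "Y",
    "secondary_bone_axis": "X",
    "bake_anim": True,               # bake actions to bone keyframes
}

# Inside Blender you would then call:
# import bpy
# bpy.ops.export_scene.fbx(filepath="character.fbx",
#                          **UE5_CHARACTER_FBX_SETTINGS)
```

Disabling leaf bones and baking animation are the two settings that most often trip up first-time Blender-to-Unreal exports.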

How do I optimize a custom 3D character for real-time performance in UE5?
Optimizing a character for real-time means ensuring it looks good and runs efficiently, especially if you have many characters on screen or are targeting lower-end hardware. Here are key areas to focus on:
- Polygon Count and LODs: Target 30k-60k triangles for main characters and 10k-20k for NPCs, with under 10k for mobile platforms to ensure real-time performance. Create multiple Levels of Detail (LODs) using UE5’s auto-LOD tools or manual decimation in Blender/Maya, reducing polygons at greater distances. LODs are critical for large scenes, maintaining visual quality while minimizing rendering costs, as seen in MetaHuman’s 8-LOD approach.
- Materials and Draw Calls: Combine UVs into a single texture atlas to use one material, significantly reducing draw calls for efficient rendering. Limit characters to 2-3 materials, using Material Instances for variations like color or dirt levels without recompiling shaders. Avoid excessive material slots, as each increases rendering overhead, impacting UE5’s performance in complex scenes.
- Texture Optimization: Use 2K or 4K textures for main characters, dropping to 1K for NPCs or mobile to balance detail and memory usage. Apply BC1 compression for color maps and BC5 for normals, packing Roughness, Metallic, and AO into one RMA texture to save VRAM. Generate mipmaps and adjust LOD bias in UE5’s Texture Editor to optimize rendering at various distances, aligning with platform-specific budgets.
- Animations and Bones: Restrict skeletons to ~100 bones, removing non-essential ones (e.g., facial bones for NPCs) to reduce skinning costs. Limit vertices to 4 bone influences and implement bone LODs for distant characters, simplifying calculations. This optimization is crucial for crowd systems or large-scale scenes, ensuring smooth animation performance in UE5.
- Physics and Cloth: Apply Chaos cloth simulation selectively for elements like capes or skirts, disabling it at distance via LODs to save processing power. Use simplified physics bodies (capsules or boxes) for ragdolls, optimizing collision complexity in the Physics Asset editor. Test Chaos cloth stability in various poses to balance realism with performance, avoiding costly simulations in dense scenes.
- Shadow and Lighting Costs: Use lower LODs for shadow casting to reduce shadow rendering costs, enabling Capsule Shadows for distant characters in Project Settings. Adjust Max Shadow LOD in the Skeletal Mesh settings to prioritize performance. Test in UE5’s Shader Complexity view to identify shadow bottlenecks, ensuring efficient lighting interactions for your character.
- Groom/Hair Optimization: Implement LODs for strand-based hair, transitioning to hair cards at distance to minimize transparency and rendering costs. Optimize hair cards with minimal polygon counts and efficient opacity maps, reducing overdraw. PixelHair’s optimized grooms offer a model for balancing realism and performance, suitable for UE5’s hair rendering pipeline.
- Blueprint and Tick: Disable unnecessary Animation Blueprint nodes, like IK or complex logic, for off-screen characters to reduce CPU load. Enable Visibility Based Anim Tick in the Skeletal Mesh settings to limit animation updates to visible characters. Optimize tick events in Blueprints, using event-driven logic to minimize performance impact in gameplay.
- Testing and Profiling: Use UE5’s profiling tools (stat unit, stat skeletalmesh, ProfileGPU) to monitor CPU and GPU usage, identifying bottlenecks in polygon counts, texture sizes, or animation complexity. Adjust assets based on profiling data, testing on target hardware (e.g., mid-range PCs, consoles) to ensure smooth frame rates. Regular profiling catches issues early, optimizing for real-time performance.
- LOD for Animations: Use simplified animations (e.g., idle poses) or static meshes for distant characters, leveraging UE5’s crowd systems or impostors for large groups. This reduces animation processing costs while maintaining visual coherence. Test LOD transitions in-game to ensure seamless shifts without noticeable pops or stuttering.
- Use Unreal Engine Features: Enable GPU Skinning in Project Settings to offload skinning calculations to the GPU, improving performance for rigged characters. Use Ref Pose optimization to reduce skeletal mesh updates for static poses. Ensure frustum culling is active to prevent rendering off-screen characters, leveraging UE5’s built-in features for efficiency.
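To see why the compression choices above matter, here is a rough VRAM estimate for a 2K character texture set (BC1 stores 0.5 bytes per pixel, BC5 stores 1 byte per pixel, and a full mip chain adds about a third; the exact figures are format properties, while the three-texture set is an assumed example):

```python
# Rough VRAM estimate for block-compressed character textures, showing
# why BC1/BC5 and RMA channel packing matter. BC1 = 0.5 bytes/pixel,
# BC5 = 1 byte/pixel; a full mip chain adds roughly one third.

def texture_bytes(width, height, bytes_per_pixel, mips=True):
    size = width * height * bytes_per_pixel
    return size * 4 // 3 if mips else size

# Assumed 2K character set: BC1 base color, BC5 normal, BC1 packed RMA
base_color = texture_bytes(2048, 2048, 0.5)
normal_map = texture_bytes(2048, 2048, 1.0)
rma_packed = texture_bytes(2048, 2048, 0.5)
total_mb = (base_color + normal_map + rma_packed) / (1024 * 1024)
# roughly 10.7 MB, versus about 64 MB for the same three maps stored
# uncompressed at 4 bytes/pixel with mips
```

Packing Roughness/Metallic/AO into one BC1 texture instead of three grayscale maps is where most of that saving comes from.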
Optimization ensures your character delivers high visual quality while maintaining real-time performance in UE5, using LODs, material efficiency, and physics tuning to meet gameplay demands across platforms.
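The LOD budgets discussed above can be sketched as a simple halving schedule (the base budgets are this article’s suggested targets, not engine-enforced limits; UE5’s auto-LOD settings express the same idea as a percentage of triangles per LOD):

```python
# Sketch of a LOD triangle budget: each LOD halves the previous count,
# mirroring what a percent-based auto-LOD reduction produces. The base
# counts follow this guide's suggested targets, not engine rules.

def lod_schedule(base_triangles, num_lods=4, reduction=0.5):
    """Return triangle budgets for LOD0..LOD(num_lods-1)."""
    return [int(base_triangles * reduction ** i) for i in range(num_lods)]

hero = lod_schedule(60_000)     # [60000, 30000, 15000, 7500]
npc = lod_schedule(20_000, 3)   # [20000, 10000, 5000]
```

With screen-size thresholds attached to each entry, a crowd of NPCs mostly renders at the cheap tail of the schedule while the hero keeps LOD0 in close-ups.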

How do I set up physics and cloth simulation for characters in UE5?
Unreal Engine 5’s physics (Chaos physics system) can simulate ragdoll physics, cloth, and other physical elements on characters for added realism. Setting up physics and cloth involves a few different systems:
- Physics Asset for Ragdoll and Rigid Bodies: When you import a skeletal mesh (with Create Physics Asset enabled), Unreal generates a Physics Asset (PhAT) with collision bodies (capsules, spheres, boxes) mapped to bones (e.g., pelvis, spine, forearms). To set it up:
- Open the Physics Asset editor by double-clicking the asset in the Content Browser, displaying the mesh with wireframe colliders overlaid.
- Adjust collider shapes to tightly fit mesh parts, ensuring minimal gaps and slight overlaps at joints (e.g., upper/lower arm capsules overlapping at the elbow to mimic a hinge joint). Remove non-critical bodies (e.g., individual finger or toe colliders) to optimize performance.
- Configure constraints for each joint: set angular limits (e.g., knee restricted to 0-130 degrees to prevent backward bending), twist limits (e.g., ±10 degrees for spine rotation), and swing limits (e.g., shoulder cone angles) to enforce realistic motion boundaries.
- Test ragdoll by clicking “Simulate” in the editor; the character should collapse with natural weight distribution (e.g., torso heavier than arms). Fine-tune mass (e.g., 20kg for torso, 5kg for limbs) and damping (e.g., 0.1-0.5 for fall resistance) to achieve lifelike falling behavior.
- In-game, activate ragdoll by calling Set Simulate Physics on the skeletal mesh component (e.g., on character death or impact). Leverage collision bodies for precise hit detection (e.g., bullet impacts) or interactions with environmental forces (e.g., explosions or wind).
- Cloth Simulation (Chaos Cloth): For cloth elements like capes, skirts, or flowing coat tails:
- Import cloth as part of the skeletal mesh, weight-painted in your DCC (e.g., Blender, Maya) to follow key bones (e.g., pelvis, spine, or shoulder bones for initial positioning, ensuring the cloth tracks the rig before simulation).
- In the Skeletal Mesh asset, access the Cloth Paint tool via the toolbar. Select the cloth section by material or LOD mesh section (e.g., cape material), then click “Create Clothing” to generate a Clothing Data asset defining simulation vertices.
- Paint vertex weights: assign 0 for fixed areas (e.g., skirt’s waistband pinned to the rig), 1 for fully simulated areas (e.g., skirt’s hem for free movement), and gradients (e.g., 0 at waist to 1 at hem) or partial values (e.g., 0.5 for semi-rigid sections) for nuanced flexibility and natural drape.
- Assign the Clothing Data to the mesh section (auto-assigned if created from selection). Adjust properties in the Clothing Data or Chaos Cloth Material, such as cloth thickness (e.g., 0.1-0.5 cm), gravity scale (e.g., 0.8 for lighter flutter), bending stiffness (e.g., 10-50 for resistance), and damping (e.g., 0.1 for motion settling).
- Enable Teleport mode in cloth config to prevent jitter or stretching during character teleports (e.g., level transitions). Test in-editor with character movement, iterating to minimize issues like excessive stretching, unnatural flapping, or simulation instability.
- Physics for hair or accessories: For dynamic elements like ponytails, chains, belts, or dangling jewelry:
- Use the Physics Asset to create rigid bodies with constraints (e.g., ponytail segments as a chain of small capsules, each linked by constraints with limited swing angles, such as ±30 degrees, for controlled swaying motion).
- Implement AnimDynamics or RigidBody nodes in the Animation Blueprint for spring-based physics on specific bones (e.g., tail, cape, or earring bones). Configure parameters like stiffness (e.g., 100 for tight response), damping (e.g., 0.7 for settling), and angular limits (e.g., ±45 degrees) for natural jiggle or bounce.
- Treat hair cards as cloth using the Cloth Paint tool, applying weights (e.g., 0 at scalp, 1 at tips) for flexible simulation, or use strand-based Groom physics by adding a Groom Component to the skeletal mesh, enabling Niagara/Chaos physics in the Groom asset properties, and adjusting parameters like strand stiffness or gravity for realistic hair motion.
- Collisions with character: Ensure cloth or hair collides with the body to prevent penetration. In cloth config, enable collision with Physics Asset’s primitives (e.g., torso, arm, or leg capsules). Increase collision thickness (e.g., 1-2 cm) or add simplified capsules (e.g., cylindrical leg colliders under a skirt) for robust collision. Test in various poses (e.g., sitting, running) to verify no clipping, adjusting collider placement or thickness as needed.
- Chaos vs Legacy PhysX (Apex): UE5 defaults to Chaos Cloth (confirm the Chaos Cloth plugin is enabled under Edit > Plugins). As Chaos is still maturing, fine-tune parameters like stiffness, damping, or collision settings to achieve stable simulations, addressing issues like stretching or jitter through iterative testing.
- In-Game Activation: Cloth simulates automatically if configured (ensure “Clothing Simulation Factory” in Project Settings is set to Chaos). Trigger ragdoll with Set Simulate Physics for full physics takeover (e.g., on death) or use a Physical Animation Component to blend physics with animation for partial effects (e.g., arm flopping during hit reactions), adjustable via blend weights.
- Physics-driven animation: Apply Physical Animation to make bones follow target animations with spring-like dynamics, ideal for secondary motion (e.g., heavy armor pieces swaying) or dynamic hit reactions. Adjust strength (e.g., 500 for tight tracking) and damping (e.g., 0.5 for smooth settling) for desired responsiveness.
In practice, for a character with a cape: rig the cape with minimal bones, import to UE5, create a Clothing Data asset, and pin the top (weight 0) while simulating the rest for natural flow. Configure Physics Asset bodies so a ponytail or skirt ragdolls realistically on death. Optimize by limiting active simulations and using LODs to disable physics at distance. In summary, configure the Physics Asset for ragdoll, use Cloth Paint for cloth, apply AnimDynamics or Groom physics for other dynamic elements, and test in-game for seamless, clip-free motion that enhances character realism.
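The waist-to-hem gradient described above is just a clamped linear interpolation. A small sketch with hypothetical vertex heights (in UE5 you paint these values in the Cloth Paint tool rather than computing them, but this is the curve you are painting):

```python
# Sketch of a skirt's cloth max-distance weights: 0 at the waistband
# (pinned to the rig), 1 at the hem (fully simulated), interpolated by
# vertex height. Heights are hypothetical example values in centimeters.

def cloth_weight(vertex_z, waist_z, hem_z):
    """0 = pinned to the rig, 1 = fully simulated."""
    if waist_z == hem_z:
        return 0.0
    t = (waist_z - vertex_z) / (waist_z - hem_z)  # waist sits above the hem
    return max(0.0, min(1.0, t))

# Waist at z = 100 cm, hem at z = 60 cm:
weights = [cloth_weight(z, 100.0, 60.0) for z in (100.0, 80.0, 60.0)]
# 0.0 at the waistband, 0.5 mid-skirt, 1.0 at the hem
```

A partial value like 0.5 mid-skirt is what gives semi-rigid sections their restrained sway instead of free flapping.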

Can I make stylized characters for Unreal Engine 5?
Absolutely, you can create stylized characters for Unreal Engine 5. UE5 isn’t only for realism; it’s a flexible engine that can handle a wide range of art styles from cel-shaded cartoons to Fortnite/Overwatch-style semi-realism, to anime or any unique style you envision. The process of making a stylized character is fundamentally the same as a realistic one, with differences mostly in artistic approach and materials:
- Modeling Stylized Characters: Stylized characters often have exaggerated proportions (e.g., large heads, small bodies), simplified anatomy, or unique, non-realistic shapes. Model or sculpt these in tools like Blender or ZBrush to align with your concept art. UE5 renders chibi, cartoon, or abstract characters as effectively as photorealistic ones. Focus on fewer, smoother curves and minimal micro-detail (e.g., avoiding pores or wrinkles) to maintain a cohesive stylized aesthetic, which can reduce poly counts due to simpler, cleaner forms suitable for games or animation.
- Texturing and Materials: Stylized characters typically use flat colors, hand-painted textures, or simplified material setups instead of realistic PBR skin shaders. Create stylized looks in UE5 by:
- Using unlit shading (Material Shading Model “Unlit”) for pure flat colors unaffected by lighting, or building custom lighting models with node networks to simulate toon shading with controlled light falloff.
- Implementing cel-shading by calculating dot(Normal, LightVector) and thresholding it to create hard shading bands, or using post-process materials to render black outlines by detecting edges via scene depth and normal buffers.
- Utilizing marketplace assets or plugins for pre-built toon shading solutions to streamline development if custom shader creation isn’t preferred.
- Applying hand-painted PBR textures with vibrant, stylized details, minimal normal map usage, and non-physical material properties (e.g., matte finishes, colored specular) for looks like Fortnite’s vibrant surfaces or Breath of the Wild’s toon aesthetic, while still fitting within UE5’s standard PBR pipeline.
- Animations for Stylized Characters: Design snappier, exaggerated animations to emphasize the stylized aesthetic (e.g., bouncy walks or over-the-top reactions). Disable or stylize motion blur in UE5 for a crisp, cartoon-like feel, avoiding realistic blur. Incorporate advanced cartoon techniques like smear frames or stretch effects by scaling bones or using morph targets on specific frames, supported by UE5’s animation system, to mimic classic 2D animation dynamics (e.g., squash-and-stretch).
- Outline Rendering: Achieve comic or anime-style outlines by using a post-process material that detects edges via scene depth and normals, rendering black contours around characters. Alternatively, duplicate the character mesh, scale it slightly outward, invert its normals, and apply a solid black material to create a bold outline effect, customizable for thickness and style.
- Performance: Stylized characters are often performance-friendly, bypassing resource-intensive features like subsurface scattering, high poly counts, or complex shaders. Use unlit materials for minimal render cost, ideal for mobile or low-spec platforms. Be cautious with heavy post-processing (e.g., outline effects or bloom), which can add overhead, but simpler shading typically allows for efficient rendering in large scenes.
- Case Study Examples: Fortnite, built on UE5, uses vibrant, flat-shaded characters with tuned materials and occasional outlines for cosmetics, relying on skeletal meshes with stylized art direction. Genshin Impact’s anime-style (achievable in UE5) employs cel-shading and vibrant textures, demonstrating how UE5’s material flexibility supports diverse, non-photorealistic looks with custom shaders.
- Lighting for Stylized: Opt for simpler lighting setups to complement the art style, such as a single directional light with high ambient terms, baked lighting into textures, or non-physical colored lights for artistic tinting. Disable eye adaptation and tonemapping in UE5 for consistent toon shading, and use ambient cubemaps or custom tonemapper settings to achieve stylized color grading and contrast tailored to your vision.
- Niagara VFX: Enhance stylized characters with cartoon-inspired visual effects, such as “pow” impact bursts or stylized hit sparks, using UE5’s Niagara system. Create 2D animated sprite sheets within Niagara to emulate comic-book effects, ensuring VFX align with the character’s aesthetic for a cohesive stylized presentation.
In summary, UE5 enables stylized characters through customized asset creation and material setups, following the same import, skeleton, and animation pipeline as realistic characters. Optimization techniques like LODs remain relevant, and stylized designs often optimize naturally due to simpler materials. Craft your character art (cartoon, low-poly, etc.), then use UE5’s flexible rendering to create looks from Borderlands to claymation, dialing realism up or down as needed.
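The cel-shading threshold described above boils down to quantizing dot(N, L) into hard bands. In UE5 this lives in a material or post-process graph, but the math in plain Python looks like this (band count and vectors are illustrative):

```python
import math

# Sketch of toon shading: compute the Lambert term dot(N, L), then snap
# it to a small number of hard bands instead of a smooth gradient.
# In UE5 the same logic is built as material graph nodes.

def lambert(normal, light_dir):
    """Dot product of unit surface normal and unit light direction."""
    return sum(n * l for n, l in zip(normal, light_dir))

def cel_shade(n_dot_l, bands=3):
    """Quantize a 0-1 lighting term into hard toon bands."""
    clamped = max(0.0, min(1.0, n_dot_l))
    # floor to the nearest band while keeping full brightness reachable
    return min(bands - 1, math.floor(clamped * bands)) / (bands - 1)

bright = cel_shade(lambert((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # facing light -> 1.0
mid = cel_shade(0.5)   # middle band -> 0.5
dark = cel_shade(0.2)  # grazing angle snaps to the darkest band -> 0.0
```

The hard jumps between bands are exactly what reads as ink-and-paint shading; adding more bands softens the look back toward standard Lambert lighting.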

What are the common mistakes when making your own character for UE5?
When creating a custom character, especially if you’re newer to the process, there are several common pitfalls to watch out for. Avoiding these mistakes will save time and ensure your character works well in Unreal Engine 5. Here are some frequent issues and how to avoid them:
- Incorrect Scale: Mismatched units between your DCC and UE5 (e.g., Blender’s meters vs. Unreal’s centimeters) cause characters to import too small or large, disrupting physics, camera, and gameplay systems. Set Blender’s Unit Scale to 0.01 (1 unit = 1 cm) or adjust FBX export scaling to match UE’s standard. Compare your character to UE’s 180-unit-tall mannequin to ensure proper sizing and avoid issues in-game.
- Not Applying Transforms: Failing to apply rotations, scales, or locations in Blender leads to skewed or misoriented characters in UE5, causing animation or physics errors. Apply transforms (Ctrl+A > Location, Rotation, Scale) to lock in the model’s state, ensuring -Y forward aligns with UE’s +X axis. This prevents issues like rotated bones or distorted meshes during import.
- Excessive Poly Count / No LODs: High-poly models (e.g., 200k+ triangles) without Levels of Detail (LODs) degrade performance, especially in scenes with multiple characters. Retopologize to 30k-60k triangles for main characters and create LODs using UE5’s tools or DCC decimation. LODs reduce rendering costs at distance, ensuring scalability for real-time rendering in UE5.
- Too Many Materials: Using multiple material slots (e.g., separate materials for skin, clothing, hair) increases draw calls, slowing rendering performance. Merge UVs into a single texture atlas for one material, limiting to 2-3 materials maximum. Assign placeholder materials in your DCC to avoid Unreal assigning default materials, streamlining the import process.
- Bad Weight Painting: Poorly painted weights cause deformation issues, such as collapsing joints, stretching skin, or unnatural bends during animation. Test bone rotations in your DCC to identify problem areas, normalizing weights and adding geometry at joints (e.g., elbows, knees) for smooth deformations. Proper weights prevent visual artifacts in UE5 animations.
- Ignoring Engine Constraints: Exceeding UE5’s limits, like using more than 4 bone influences per vertex or bone names longer than 30-40 characters, causes import errors or performance issues. Limit to 4 influences in your DCC’s skinning settings and use concise bone names. This ensures compatibility with UE5’s skinning pipeline and animation systems.
- Not Using a Root Bone: Omitting a dedicated root bone at the origin (0,0,0), parented above the pelvis, complicates root motion, physics, and component attachments in UE5. Include a single root bone as the parent of all other bones in your skeleton hierarchy. Some auto-riggers include one (e.g., Rigify’s game-ready exports), but others, such as Mixamo, root the skeleton at the hips, so you may need to add a root bone yourself.
- Poor UV Layout: Overlapping UVs, stretched textures, or inefficient layouts cause visual artifacts like blurry or misaligned textures in UE5. Create non-overlapping UVs in the 0-1 space, prioritizing resolution for key areas (e.g., face, hands). Use tools like Blender’s UV Editor or Maya’s UV Toolkit, and consider UDIMs for complex characters to maintain texture quality.
- Overlooking Collisions: Missing collision components cause characters to fall through floors or fail to interact with environments. Use a CapsuleComponent in UE5’s Character Blueprint for basic collisions, or configure a Physics Asset for detailed ragdoll collisions. Test collisions early in a level to ensure proper environmental interactions, leveraging UE5’s default capsule for simplicity.
- Neglecting Testing in Engine: Delaying in-engine testing hides issues like sliding feet, clipping geometry, or incorrect scale until late in development. Import prototype assets into UE5 early to test animations, rigging, and proportions. Early testing identifies problems like weight errors or topology issues, saving time and ensuring a polished character.
- High-Resolution Textures Unoptimized: Using uncompressed or oversized textures (e.g., 8K or BMP formats) bloats VRAM usage, slowing performance. Use 2K or 4K textures with BC1 compression for color maps and BC5 for normals, packing RMA maps (Roughness, Metallic, AO) into one texture. Export as PNG or TGA and generate mipmaps in UE5 to optimize for target platforms.
- Importing Unnecessary Blendshapes: Including unused morph targets (e.g., test shapes or redundant expressions) increases asset size and processing costs. Export only essential morphs for animation or customization, cleaning up Shape Keys in Blender or blendshapes in Maya. This reduces memory overhead and improves performance in UE5’s rendering pipeline.
- Topology Issues: Using N-gons, thin triangles, or non-manifold geometry causes shading errors, rendering artifacts, or deformation issues in UE5. Model with clean quads or triangulated meshes, recalculating normals in your DCC to ensure consistency. Add edge loops at joints and fix symmetry issues to avoid seams or visual glitches during animation.
- Not Following Epic’s Skeleton: Custom rigs that deviate significantly from UE’s mannequin skeleton (e.g., different bone names or hierarchy) complicate retargeting marketplace animations. Match bone names and hierarchy to the mannequin in your DCC, or use UE5’s IK Retargeter for mapping. This simplifies animation reuse and reduces retargeting errors in UE5 workflows.
- Performance Oversights: Unoptimized cloth simulations, excessive bone counts, or heavy Blueprint tick logic slow performance, especially in scenes with multiple characters. Disable cloth or physics at distance via LODs, limit skeletons to ~100 bones, and optimize Blueprint ticks with event-driven logic. Profile with UE5’s stat tools to ensure smooth gameplay on target hardware.
Most common mistakes arise from scale mismatches, rigging oversights, or lack of optimization. Regular in-engine testing, clean DCC setups, and adherence to UE5’s constraints prevent these issues, ensuring a functional, high-quality character ready for gameplay or cinematics.
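Several of these pitfalls can be caught with a quick pre-export sanity check. A sketch in plain Python (the data shapes are hypothetical, and the 4-influence limit is the common guideline cited above, not a universal engine hard cap):

```python
# Sketch of a pre-export sanity check for a character rig: root bone at
# the origin, at most 4 bone influences per vertex, and a height in the
# neighborhood of UE5's 180 cm mannequin. Input shapes are hypothetical.

def validate_character(root_bone_pos, influences_per_vertex, height_cm):
    errors = []
    if any(abs(c) > 1e-4 for c in root_bone_pos):
        errors.append("root bone is not at the origin (0,0,0)")
    worst = max(influences_per_vertex)
    if worst > 4:
        errors.append(f"{worst} bone influences on a vertex (guideline max is 4)")
    if not 100 <= height_cm <= 250:
        errors.append(f"height {height_cm} cm is far from the 180 cm mannequin")
    return errors

# A 5-influence vertex and a height of 1.8 "cm" (a meters/centimeters
# mix-up) both get flagged before the FBX ever leaves the DCC:
issues = validate_character((0.0, 0.0, 0.0), [4, 3, 5], 1.8)
```

Running a check like this as an export script catches scale and skinning mistakes minutes before import rather than hours after.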

Where can I find tutorials for creating custom characters for Unreal Engine 5?
There are many resources available online to learn character creation for UE5. Here are several reliable places to find tutorials and guidance, ranging from official documentation to community-driven content:
- Unreal Engine Official Documentation and Tutorials: Epic Games’ official documentation, accessible via the Epic Developer Community, provides comprehensive guides on skeletal mesh creation, rigging, animation workflows, and MetaHuman integration for UE5. The Unreal Learning portal offers video courses and written tutorials covering character pipelines, including FBX imports, retargeting, and material setup. These resources are regularly updated for UE5, ensuring accurate, authoritative guidance for beginners and professionals.
- Blender to Unreal Tutorials: YouTube channels like Unreal Sensei, CG Geek, and Grant Abbitt offer detailed Blender-to-UE5 workflows, covering modeling, texturing, rigging, and FBX export settings. Epic’s YouTube channel includes official tutorials on addons like “Send to Unreal,” which automates asset transfers with correct scale and orientation. Search “Blender to Unreal character pipeline” for community-driven videos, troubleshooting tips, and step-by-step guides tailored to common character creation challenges.
- Autodesk (Maya/Max) to Unreal: Epic’s ARTv1 toolkit, available on the Unreal Marketplace, includes Maya-specific rigging and export workflows, supplemented by GDC talks on YouTube detailing professional pipelines. Autodesk’s AREA blog provides tutorials on preparing Maya or 3ds Max assets for UE5, focusing on FBX settings and animation integration. Search “Maya to Unreal character tutorial” for artist-created guides, and explore Live Link documentation for real-time animation workflows.
- Community Forums and Q&A: Unreal Engine’s official forums and Answers hub host active discussions on character creation, with threads addressing issues like import errors, rigging challenges, or material setups. Search for specific topics (e.g., “UE5 skeletal mesh import scale”) or browse the tutorial section for user-submitted projects. These platforms offer practical, community-driven solutions, often with example assets or workflows for UE5 character pipelines.
- YouTube Channels & Playlists:
- Virtus Learning Hub and Ryan Laley: These channels provide extensive UE5 character tutorials, covering modeling, rigging, texturing, animation, and optimization. Their beginner-friendly videos include practical examples, like setting up Animation Blueprints or retargeting marketplace animations, with regular updates for UE5’s evolving features.
- Mixamo to Unreal tutorials: YouTube hosts numerous guides on using Mixamo for rapid rigging and animation imports into UE5. These tutorials detail auto-rigging, FBX exports, and retargeting Mixamo animations to custom skeletons, ideal for prototyping or small projects needing quick motion setups.
- Reallusion tutorials: Reallusion’s YouTube channel and online Magazine offer tutorials on Character Creator (CC) for UE5, focusing on the Auto Setup plugin, morph exports, and material integration. These resources guide artists through CC’s automated pipeline, providing practical tips for efficient character creation and Unreal compatibility.
- 80 Level Articles: 80.lv publishes in-depth artist interviews and workflow breakdowns on UE5 character creation, such as “Building Characters with Character Creator for Unreal.” These articles explore professional techniques, tool integrations (e.g., ZBrush, Substance), and optimization strategies. While not step-by-step, they provide high-level insights and inspiration for advanced UE5 character pipelines.
- Online Courses and Platforms: Platforms like Udemy, Yelzkizi, GameDev.tv, and Pluralsight offer paid courses on UE5 character creation, covering modeling, rigging, texturing, and animation workflows. Ensure courses are UE5-specific, as core principles (e.g., FBX exports) apply across versions. Free YouTube “full courses” by creators like CG Fast Track provide structured alternatives, breaking down character pipelines into manageable lessons for all skill levels.
- Epic Marketplace Samples: Unreal’s Action RPG and Lyra Starter Game, available in the Marketplace, include sample characters with documented setups for rigging, Animation Blueprints, and physics. MetaHuman samples showcase advanced facial rigs and morph-driven animations. Study these projects’ Blueprints and assets for practical insights, using their documentation to understand UE5’s character systems.
- Polycount and 3D Art Communities: Polycount’s forums and wiki offer extensive resources on game character modeling, with threads discussing Unreal-specific rigging, optimization, and texturing. Search for “Unreal character rig” or “UE5 skeletal mesh tips” for detailed advice. Weta’s exp-points.com provides style-specific guides (e.g., stylized vs. realistic), offering engine-agnostic techniques applicable to UE5 workflows.
- Stack Exchange: Blender and Unreal StackExchange sites provide targeted answers to character creation questions, from export settings to rigging errors or material setups. Post specific queries or search existing threads for solutions, leveraging these communities’ technical expertise. Answers are concise and often include code snippets or workflow tips for UE5.
- Official MetaHuman Documentation/Tutorials: Epic’s MetaHuman documentation, accessible via Unreal’s website, details the creation, customization, and animation of MetaHumans in UE5. YouTube tutorials and livestreams explore advanced workflows, like Mesh to MetaHuman or Live Link Face integration. These are essential for artists using MetaHumans as a base or integrating custom assets with MetaHuman rigs.
- Blogs and Personal Websites: Indie developer blogs or Medium posts chronicle UE5 character creation journeys, often as multi-part series detailing modeling, rigging, and import challenges. Search for “UE5 character creation workflow” to find narrative-driven guides with practical tips, such as optimizing polycounts or troubleshooting imports. These offer unique perspectives and real-world problem-solving.
Leverage a mix of official UE5 tutorials, community YouTube videos, and forums to master character creation, combining structured courses with practical, hands-on resources to address both foundational and advanced techniques.

Conclusion
Creating your own character for Unreal Engine 5 is a challenging but rewarding journey. We started by breaking down the step-by-step workflow – from initial modeling and texturing through rigging, animation, and finally import into UE5 – emphasizing the importance of planning and using the right tools for each stage. Whether you choose a completely manual pipeline (sculpting in ZBrush or modeling in Blender) or leverage tools like MetaHuman Creator, Reallusion Character Creator, or Daz 3D, the goal remains the same: to bring a fully realized, moving character into Unreal Engine.
Throughout this guide, we addressed both realistic and stylized characters. For realistic characters, we highlighted workflows like MetaHuman that can save time and provide AAA-quality rigs and materials. For stylized characters, we noted that Unreal’s flexibility with materials and lighting allows you to achieve any look – you’re not limited to realism, and many successful UE5 games use highly stylized art styles. The fundamentals of import, skeleton setup, and animation apply regardless of style.
We also covered the nuts and bolts of making your character game-ready: optimizing polygons and textures, setting up LODs, and ensuring good performance through careful material and physics setup. We looked at how to give your characters life with facial blendshapes, physics simulations for cloth and hair (using Chaos physics), and how to retarget animations so you can reuse the vast library of motions available rather than animating everything from scratch. These techniques bridge the gap from a static model to a dynamic in-game character.
FAQ
- How do I create a custom 3D character workflow for UE5?
Start with concept art or reference images, then sculpt or model your high-poly mesh in a DCC tool (Blender, ZBrush, Maya). Retopologize to a low-poly mesh, unwrap UVs, and bake detail maps (normals, AO). Paint PBR textures (Base Color, Roughness, Metallic) in Substance Painter or Blender. Rig your low-poly mesh with a skeleton (Mixamo, Rigify, Maya ART), test deformations with basic animations, export everything as FBX, and import into UE5. Finally, set up materials, physics assets, and test animations in-engine.
- Which software tools are best for character creation before importing to UE5?
- Modeling/Sculpting: Blender (free), ZBrush (for high-poly detail), Maya/3ds Max (industry standard).
- Retopology & UVs: Blender’s Shrinkwrap & RetopoFlow, Maya’s Quad Draw, TopoGun.
- Texturing: Substance 3D Painter, Quixel Mixer, Blender Texture Paint.
- Rigging/Animation: Blender Rigify, Maya HumanIK/ART, Mixamo or AccuRIG for auto-rig, and Epic’s Control Rig in UE5.
- Can I use Blender for the entire character pipeline?
Yes. Blender can handle modeling, sculpting, retopology, UV unwrapping, baking, texture painting, rigging (with Rigify), and animation. Use Epic’s “Send to Unreal” or a free Blender-to-Unreal addon to export FBX with correct scale (1 unit = 1 cm) and orientation (–Z forward, Y up), ensuring seamless UE5 integration.
- What’s the correct way to export a Blender character to UE5?
Apply all transforms (Ctrl + A), set unit scale to centimeters, orient your mesh to face –Y, then export selected mesh and armature as FBX with “Apply Scalings: FBX Unit Scale,” “–Z Forward, Y Up,” disabled leaf bones, and enabled Shape Keys if using morphs. In UE5’s import dialog, enable Skeletal Mesh, import morph targets, and assign or create a new skeleton.
- How do I rig and animate a custom character for UE5?
In your DCC, build a bone hierarchy with a root bone, bind the mesh with weight painting, and test deformations. Animate in-DCC or use Mixamo/Maya animations, exporting them as FBX. In UE5, import animations directly or retarget them from the Unreal mannequin to your custom skeleton using the IK Retargeter, generating new animation assets that play seamlessly on your character.
- How do I texture my character for proper PBR materials in UE5?
After baking normal and AO maps, paint Base Color, Roughness, Metallic, and optionally Ambient Occlusion in Substance Painter or Blender. Export textures (PNG/TGA) at appropriate resolutions (2K–4K for heroes, 1K for NPCs). In UE5, create a Material, plug each map into its respective node, enable subsurface scattering for skin if needed, and adjust material instance parameters for fine-tuning.
- Is MetaHuman Creator a faster option for realistic human characters?
Absolutely. MetaHuman Creator is a cloud-based tool that generates fully rigged, photorealistic humans with eight LODs, facial blendshapes, and groom hair in minutes. Export via Quixel Bridge directly into UE5, then drive animations with Live Link Face or retarget mannequin animations, skipping weeks of manual modeling, texturing, and rigging.
- How do I optimize my character for real-time performance in UE5?
- Use 30k–60k triangles for hero characters and create LODs to reduce polycount at distance.
- Limit materials to 1–3 slots by atlasing UVs and packing Roughness/Metallic/AO into a single texture.
- Compress textures (BC1 for color, BC5 for normals) and generate mipmaps.
- Keep the skeleton to ~100 bones, limit skinning to 4 bone influences per vertex, and enable GPU skinning.
- Cull off-screen physics and animations, and profile with UE5’s stat tools to catch bottlenecks.
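To see why the texture compression advice above matters, it helps to put numbers on it. The sketch below estimates GPU memory for a square texture, assuming the standard block-compression rates (BC1 = 0.5 bytes/pixel, BC5 = 1 byte/pixel, uncompressed RGBA8 = 4 bytes/pixel) and the usual ~1/3 overhead for a full mip chain; `texture_mib` is an illustrative helper, not a UE5 function.

```python
# Bytes per pixel for common character texture formats.
BYTES_PER_PIXEL = {"RGBA8": 4.0, "BC1": 0.5, "BC5": 1.0}

def texture_mib(size, fmt, mips=True):
    """Approximate GPU memory (MiB) for a size x size texture."""
    total = size * size * BYTES_PER_PIXEL[fmt]
    if mips:
        total *= 4 / 3  # full mip chain: 1 + 1/4 + 1/16 + ... = 4/3
    return total / (1024 ** 2)

# A 4K base color map, compressed vs. uncompressed:
print(round(texture_mib(4096, "BC1"), 1))    # ~10.7 MiB
print(round(texture_mib(4096, "RGBA8"), 1))  # ~85.3 MiB
```

An 8x saving per 4K map is why UE5 defaults to BC compression; multiply that across Base Color, Normal, and packed ORM maps for every character on screen and it decides whether you fit in your memory budget.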
- What’s the process for adding cloth and hair simulation?
Import your cloth mesh with proper weight-painted anchors (waist or shoulders), then use the Cloth Paint tool in the Skeletal Mesh editor to create a Clothing Data asset. Paint simulation weights, configure Chaos Cloth settings (stiffness, damping, collision), and test. For hair, export Blender particle hair as Alembic groom, import it via the Groom plugin, assign a hair material, and enable Niagara/Chaos physics for realistic motion.
- How do I retarget existing animations onto my custom character?
Create IK Rig assets for both the source (e.g., Unreal mannequin) and target skeleton. In an IK Retargeter asset, map corresponding bone chains between rigs, adjust root motion and scale options, then preview animations. Select source animations and click “Export Retargeted Animations” to generate new UE5 Animation Sequences bound to your character’s skeleton, ready to use in Animation Blueprints.
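Conceptually, what the IK Retargeter asset stores is a mapping between named bone chains on the two rigs. The sketch below illustrates that idea in plain Python; the chain and bone names are made up for illustration, and this is not Epic's actual asset format.

```python
# Bone chains on the source rig (e.g. the UE mannequin).
SOURCE_CHAINS = {
    "spine": ["spine_01", "spine_02", "spine_03"],
    "arm_l": ["upperarm_l", "lowerarm_l", "hand_l"],
    "leg_l": ["thigh_l", "calf_l", "foot_l"],
}
# Chains on a hypothetical custom rig with different bone names.
TARGET_CHAINS = {
    "spine": ["chest"],  # chains may differ in bone count
    "arm_l": ["arm_upper.L", "arm_lower.L", "hand.L"],
    "leg_l": ["leg_upper.L", "leg_lower.L", "foot.L"],
}

def build_mapping(source, target):
    """Pair chains by name. The retargeter interpolates motion along each
    chain, so bone counts may differ, but every source chain needs a match."""
    missing = sorted(set(source) - set(target))
    if missing:
        raise ValueError(f"unmapped source chains: {missing}")
    return {name: (source[name], target[name]) for name in source}

mapping = build_mapping(SOURCE_CHAINS, TARGET_CHAINS)
print(len(mapping))  # 3 mapped chains
```

This is why matching chain names (not individual bone names) is the key step in the IK Retargeter UI: once every source chain has a target, the tool can transfer motion even between skeletons with very different proportions and hierarchies.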
