Photogrammetry, facial scanning, and Unreal Engine’s MetaHuman tools allow you to create a realistic digital avatar by capturing a 3D head model, converting it into a rigged character, adding custom hair, and animating it with motion capture. This guide details the process using accessible tools like cameras, smartphones, RealityCapture, and PixelHair. Precision in each step ensures an accurate likeness. The final MetaHuman can be used in films, games, or virtual experiences.
Photogrammetry
Photogrammetry captures a 3D model from multiple photos, ideal for detailed facial scans. Overlapping photos with consistent lighting and a neutral expression are key. Software like RealityCapture creates a textured mesh from these images. Cleanup in Blender ensures a usable model for MetaHuman conversion.
Here are key steps for success:
- Equipment and Software: Use a smartphone or DSLR and photogrammetry software like Epic’s RealityCapture for high accuracy, or free alternatives like Meshroom. Epic’s RealityScan mobile app allows phone-based scanning.
- Photography Tips:
  - Take dozens of overlapping photos (front, sides, 45-degree angles, above, below) so that each facial feature appears in multiple images.
  - Use consistent, soft lighting (e.g., overcast sky, softbox) to avoid harsh shadows or bright spots.
  - Maintain a neutral expression and stay still; ideally, have someone else photograph you, or use a turntable and timer.
  - Use high-resolution camera settings, keep the face in focus, and frame the head closely for detail.
- Processing: Feed photos into photogrammetry software to generate a dense 3D point cloud and textured mesh. A powerful PC with a good GPU speeds up this compute-intensive process.
- Cleanup: Use Blender or ZBrush to refine the high-poly mesh, filling holes (e.g., top of head, under chin), smoothing noisy areas (like facial hair), and removing background elements. Aim for a watertight mesh of the head plus some neck and shoulders; a scripted cleanup sketch follows this list.
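The repetitive parts of cleanup can be scripted with Blender’s Python API. Below is a minimal sketch, assuming the scan object is named "HeadScan" (a hypothetical name); it fills holes, decimates the mesh toward a manageable triangle count, and smooths shading:

```python
import bpy

# Hedged sketch: batch-clean a photogrammetry head scan in Blender.
# The object name "HeadScan" is hypothetical; use your scan's name.
scan = bpy.data.objects["HeadScan"]
bpy.context.view_layer.objects.active = scan
scan.select_set(True)

# Fill holes of any size (top of head, under chin, etc.).
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.fill_holes(sides=0)  # sides=0 removes the hole-size limit
bpy.ops.object.mode_set(mode='OBJECT')

# Decimate a multi-million-triangle scan toward a few hundred thousand.
dec = scan.modifiers.new(name="Decimate", type='DECIMATE')
dec.ratio = 0.1  # tune per scan; check the resulting count in the viewport
bpy.ops.object.modifier_apply(modifier=dec.name)

bpy.ops.object.shade_smooth()
```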

Pro Tip: Hair is notoriously difficult to capture with photogrammetry because it lacks distinct features and can appear as a fuzzy cloud in the scan. Don’t worry about getting a perfect scan of your hair – focus on the face and head shape. You can handle hair separately later (we’ll cover that in the PixelHair section). If you have long hair, tie it back, and if you have very shiny or reflective accessories (glasses, earrings), remove them for the photo shoot to avoid artifacts in the scan.
Facial Scanning
Facial scanning refers to using specialized 3D scanning devices or depth sensors to capture your face. This can be faster or more convenient in some cases. If you have access to an iPhone with FaceID (or newer models with LiDAR), a structured-light scanner, or even a full 3D scanning rig, you can leverage those for capturing your head.
Here are some options:
- Smartphone Apps: Apps like Epic’s RealityScan, Polycam, or Trnio use a phone’s camera to capture a sequence of shots, cloud-processing them into a 3D model via streamlined photogrammetry. Quality is lower than professional rigs but sufficient for MetaHuman workflow with neutral expression and even lighting.
- Structured Light/Laser Scanners: Devices like Artec Eva or Revopoint POP project patterns to capture depth directly, producing precise meshes quickly without needing many photos. Apple’s FaceID TrueDepth sensor (iPhone X and later) enables apps like Bellus3D or Polycam to create detailed depth maps in seconds, though with less texture detail than high-res photogrammetry.
- Professional Rigs: Multi-camera photogrammetry rigs (e.g., 86 DSLRs) capture high detail (pores, wrinkles) instantly but are costly and typically used in VFX studios. A single camera or phone is adequate for MetaHuman creation.
Scan Requirements:
- Neutral Pose: Capture a relaxed, neutral expression with slightly open eyes and closed mouth for MetaHuman’s base pose.
- Proportionate Scale: Ensure the mesh is near real-life scale (a head is roughly 20-25 cm tall); rescale in Blender or Unreal if needed (a quick check script follows below).
- Mesh Simplicity: Remove extraneous elements (e.g., background, stand) and ensure a single, connected head mesh, excluding separate eyeball meshes.
The goal is a 3D head model with accurate geometry; texture is useful for reference but secondary. This model is ready for the Mesh to MetaHuman pipeline.
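Scale is easy to verify and fix from script. Here is a minimal Blender sketch, assuming scene units are meters and the scan object is named "HeadScan" (hypothetical); it measures the bounding-box height and rescales to a target of 0.23 m:

```python
import bpy

# Hedged sketch: check and correct the scan's real-world scale in Blender.
# Assumes scene units are meters; "HeadScan" is a hypothetical object name.
scan = bpy.data.objects["HeadScan"]
bpy.context.view_layer.objects.active = scan
scan.select_set(True)

height = scan.dimensions.z  # bounding-box height in scene units
target = 0.23               # ~23 cm, a typical adult head height
scan.scale = scan.scale * (target / height)

# Bake the new scale into the mesh so exports carry the correct size.
bpy.ops.object.transform_apply(location=False, rotation=False, scale=True)
print(f"New height: {scan.dimensions.z:.3f} m")
```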
Mesh to MetaHuman
Mesh to MetaHuman transforms a scanned 3D head into a rigged MetaHuman using Unreal Engine 5’s plugin. The mesh is imported, aligned with a template, and processed via cloud services for rigging. The output is a customizable character editable in MetaHuman Creator. This ensures realistic facial controls and body integration.
Here’s how it works:

- Prepare the Mesh: Export the head mesh as FBX or OBJ at a reasonable polygon count (decimate multi-million-polygon scans down to a few hundred thousand triangles). Include the texture if available as an alignment reference.
- Import into Unreal Engine 5: Enable the MetaHuman Plugin and import the mesh as a Static Mesh into a UE5 project’s Content Browser (a scripted import sketch follows this list).
- Create a MetaHuman Identity Asset: Right-click in Content Browser to create a MetaHuman Identity asset, add the imported mesh via “Add Components > From Mesh” in the Identity editor.
- Choose a Body and Set Neutral Pose: Select a body type/height in the Identity asset. Align a template mesh to the scan’s neutral expression in the viewport, adjusting wireframe overlay to match jaw, eyes, and mouth, then capture the pose with the “Promote Frame” button.
- Run the Identity Solve: Execute the Solve function to fit the MetaHuman template to your scan’s shape via cloud computation (requires Epic account login and Quixel Bridge sign-in). A conformed MetaHuman mesh appears, matching your face.
- Submit to MetaHuman Backend: Use the Mesh to MetaHuman option to upload the solved face for auto-rigging, naming the MetaHuman. The cloud grafts the head onto the chosen body, creating a complete character.
- Refine in MetaHuman Creator: The new MetaHuman appears in the online MetaHuman Creator. Adjust skin tone, eye size, ears, or other features using sliders, ensuring physical plausibility. Add eye color, eyebrows, and hairstyle (or plan for custom hair).
- Download to UE5: Use Quixel Bridge to download the customized MetaHuman into the UE5 project, adding all assets (face/body meshes, materials, hair, eyes) to the Content folder.
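The import step can also be scripted with Unreal’s editor Python API (with the Python Editor Script Plugin enabled). A minimal sketch; the file path and destination folder are hypothetical:

```python
import unreal

# Hedged sketch: import the cleaned head scan as a Static Mesh asset.
# The path and destination folder are hypothetical; adjust to your project.
task = unreal.AssetImportTask()
task.filename = "C:/scans/head_scan_clean.fbx"
task.destination_path = "/Game/Scans"
task.automated = True  # skip the interactive import dialog
task.save = True

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```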
The process, from mesh import to download, can take under an hour with a good scan. Tips:
- Facial Feature Alignment: Correct misaligned eyes/nose/mouth in MetaHuman Creator if the neutral pose was off.
- Texture Transfer: MetaHuman uses its own skin texture, not the scan’s; manually edit the texture maps or use decals for freckles/marks (see the sketch after this list).
- Organization: The Identity asset’s name becomes the MetaHuman’s; manage multiple iterations clearly.
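For the texture route, simple image compositing goes a long way. Below is a hedged sketch using Pillow to stamp a scan-derived overlay (freckles, moles, a tattoo) onto the MetaHuman base color map; the file names are hypothetical, and the overlay must already be painted to match the MetaHuman face UV layout:

```python
from PIL import Image

# Hedged sketch: composite scan-derived skin marks onto a MetaHuman
# base color texture. File names are hypothetical; the overlay is a
# transparent PNG aligned to the MetaHuman face UV layout.
base = Image.open("T_Head_BaseColor.png").convert("RGBA")
marks = Image.open("skin_marks_overlay.png").convert("RGBA")

base.alpha_composite(marks.resize(base.size))  # paste marks in place
base.convert("RGB").save("T_Head_BaseColor_custom.png")
```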
The result is a fully rigged MetaHuman with your face, ready for hair customization and animation.
PixelHair
Recreating realistic hair for a MetaHuman digital double is challenging, as MetaHuman Creator’s preset hairstyles may not match unique or culturally specific hair types. PixelHair, a collection of high-quality 3D hair assets for Blender and Unreal Engine, offers diverse, realistic hairstyles (e.g., straight, wavy, braids, afros) to closely resemble your hair. Built on Blender’s particle hair system, PixelHair uses strand curves or hair cards with a hair cap mesh (~18k polygons) that shrink-wraps to fit any head shape, ensuring a snug fit on MetaHumans; a shrink-wrap sketch follows below.
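In Blender terms, that fit is a Shrinkwrap modifier on the hair cap. A minimal sketch, assuming objects named "HairCap" and "HeadMesh" (both hypothetical):

```python
import bpy

# Hedged sketch: fit a hair cap to a head mesh with a Shrinkwrap modifier.
# Object names are hypothetical; use your own cap and head meshes.
cap = bpy.data.objects["HairCap"]
head = bpy.data.objects["HeadMesh"]

mod = cap.modifiers.new(name="FitToHead", type='SHRINKWRAP')
mod.target = head
mod.wrap_method = 'NEAREST_SURFACEPOINT'
mod.offset = 0.002  # small gap so strands sit just above the scalp
```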
Workflow:
- Select/Create Hairstyle: Choose a PixelHair style from asset packs (e.g., on BlenderMarket, FlippedNormals) matching your hair, or tweak it in Blender (comb, cut digitally).
- Export to Unreal: Export as an Alembic (.abc) groom or as hair cards with materials, importable into Unreal Engine 5’s Groom system or as meshes (an export sketch follows this list).
- Attach to MetaHuman: Replace default MetaHuman hair by adding the PixelHair groom to the Head attachment point using a Groom Component or Binding asset for proper scalp movement.
- Adjust Materials/Physics: Apply PixelHair’s realistic hair shaders (shading, scatter, specular) and enable physics for movement if needed.
- Fine-Tune: Adjust hairline/position for natural alignment, leveraging customizable strand density, length, or shape.
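The Alembic export can be run from script as well as from Blender’s export menu. A minimal sketch, assuming the hair object is selected and its particle hair should be written out as curves (the file path is hypothetical):

```python
import bpy

# Hedged sketch: export selected particle-hair objects as an Alembic groom.
# The file path is hypothetical; export_hair writes hair systems as curves.
bpy.ops.wm.alembic_export(
    filepath="C:/exports/pixelhair_groom.abc",
    selected=True,
    export_hair=True,
)
```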
PixelHair enhances realism over MetaHuman’s card-based hair, offering reusable, professional-grade hair assets. Alternatives like Unreal’s Groom Editor or Maya XGen exist, but PixelHair’s ready-made styles are convenient.
PixelHair provides realistic, customizable hairstyles for MetaHumans, enabling a personalized look with diverse presets, significantly improving your digital double’s realism.

Motion Capture Tools
Motion capture allows you to record real human movements and translate them onto your MetaHuman character, making it perform those actions with realistic timing and nuance. Unreal Engine 5 and the MetaHuman framework are designed to work seamlessly with mocap data. Here are the main motion capture components you might use for a personal MetaHuman:
Facial Motion Capture:
- MetaHuman Animator: Uses an iPhone’s TrueDepth camera to record facial expressions and lip-sync, transferring nuanced performances to MetaHumans. Record a video of yourself acting/speaking, and MetaHuman Animator (with Live Link Face) processes it into high-fidelity keyframe animation, compatible with ARKit blendshapes. Indie creators use an iPhone with Live Link Face to stream or record facial data in real time.
- Alternatives: Webcams (e.g., via DroidCam), Faceware, or Dynamixyz, though iPhone-based capture is highly accessible and effective (see the sketch after this list).
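Live Link Face can also record takes on the device, saving the raw ARKit blendshape curves as CSV alongside the video. If you want to experiment with that data outside Unreal, here is a hedged sketch that keyframes the curves onto matching shape keys in Blender; the object and file names are hypothetical, and the mesh needs shape keys named to match the CSV’s blendshape columns (ARKit names such as "JawOpen"):

```python
import csv
import bpy

# Hedged sketch: keyframe ARKit blendshape curves from a Live Link Face
# CSV take onto a Blender mesh's shape keys. Names are hypothetical; only
# columns that match an existing shape key are applied.
mesh = bpy.data.objects["FaceMesh"]
keys = mesh.data.shape_keys.key_blocks

with open("C:/takes/face_take.csv", newline="") as f:
    for frame, row in enumerate(csv.DictReader(f), start=1):
        for name, value in row.items():
            if name in keys:  # skips non-blendshape columns like Timecode
                keys[name].value = float(value)
                keys[name].keyframe_insert("value", frame=frame)
```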
Body Motion Capture:
- Mocap Suits: Inertial suits (e.g., Xsens, Rokoko) with sensors or optical systems track skeletal motion, driving the MetaHuman’s skeleton. Wear a suit (e.g., Rokoko Smartsuit with gloves for fingers), perform actions, and stream data to Unreal via Live Link after calibration (e.g., T-pose) and retargeting.
- Alternatives: AI-based mocap (e.g., Move.ai, Rokoko’s video capture) uses regular video for less precise but viable results.
Animation without Mocap:
- Keyframe Animation: Use Unreal’s Control Rig for manual face/body animation in Sequencer.
- Pre-made Animations: Retarget animations from Unreal Marketplace/Mixamo to MetaHumans using UE5’s IK Retargeter.
- Audio-Driven Lip-Sync: MetaHuman Animator generates facial animation from audio, saving time over keyframing.
Workflow:
- Facial: Use Live Link Face/MetaHuman Animator to stream or record iPhone facial data, refining with Take Recorder for high-quality sequences.
- Body: Stream suit data via plugins/Live Link, retarget to MetaHuman skeleton, and record/edit in Sequencer for live or offline use.
- Full Performance Capture: Combine iPhone facial capture and a mocap suit for face, body, and hand animation, or mix captured face with pre-made body animations.
- Sequencing/Editing: Unreal’s Sequencer allows fine-tuning, blending, or layering animations (e.g., walk cycle with captured head movement).
Unreal supports third-party tools (e.g., Xsens, Manus gloves), and community tutorials guide integration. Mocap saves time for lifelike results, but keyframing or animation libraries are viable alternatives.
Motion capture tools like MetaHuman Animator (iPhone-based) and mocap suits (e.g., Rokoko) enable realistic facial and body animation for MetaHumans, with non-mocap options like keyframing, pre-made animations, or audio-driven lip-sync also available, opening up broad content creation possibilities.

FAQs
- What is a MetaHuman and how do I turn myself into one?
A MetaHuman is Epic Games’ technology for creating highly realistic, fully rigged, customizable digital humans. To create a MetaHuman of yourself, capture your face in 3D using scanning or photogrammetry, then use the MetaHuman Creator (a cloud app) or the Mesh to MetaHuman plugin in Unreal Engine 5 to convert the 3D model into a MetaHuman. Import the scan into Unreal, where the system fits a template to produce a digital character with your likeness, which you can customize with features, hair, and clothing.
- Do I need a special 3D scanner to create a MetaHuman of myself, or can I use my smartphone?
A smartphone is sufficient for creating a MetaHuman; an expensive 3D scanner isn’t required. Use a phone to take photos for photogrammetry, or use scanning apps like Epic’s RealityScan (free on iOS/Android) to generate 3D models. Photogrammetry software (some free or trial) can process multiple photos into a 3D mesh. Dedicated scanners (e.g., structured light or LiDAR) improve quality but aren’t necessary. Good technique matters most: many photos, proper lighting, or a reliable app.
- Can I use MetaHuman Creator without scanning my face (i.e., manually create my likeness)?
Yes, MetaHuman Creator allows manual creation of your likeness without scanning. Start with preset faces and use sliders to adjust features, guided by reference photos, similar to a video game character creator. While scanning offers greater accuracy, manual editing with the Creator’s wide range of facial features and skin tones can achieve a decent likeness. Experienced artists can get close, and textures (e.g., beard, freckles) can be refined manually, though it’s more time-consuming.
- How accurate will my MetaHuman look compared to me?
A well-executed MetaHuman can be very accurate, often strikingly resembling you, especially with a good scan and the Mesh to MetaHuman process. It captures face-shape details but may smooth or average features for plausibility, missing tiny asymmetries or pores unless they’re added manually. Minor tweaks in the Creator (e.g., eyes, jawline) and accurate hair/eyebrows enhance the likeness. Community examples show recognizable results with minimal adjustments; it won’t be a perfect clone without some fine-tuning.
- How do I capture my hair and unique features (like tattoos or beard) for my MetaHuman?
Hair and unique features need separate handling:
- Hair: MetaHuman Creator offers premade hairstyles; choose the closest match. For custom styles, use PixelHair for realistic grooms (e.g., dreadlocks, braids).
- Facial Hair: Select and adjust beard/stubble options in the Creator. If a scan includes a beard, smooth the mesh and apply a Creator beard for better results.
- Tattoos/Skin Marks: Manually edit the MetaHuman’s skin texture in Photoshop/Substance Painter or use Unreal’s Decals/Mesh Decals to project tattoos/marks. Raised scars may appear in the mesh, but color patterns require texture work.
- What tools and software are required for the Mesh to MetaHuman workflow?
Required tools include:
- Unreal Engine 5 with the MetaHuman Plugin enabled for Mesh to MetaHuman conversion.
- MetaHuman Creator account (free) for online access.
- Quixel Bridge (integrated in UE5) to download MetaHumans.
- Photogrammetry/Scanning software: RealityCapture, Metashape, Meshroom, or apps like RealityScan/Polycam for 3D model creation.
- 3D editing software (optional): Blender, ZBrush, or Maya for scan cleanup.
- Motion capture tools (optional): Live Link Face app or Rokoko Studio for animation, not needed for creation.
- Hardware: A decent PC with a good CPU/GPU and sufficient RAM; a camera or smartphone for capture; a GPU with CUDA cores speeds up photogrammetry.
The essential software is free (Unreal, Blender, RealityScan); hardware and time are the main barriers.
- Does Mesh to MetaHuman work on Mac, or is it Windows-only?
Mesh to MetaHuman is Windows-only; the MetaHuman Plugin doesn’t support Mac versions of Unreal Engine. Options for Mac users include running Windows via Boot Camp (Intel Macs), borrowing a PC for the conversion, or sending the mesh to a PC user for processing. The generated MetaHuman can be used elsewhere afterward, but creation requires Windows.
- Is MetaHuman creation free, and can I use my avatar commercially?
MetaHuman creation is free via Unreal Engine and MetaHuman Creator. MetaHumans can be used royalty-free in commercial Unreal Engine projects (games, films, VR) without fees. Using MetaHumans outside Unreal requires checking Epic’s EULA, as the assets are intended for use with Unreal. Selling the MetaHuman data itself is prohibited, but monetizing projects (e.g., films, YouTube) is unrestricted, with standard Unreal royalties applying to high-earning games.
- How can I animate my MetaHuman once it’s created (if I don’t have motion capture gear)?
Animation options without mocap gear include:
- Manual Keyframe Animation: Use Unreal’s Control Rig or Maya (via Bridge) to animate the face/body by setting keyframes in Sequencer.
- Pre-made Animations: Retarget body animations from Unreal Marketplace/Mixamo or blend preset facial poses using Unreal’s IK Retargeter. Audio-to-animation plugins (e.g., MetaHuman Animator, FaceFX) generate facial animations from dialogue.
- Live Link with Devices: Use webcams, Xbox Kinect, or iPhone/iPad (via Live Link Face app) for basic motion capture, especially facial.
- Mixing Techniques: Combine pre-made body animations with phone-captured facial animations or use video references for manual animation.
Unreal’s tools and libraries enable animation, with iPhone-based MetaHuman Animator offering high-quality results without professional setups.
- What can I do with a digital MetaHuman of myself once it’s made?
Use cases include:
- Film/VFX: Use as a stand-in, stunt double, or for previsualization in films.
- Games: Feature as a protagonist or cameo in games, especially VR/AR.
- Content Creation/Streaming: Use as a photorealistic VTuber-style avatar for YouTube or streaming.
- Marketing/Social Media: Create content with digital clones or virtual styling (e.g., outfits).
- Training/Simulation: Use in VR training as a virtual instructor.
- Archiving/Digital Legacy: Preserve your likeness for future use.
- Art/Experimentation: Incorporate into interactive art or explore modified versions (e.g., aged self).
MetaHumans are versatile for any 3D character need, enhancing virtual world applications, but should be used thoughtfully as they represent you.

Conclusion
Creating a MetaHuman merges photogrammetry, scanning, Mesh to MetaHuman, PixelHair, and motion capture to craft a hyper-realistic digital avatar, transforming a detailed 3D scan into a rigged character with personalized hair and lifelike movements. Once exclusive to big studios, these accessible tools now empower enthusiasts to produce avatars for films, games, or virtual production, requiring careful attention to detail for accuracy. The workflow democratizes 3D artistry, enabling creators to explore endless possibilities, from cinematic previsualization to interactive experiences, as technology continues to lower barriers. MetaHumans blur the line between reality and digital worlds, offering a versatile digital double ready for any creative vision.
Creating a MetaHuman of yourself blends artistry and technology, transforming your likeness into a high-fidelity digital character in Unreal Engine 5. The process involves:
- Photogrammetry/Scanning: Capturing your face in 3D with a smartphone or camera to create a detailed model.
- Mesh to MetaHuman: Converting the scan into a fully rigged MetaHuman using Unreal Engine 5’s workflow, with fine-tuning in MetaHuman Creator for accurate features, skin tone, and eyes.
- PixelHair: Adding realistic, customized hair from PixelHair’s diverse hairstyle library to match your unique look, enhancing recognizability.
- Motion Capture: Animating the MetaHuman with facial (via iPhone-based MetaHuman Animator or Live Link Face) and body (via mocap suits like Rokoko or AI-based solutions) capture to replicate your movements and expressions, or using keyframe animation and pre-made libraries.
This workflow, once exclusive to big studios, is now accessible to 3D artists, freelancers, small studios, and enthusiasts using tools like Unreal Engine (free), RealityScan, Live Link Face, and affordable mocap options. Accuracy is key:
- A thorough scan ensures a strong base.
- Careful Mesh to MetaHuman setup and Creator refinements elevate likeness.
- Personal touches like hair, scars, or moles add personality.
- Motion capture infuses your mannerisms, making the avatar truly “you.”
Applications include previsualization, virtual production, filmmaking (e.g., stunts, stand-ins), games (playable characters/NPCs), content creation (VTubing, streaming), commercials, training simulations, and digital influencer roles. Agencies can use MetaHumans for virtual talent or AI-driven marketing videos. The technology’s democratization enables services like digitizing clients or including developers in games.
The toolset (RealityCapture, MetaHuman Creator, PixelHair, Live Link Face) offers unprecedented capabilities for creating hyper-realistic digital humans. Use cases span the practical (VFX, simulation) and the creative (art, immersive experiences). As tools evolve, possibly with AI assistance, barriers will lower further. Your MetaHuman, looking and moving like you, is ready for Unreal Engine projects, whether starring in films, games, or virtual worlds, limited only by your imagination.
Recommended
- Unreal Engine for Beginners: A Step-by-Step Guide to Getting Started
- Understanding PixelHair: A Comprehensive Guide to Realistic 3D Hair Assets
- Camera Switching in Blender Animations: Mastering The View Keeper
- How do I align the camera to an object’s surface in Blender?
- Can you control a camera in Blender with game controllers?
- Mastering Water Simulation in Unreal Engine: Techniques, Tools, and Best Practices
- Metahuman and NVIDIA Omniverse Audio2Face: Create Real-Time Facial Animation for Unreal Engine 5
- Is Arcane 3D Animated? Unveiling the Secrets Behind Its Revolutionary Style
- How Do I Lock the Camera to the View in Blender?
- How do I create a dolly zoom effect in Blender?