Cinematic Motion Capture with Move One and MetaHuman Animator
This guide explains how to achieve film-quality motion capture using Move One (an AI motion capture tool) and MetaHuman Animator in Unreal Engine 5 without a motion capture suit. It covers body and facial capture, animation syncing, and realistic hair integration with PixelHair, and is written for creators of all experience levels.
What is Move One and how does it work for motion capture?
Move One by Move AI is a markerless motion capture solution using a single camera (e.g., iPhone) to capture full-body motion via AI and computer vision. The process includes:
- Record: Film one actor at a time using the Move One iOS app on a stable smartphone.
- Upload & Process: Upload video to Move AI’s platform for cloud-based AI processing to create a 3D animated skeleton.
- Download Motion Data: Receive an animation file (FBX or USD) in minutes for retargeting to 3D characters.
Move One is portable and affordable, ideal for indie projects and prototyping, though it may miss subtle nuances compared to high-end systems like Vicon.

Can I use Move One for cinematic-quality motion capture without a suit?
Yes, Move One enables cinematic-quality mocap without suits, capturing complex actions like fights and dances for lifelike digital characters. Examples include:
- “Gigantic Joe” short film: Used Move One for body motion capture and MetaHuman Animator for facial expressions, achieving AAA-level animation.
- “Banished” by David Stapp: A UE5 short film with Move One for all character motion, completed by a solo creator in ~2 months.
- “Divine Dose” by Buse Simon: Showcased believable full-body acting with Move One on UE5.5 MetaHumans.
For cinematic quality, ensure:
- Good Video Input: Record in well-lit areas with minimal occlusion at 60 FPS for smoother capture.
- Post-Processing: Polish animations in Unreal (IK, Control Rig) or tools like Cascadeur to fix jitter or foot sliding.
- Combine with Facial Motion Capture: Use MetaHuman Animator for face capture to complement body data.
Note: Cinematic quality is achievable, but manual cleanup is often needed. Move One’s accuracy is slightly below top-tier systems, requiring in-engine refinement.
How do I set up AI mocap with Move One and Metahuman Animator in Unreal Engine 5?
Setting up AI motion capture with Move One and MetaHuman Animator in UE5 involves:
- Capture Body Motion with Move One:
- Record actor’s full-body performance using Move One app (iPhone recommended, tripod for stability).
- Process in Move AI’s cloud, download _moveai.fbx file.
- Import and Retarget to MetaHuman:
- In UE5, create a folder for MoveOne assets, add Move AI’s retargeting assets (IK rigs).
- Import Move One FBX.
- Use IK Retargeter (MoveOne_Metahuman.uasset):
- Target IK Rig: IK_Metahuman.uasset.
- Source Preview Mesh: Move AI skeletal mesh (_moveai).
- Auto-map bones, preview on MetaHuman.
- Export retargeted animation to MetaHuman rig.
- Capture Facial Performance with MetaHuman Animator:
- Record face with iPhone (Face ID recommended) via Live Link Face or MetaHuman Animator.
- Process video in MetaHuman Animator (UE 5.2+), generating facial animation using GPU 4D solver.
- Apply to MetaHuman (via Blueprint or animation sequence).
- Sync Body and Face Animations:
- In Level Sequencer:
- Add body animation to Body track.
- Add facial animation to Face track (via Animation Sequence, Anim Blueprint, or Control Rig).
- Key point: Combine body and face rigs using “Use Animation Asset” for body and ARKit for face. Adjust “Face Retarget Root” or use Animation Blueprint if neck issues occur.
- Tip: Sequencer blends animations; check Blueprint for unified mesh if head detaches.
- Playback and Adjust:
- Play sequence; body (Move One) and face (MetaHuman Animator) should sync.
- Use Sequencer to trim or adjust timing for lip-sync and body alignment.
This pipeline enables quick, high-fidelity mocap for small teams using an iPhone and PC, merging Move One’s body motion with MetaHuman Animator’s facial animation in UE5.
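If you have many takes, the FBX import step above can be scripted. Below is a minimal sketch using Unreal’s built-in Python Editor Scripting (Python plugin enabled); the paths, asset names, and the “_moveai” skeleton reference are placeholders for illustration, and the IK Retargeter step itself is still done in the editor UI as described above.

```python
# Minimal sketch: import a Move One FBX as an Animation Sequence with Unreal's
# Python Editor Scripting. Paths and asset names are placeholders.
import unreal

def import_moveone_fbx(fbx_path, dest_path, skeleton_path):
    """Import a Move One FBX as an AnimSequence onto an existing skeleton asset."""
    skeleton = unreal.load_asset(skeleton_path)  # e.g. the "_moveai" skeleton already in the project

    options = unreal.FbxImportUI()
    options.set_editor_property("import_mesh", False)
    options.set_editor_property("import_animations", True)
    options.set_editor_property("import_as_skeletal", False)
    options.set_editor_property("skeleton", skeleton)
    options.set_editor_property("mesh_type_to_import", unreal.FBXImportType.FBXIT_ANIMATION)

    task = unreal.AssetImportTask()
    task.set_editor_property("filename", fbx_path)
    task.set_editor_property("destination_path", dest_path)
    task.set_editor_property("automated", True)
    task.set_editor_property("save", True)
    task.set_editor_property("options", options)

    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
    return task.get_editor_property("imported_object_paths")

# Example (hypothetical paths):
# import_moveone_fbx("C:/Mocap/take01_moveai.fbx", "/Game/Mocap", "/Game/MoveAI/SK_moveai_Skeleton")
```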

Is it possible to animate Metahuman characters without wearing a motion capture suit?
You can animate MetaHuman characters realistically without a mocap suit using AI tools and Unreal Engine:
- Move One for Body: Quality body animations from a phone camera, retargeted to MetaHuman.
- MetaHuman Animator for Face: iPhone captures facial nuances, no markers or rigs needed.
- Control Rig for Hand-Keyed or Touch-ups: Manually adjust bones (e.g., fingers) in Unreal, suit-free.
- Other AI Motion Capture Alternatives: Plask (free webapp, video to FBX), DeepMotion, or RADiCAL (real-time to Unreal) use webcams/videos.
Suitless workflow:
- Record body with one camera, face with iPhone.
- Process body with Move One (or similar), face with MetaHuman Animator.
- Import, retarget, and combine in Sequencer.
No suits or markers, just standard devices and AI. Move AI claims suits are obsolete, cutting costs and complexity. Developers confirm MetaHumans animated in UE5 with these methods are commercially viable under Unreal’s license, enabling short films, cutscenes, and ads without big studio setups.
What are the advantages of AI mocap tools like Move One for indie creators?
AI motion capture tools (Move One, RADiCAL, DeepMotion, Plask, etc.) provide indie creators and small studios with key benefits:
- Low Cost Entry: Move One offers a free tier (30 seconds/clip, 30 credits) and affordable plans ($15 or $35), far cheaper than $5,000+ mocap suits or $20k optical setups. Plask is free for basic use, RADiCAL has a free trial.
- No Special Hardware: Requires only a smartphone or webcam (e.g., Move One uses one iPhone). No gloves, helmets, or calibration gear needed.
- Flexibility and Portability: Capture motion anywhere (living room, backyard, or on location) without a suit or fixed volume.
- Fast Turnaround: No lengthy setup; record instantly, with cloud processing (e.g., Move One) delivering data in minutes for quick iterations.
- Scalability for Solo or Small Teams: Solo devs can capture all characters themselves, no extra actors required.
- Integrated Workflows: Exports (e.g., Move One to FBX, USD, .blend) integrate with Blender, Unreal, etc., via plugins and retargeting setups.
- Continuous Improvement: AI updates (e.g., Move AI’s Gen-2) enhance accuracy and features without new hardware costs.
Example: A freelance animator can use Move One to act out a scene, process it, and deliver a rough Unreal sequence in a day, a turnaround previously impractical without suits or stages. AI motion capture lowers cost, equipment, and skill barriers, enabling indie creators to produce quality animations efficiently, aligning with Move AI’s mission to democratize mocap for immersive storytelling at consumer price points.

How do I record body motion with Move One using just a phone or webcam?
Recording with Move One is simple:
- Equipment: Use an iPhone (11 or newer recommended). Move One’s app is iOS-only (Android users: consider browser-based tools like Plask).
- Setup the Shot:
- Place phone on a stable surface/tripod at chest/head height.
- Ensure good lighting, plain background.
- Frame full body, 8–12 feet away; stay in frame.
- Use the Move One App:
- Open Move AI app, select Move One capture.
- Follow calibration prompts if needed.
- Record up to 30s (free tier; paid plans allow 60s+).
- Avoid extreme occlusion or spins.
- Uploading and Processing:
- End take; app uploads to Move AI cloud.
- Wait for processing notification.
- Log into Move AI platform on PC; export as FBX, USD, or video.
- Using a Webcam (Alternate):
- Record with webcam/DSLR, upload via Move AI web platform.
- Video specs: ideally 1080p 60fps, MP4.
- Use Single File Upload; process/export as above.
Tips for best results:
- Start with a T-pose if possible.
- Move One is single-camera (Move Pro offers multi-cam).
- Position phone at waist height or above, avoiding extreme angles.
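Before uploading webcam or DSLR footage, it can help to confirm the clip actually meets the specs mentioned above (ideally 1080p at 60 fps). A minimal sketch using ffprobe (part of FFmpeg) is shown below; the filename is a placeholder.

```python
# Minimal sketch: check a clip's resolution and frame rate with ffprobe (FFmpeg must
# be installed and on PATH). The filename is a placeholder.
import json
import subprocess

def probe_clip(path):
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=width,height,r_frame_rate",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    )
    stream = json.loads(result.stdout)["streams"][0]
    num, den = (int(x) for x in stream["r_frame_rate"].split("/"))
    fps = num / den
    print(f'{stream["width"]}x{stream["height"]} @ {fps:.2f} fps')
    if stream["height"] < 1080 or fps < 59:
        print("Warning: aim for 1080p at 60 fps for smoother tracking.")

# probe_clip("take01.mp4")
```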
How do I sync Move One body motion capture with MetaHuman facial animation?
- Common Timeline/Reference: Record body/face simultaneously with a sync point (e.g., clap); or use audio timing.
- Apply Animations in Unreal:
- Import body (FBX) and face (animation asset) to Sequencer.
- Add tracks for body and face; align using sync point.
- MetaHuman Blueprint Trick:
- Bake facial anim to Face Control Board.
- Set body to Use Animation Asset, blend in facial rig.
- Checking Sync: Adjust facial track in Sequencer if off.
- Keep Head from Double Transforming:
- Disable head rotation in facial capture or use IK Retargeter.
- Test with Dialogue: Ensure body gestures and lips align; tweak in Sequencer.
Syncing combines body and face seamlessly using Unreal’s Sequencer.
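For the clap-based sync point mentioned above, you can estimate the offset between the body and face takes from their audio before lining them up in Sequencer. The sketch below is a rough illustration, assuming 16-bit mono WAV files extracted from each recording; the file names and the 24 fps timeline are placeholders.

```python
# Minimal sketch: find the loudest transient (the clap) in each take's audio and
# report the offset to apply in Sequencer. Assumes 16-bit mono WAVs extracted from
# both recordings; file names and the 24 fps timeline are placeholders.
import struct
import wave

def clap_time(wav_path):
    """Return the time in seconds of the loudest sample, a rough clap marker."""
    with wave.open(wav_path, "rb") as w:
        raw = w.readframes(w.getnframes())
        samples = struct.unpack("<%dh" % (len(raw) // 2), raw)
        peak_index = max(range(len(samples)), key=lambda i: abs(samples[i]))
        return peak_index / w.getframerate()

body_clap = clap_time("body_take.wav")
face_clap = clap_time("face_take.wav")
offset = face_clap - body_clap
print(f"Shift the face track by {offset * 24:.1f} frames on a 24 fps timeline")
```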

Can I use Metahuman Animator for facial capture alongside Move One for body movement?
Using MetaHuman Animator for face and Move One for body is an effective workflow in Unreal Engine 5:
- Move One: Captures full-body motion from video (no suit).
- MetaHuman Animator: Captures facial performance from video (no markers), providing high-fidelity facial animation.
Integration steps:
- Capture Performances: Ideally simultaneously; if solo, record body first, then face while matching timing.
- Process: Body in Move AI, face in MetaHuman Animator (requires UE 5.2+ and MetaHuman Plugin).
- Import/Apply in Unreal:
- Import Move One’s FBX, retarget to MetaHuman (body).
- Use MetaHuman Animator to solve face:
- Import facial video or use take recorder (e.g., iPhone).
- Outputs Anim Sequence for face, applied via AnimBP or control rig.
- Sync both anims on the same MetaHuman.
Considerations:
- MetaHuman Animator animates the facial rig (jaw, eyes, head rotation if enabled); Move One animates the full skeleton, including head/neck. Decide which source drives the head: usually the body capture for continuity, with the face capture driving only the facial bones.
- Workaround: Zero out neck rotation in face anim if it conflicts with body.
- Success: Epic Games supports pairing MetaHuman Animator with mocap; a Reddit user combined it with Move AI for film-quality results.
Benefits:
- Cost-effective: Cheaper than face camera rigs or XSens suits.
- Simple: Uses iPhones (one for face, one for body).
- High quality: MetaHuman Animator delivers movie-quality facial animation fast.
Tip: Use iPhone 12+ for MetaHuman Animator (selfie depth sensor) and calibrate with a neutral pose for accurate results. This creates a MetaHuman with natural body movement and convincing facial expressions, no markers needed.
How can PixelHair be used to add cinematic-quality hair to Metahuman characters animated with AI motion capture tools like Move One?
PixelHair is a library of realistic hair assets (grooms) created in Blender for MetaHumans or custom characters, exportable to Unreal Engine to enhance film-quality hair simulation and rendering.
To use PixelHair with a MetaHuman (with or without mocap):
- Obtain a PixelHair Asset: Offers hairstyles (braids, dreadlocks, buns, etc.) as Blender files with particle hair systems, available from BlenderMarket or Yelzkizi. Includes a hair mesh (cap) and particle system data.
- Export from Blender to Unreal:
- Open the .blend file in Blender, adjust if needed (length, style).
- Export via Blender’s Alembic exporter (.abc) for hair grooms or as a static mesh + groom asset using the Blender to Unreal pipeline.
- Some packages include Unreal projects or FBX with .uasset files.
- Import into Unreal:
- Import the Alembic groom, creating a Groom Asset and Skeletal Mesh (if cap included).
- Enable the Groom Plugin in UE5 for rendering.
- Create a Groom Binding Asset to bind hair to the MetaHuman’s scalp mesh.
- Attach to MetaHuman:
- In the MetaHuman Blueprint, remove default hair or set to None.
- Add a Groom Component, assign the PixelHair groom, and attach to the head bone (Head or NeckTop socket).
- Import and attach the hair cap mesh, replacing the default if needed.
- Cinematic Quality Tweaks:
- Adjust LOD and Hair Materials for realism (high strand counts, volume, lighting reaction).
- Enable Chaos Hair simulation for physics on long or moving hair; static grooms work for cinematics.
- Use high-quality settings in Movie Render Queue for hair geometry.
PixelHair enhances MetaHumans animated by Move One and MetaHuman Animator with realistic, thick, clumped hair (e.g., braids, afros) that outperforms default hair, matching big-budget production quality. It’s an affordable way for indie filmmakers to elevate character visuals, especially in close-ups, boosting production value when paired with mocap animation.
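As a rough illustration of the Blender-to-Unreal export step above, the snippet below exports a selected hair object to Alembic with Blender’s Python API. The object and file names are placeholders; defer to the instructions bundled with your PixelHair package where they differ.

```python
# Minimal Blender sketch: select the hair object and export it as Alembic (.abc) for
# Unreal's Groom importer. Object and file names are placeholders.
import bpy

hair_obj = bpy.data.objects["PixelHair_Braids"]  # mesh carrying the particle hair system
bpy.ops.object.select_all(action="DESELECT")
hair_obj.select_set(True)
bpy.context.view_layer.objects.active = hair_obj

bpy.ops.wm.alembic_export(
    filepath="//pixelhair_braids.abc",  # "//" saves next to the .blend file
    selected=True,                      # export only the selected hair object
)
```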

How do I combine Move One motion capture data with Metahuman in Sequencer for cinematic scenes?
Combining motion capture data with MetaHuman in Sequencer:
- Import Mocap and Retarget: Retarget Move One motion to MetaHuman skeleton as “MyMoveAnim”.
- Prepare MetaHuman Actor: Place MetaHuman Blueprint (BP) in level, using BP_MetaHuman_Base class.
- Open Level Sequencer:
- Add MetaHuman BP to Sequencer, showing sub-tracks (Body, Face, Control Rig, etc.).
- Add Body Animation:
- On Skeletal Mesh track (e.g., “SKM_(MetaHumanName)_Body”), add MyMoveAnim clip.
- Add Facial Animation (if any):
- On Face track, add facial anim (e.g., “MyFaceAnim”) or use Control Rig/Take Recorder.
- Alternatively, bake face anim via Live Link or export from MetaHuman Animator to Anim Sequence.
- Align and Blend:
- Body anim might override face; use Sequencer weighting or blending to combine the two.
- Camera and Scene:
- Add Cine Camera to frame MetaHuman’s performance; scrub timeline to check motion.
- Adjusting if Something’s Off:
- Fix foot sliding/jitter via animation cleanup or IK retargeter.
- Blend animations if head is static (e.g., Layered Blend per Bone or exclude neck rotation).
- Playback in Cinematic Quality:
- Render via Movie Render Queue for high-quality output with realistic details.
For multiple MetaHumans, repeat per character. Sequencer treats motion capture as timeline clips, enabling flexible, non-linear blending and complex scenes. Conclusion: Import, retarget, assign, and play Move One mocap to drive a photoreal MetaHuman in Unreal’s cinematic pipeline.
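Assigning the body clip can also be scripted with Unreal’s Sequencer Python API (Sequencer Scripting plugin enabled). The sketch below adds a skeletal animation track to an actor binding; the asset paths, frame range, and selected-actor assumption are placeholders, and in practice you would usually drop the clip onto the MetaHuman’s Body track by hand as described above.

```python
# Minimal sketch: add a retargeted clip to a Level Sequence with the Sequencer
# Python API. Asset paths, frame range, and the selected actor are placeholders.
import unreal

sequence = unreal.load_asset("/Game/Cinematics/SEQ_MyScene")  # a LevelSequence asset
anim = unreal.load_asset("/Game/Mocap/MyMoveAnim")            # retargeted AnimSequence
actor = unreal.EditorLevelLibrary.get_selected_level_actors()[0]  # MetaHuman in the level

binding = sequence.add_possessable(actor)
track = binding.add_track(unreal.MovieSceneSkeletalAnimationTrack)
section = track.add_section()
section.set_range(0, 240)  # frames; match the clip length

params = unreal.MovieSceneSkeletalAnimationParams()
params.set_editor_property("animation", anim)
section.set_editor_property("params", params)
```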
What’s the workflow for editing and refining AI motion capture animation in Unreal Engine?
AI motion capture (e.g., Move One) provides raw animation data that may need refinement for issues like foot sliding, jitter, or timing. Here are key methods to edit animations in Unreal Engine:
A. Control Rig Editing (In Unreal):
- Open Animation Sequence in editor, bake to MetaHuman Control Rig.
- Adjust keyframes (e.g., fix foot sliding with IK foot control).
- Bake back to Animation Sequence.
- Allows tweaks without external software.
B. Take Recorder & Animation Layers:
- Add Control Rig track in Sequencer to layer adjustments (e.g., head turn, finger curl).
- Use Animation Motion Warping or Pose Editing (UE 5.1+) to fix foot sliding or root motion.
C. Export to DCC for Cleanup:
- Export to Maya/Blender for heavy cleanup using animation layers, HumanIK, or Graph Editor.
- Re-import FBX to Unreal.
- Preferred by some (e.g., MostHost_LA) for advanced f-curve editing.
D. Tools like IKINEMA or Cascadeur:
- IKinema (deprecated) or Move AI’s Gen-2 for foot stability.
- Cascadeur (free for indie) offers physics-based cleanup; import FBX, correct motion, export back.

E. Smoothing Filters:
- For jittery data, export to MotionBuilder or use UE plugins to smooth curves.
- In Control Rig, adjust tangents or apply Python smoothing.
F. Foot Sliding Fixes:
- Use IK foot planting in Control Rig to lock feet.
- Apply Motion Warping to enforce foot position.
- Manually adjust foot bone curves to flatten movement.
General Workflow:
- Apply motion capture, list issues (e.g., foot slide at frames 100-120).
- Choose in-Unreal (Control Rig for small tweaks) or external fix.
- For in-Unreal: add Control Rig in Sequencer, correct poses, blend or bake.
- For retargeting issues, adjust IK retargeter mapping.
- Save polished animation, test from multiple angles for cinematics.
- Keep original motion capture file, use non-destructive edits (new assets/layers).
Example: Adjust hand trajectory in Control Rig to align with a cup in mocap, preserving other motions. Test thoroughly, balance fixes across angles, or hide minor flaws off-camera. Unreal’s tools increasingly support in-engine fixes for efficient workflows.
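To make the smoothing-filter idea from section E concrete, here is a minimal sketch of a centered moving average applied to one exported animation curve (for example, a bone rotation channel dumped to CSV). It runs outside Unreal on exported data; the sample values are placeholders.

```python
# Minimal sketch: a centered moving average over one exported animation curve
# (per-frame values, e.g. a bone rotation channel). Sample values are placeholders.
def smooth_curve(values, window=5):
    """Return a jitter-reduced copy of a per-frame value list."""
    half = window // 2
    smoothed = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        smoothed.append(sum(values[lo:hi]) / (hi - lo))
    return smoothed

noisy = [10.0, 10.4, 9.7, 10.9, 9.5, 10.2, 10.1, 9.8]
print(smooth_curve(noisy))  # gentler curve; re-key or re-import after smoothing
```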
How accurate is Move One compared to traditional motion capture suits for film-quality animation?
Move One, a single-camera AI motion capture system, has notable accuracy but limitations compared to high-end mocap suits or multi-camera optical systems:
- Positional Accuracy: High-end optical (e.g., Vicon) and inertial suits (e.g., Xsens) offer sub-millimeter precision and capture complex interactions. Move One loses some depth and minor motions, with tests showing Vicon outperforming it, though it’s adequate for prototyping.
- Stability: Mocap suits may drift but have filtering; Move One can jitter with poor video quality. Gen-2 models improve, but suits handle fast motions (e.g., sprinting) better due to no blur or occlusion issues.
- Occlusion Handling: Multi-camera systems and suits manage occlusion well; Move One struggles with overlapping limbs from a single view, leading to potential errors during extended occlusion.
- Cinematic Nuance: Move One captures subtle acting (e.g., shrugs) well for a markerless system, but high-fidelity suits (e.g., OptiTrack, Rokoko) excel at micro-movements. Suits may have noise, while Move One’s AI smooths some motions.
- Hands/Fingers: Move One tracks body only, no fingers. Many suits also require gloves for finger tracking, often needing separate animation (e.g., Leap Motion, manual).
- Face: Move One excludes face capture. Traditional systems sync face and body, but MetaHuman Animator, run as a separate pass, matches or beats marker-based face capture.
- Bottom Line: Move One is suitable for film-quality animation with cleanup, offering convenience and cost benefits, while suits and optical systems like Vicon lead in precision. A Reddit user noted no system matches video perfectly for VFX, favoring keyframes, but Move One suffices for original animation. Short films prove its adequacy for audiences if paired with storytelling.
- Additional Angle: No markers enhance natural acting, potentially offsetting minor accuracy losses with lifelike movement.
- Conclusion: Traditional suits retain an edge in precision and complex scenarios, but Move One is “accurate enough” for cinematic use with cleanup. AI advancements narrow the gap, making it ideal for indie projects and previz, balancing slight accuracy trade-offs with cost and speed gains.

Can I livestream performance capture using Move One and Metahuman Animator?
Livestreaming with Move AI and MetaHuman tools varies by product:
- Move One: Not real-time; records first, processes data later. Unsuitable for live-streaming into Unreal.
- Move Live: Real-time markerless motion capture (multi-camera, likely enterprise). Could stream into Unreal via plugin/API.
- MetaHuman Animator (Face): Offline processing, but Live Link Face (iPhone) streams facial performance live to MetaHuman rig for puppeteering/broadcast.
- Alternative Body Options:
- Rokoko Video/Smartsuit: AI video (beta) or suits for real-time (costly).
- Live Link via Kinect: Low-quality live capture with Nuitrack/IKINEMA Orion.
- RADiCAL: Real-time plugin for Unity, Unreal via Canvas (cost/quality trade-offs).
Likely Livestream Setup:
- Face: Live Link Face (free).
- Body: Kinect (rough) or avoid full-body if quality matters, unless using Move Live.
With Move Live (or future Move One real-time):
- Cameras feed Move software for real-time skeleton data.
- Pipe via plugin/API to Unreal (Live Link source).
- MetaHuman follows body, with Live Link Face for simultaneous facial capture.
Latency & Hardware: Expect slight lag (~2 frames); needs strong PC/GPU for MetaHuman rendering + live solving.
Use Cases:
- VTubing/virtual livestreams.
- Live events/theater with digital characters.
- On-set visualization.
Summary: Move One and MetaHuman Animator are offline. Use Move Live or alternatives (Rokoko, Kinect) for body, Live Link Face for facial. Hybrid setups (e.g., Rokoko suit + Live Link) enable live MetaHuman puppeteering. Move Live’s potential public release could simplify suitless real-time mocap.
What budget setup do I need to start cinematic motion capture with no suit?
A budget setup for AI motion capture without a suit:
Camera for Body:
- iPhone 11+ (Move One is iOS-only); older models (e.g., iPhone X) work, or any 1080p camera with manual upload to Move AI’s web app.
- Alternative: Plask (free, web-based) with webcam/phone.
Tripod or Stand:
- $20 smartphone tripod for stability.
PC or Laptop:
- CPU: Intel i7/Ryzen 7+.
- GPU: GTX 1660 minimum, RTX 3060+ recommended for MetaHumans (a recent NVIDIA RTX card is recommended for MetaHuman Animator’s GPU solver).
- RAM: 32GB ideal, 16GB minimum.
- Desktop preferred over laptop for power-per-dollar.
iPhone for Face:
- iPhone 12+ best for MetaHuman Animator; iPhone X/11 with TrueDepth still works.
Software & Accounts:
- Unreal Engine 5: Free (Epic account).
- MetaHuman: Free (via MetaHuman Creator).
- Move AI: Free plan, or $15/mo Starter (180 credits, ~3 min animation).
- Blender: Free (optional for hair/cloth edits).

Additional:
- Lighting: Softbox lights ($50) or bright daylight.
- Backdrop: Plain sheet ($15) for clear AI detection.
- Motion Props: Cardboard/replicas for acting (not captured).
Budget Estimate:
- With mid-range PC/iPhone: ~$100 (tripod, lights).
- Need iPhone: ~$400 (used iPhone XR ~$300 + accessories).
Notes:
- Single-actor capture; multi-actor needs separate recordings or enterprise setup.
- Audio: Separate lavalier/shotgun mic or dub later.
Case Study: Small agency uses iPhone, MetaHuman, and minimal gear for a rough commercial previz in a day.
This setup (smartphone, PC, free/cheap software) is far cheaper than Xsens (~$13k) or OptiTrack systems, making AI motion capture affordable.
How do I avoid common issues like foot sliding and jitter in AI motion capture?
Foot sliding and jitter are frequent mocap issues, especially in markerless systems. Here’s how to address them:
- Foot Sliding:
- Proper Retargeting: Fix sliding from mismatched skeletons in IK Retargeter; set root to translation and map feet IK for MetaHumans.
- Enable Root Motion: Use root motion to keep feet planted if animation includes movement.
- IK Foot Lock: Pin feet with Two-Bone IK in MetaHuman Control Rig during ground contact in Sequencer.
- Adjust in Graph Editor: Flatten foot translation curves and stabilize rotation for stillness.
- Use Anim Warping: Retarget with “Globally Scaled” root or adjust root/foot effectors via Control Rig layer.
- Jitter (Shaky Motion):
- Smoothing in Move AI: Apply filters if available; Gen-2 may improve output.
- Skeleton Bone Filter: Add Smooth Transform in Unreal AnimBP or export to Blender for noise filtering.
- Reduce Keys: Smooth noise in Blender (Smooth Keys) or Maya (Butterworth filter) by reducing keyframes.
- Add Inertia via Physics: Use physics constraints in UE to settle jitter (complex, rarely needed).
- High Framerate Capture: Record at 60fps to reduce jitter; 30fps increases it.
- Motion Compression: Avoid over-compression in FBX import settings to prevent frame jitter.
- Other Artifacts:
- Floating/Sliding Root: Adjust root bone’s vertical curve for ground contact.
- Arm or Leg Pops: Add keys or tweak interpolation to ease IK snapping.
- Finger Twitch: Constrain small rotations to zero if fingers jitter.
- Tools:
- Unreal Control Rig Baker: Bake to Control Rig, smooth jitter by adjusting controllers.
- Mocap Cleanup Tools: Use MotionBuilder’s filters for foot contact and jitter removal.
- Game IK: Apply FABRIK/CCD IK at runtime for games; fix animations for cinematics.
- Validate at Source:
- Check Move One setup for camera/actor movement or scale errors; calibrate with reference markers.
- Use Move Pro multi-cam for better depth and less sliding.
- Bake to MetaHuman Control Rig and constrain IK feet to world to eliminate sliding.
Summary: Prevent foot sliding with retargeting and IK locking; reduce jitter with filtering and keyframe edits. These fixes are standard in mocap pipelines, requiring motion editing to refine raw data into polished animations.
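As a concrete illustration of the “flatten foot translation curves” fix above, the sketch below holds a foot channel at its average value across a known contact range before re-keying. It operates on exported per-frame values, and the frame range is a placeholder.

```python
# Minimal sketch: hold a foot channel at its mean value over a known contact range to
# remove sliding, then re-key. Per-frame values and the frame range are placeholders.
def plant_foot(values, contact_start, contact_end):
    """Flatten the channel across [contact_start, contact_end] so the foot stays put."""
    planted = list(values)
    segment = values[contact_start:contact_end + 1]
    hold = sum(segment) / len(segment)
    for frame in range(contact_start, contact_end + 1):
        planted[frame] = hold
    return planted

# foot_y = [...]                        # per-frame forward translation of the foot bone
# fixed = plant_foot(foot_y, 100, 120)  # foot planted during frames 100-120
```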

Can Move One work for group performances or multi-character shots?
Move One (single-camera) captures one actor at a time, not multiple simultaneously, as the AI tracks one human figure and gets confused by occlusion if two are in frame. For group performances:
- Sequential Capture: Record each actor separately with Move One, then combine in Unreal (e.g., Actor A, then Actor B on separate MetaHumans). Use stand-ins or objects as references for interaction (e.g., handshake), aligning timing with rehearsal.
- Multi-Person Capture Solutions: Move Pro (higher-tier) supports multi-camera, multi-person capture with multiple iPhones, but it’s costlier. Budget alternatives include using multiple iPhones with Move One, though less integrated.
- Scene Assembly: In UE5 Sequencer, assemble multi-character shots with individual animation tracks, adjusting manually (e.g., hand touching shoulder) for interaction.
- Livestream Multi-Characters: Move One drives one rig at a time, not multiple unique motions simultaneously (duplication possible but not group performance).
Practical Tips:
- Sync actions with a metronome/clap or pre-recorded dialogue audio.
- Example: For a dance, capture one dancer, then the partner (referencing first capture), tweaking in-engine (e.g., IK for hand-holding).
Other Options: Record two actors with separate iPhones running Move One simultaneously for non-interacting motion, processed separately.
Summary: Move One is single-actor only. For multi-character scenes, either upgrade to multi-camera capture (Move Pro) or capture each performer separately and compose the shot in Unreal, which is common for indie teams with planning and in-engine tweaks.
How do I add camera animation and lighting to enhance cinematic motion capture scenes?
To elevate MetaHuman animations for cinematic quality in Unreal Engine 5, focus on camera animation and lighting:
Camera Animation:
- Cine Camera Actors: Use in Sequencer with real camera settings (focal length, aperture).
- Techniques:
- Framing: Close-ups for emotions, wide shots for movement.
- Movement: Keyframe pans, tilts, dollies, or shakes; use Rig Rail/Crane for complex shots.
- Depth of Field (DoF): Enable to blur backgrounds, adjust aperture and focus distance.
- Cuts and Transitions: Use multiple cameras with Camera Cut track for shot sequencing.
Lighting:
- Cinematic Setup:
- Key Light: Primary source (e.g., Directional or Spot) for highlights/shadows (e.g., 3/4 Rembrandt angle).
- Fill Light: Soft light (e.g., Sky Light) to reduce harsh shadows.
- Back Light (Rim): Rear light to outline character, separate from background.
- Practical/Environmental Lights: Add torches, neon for color/atmosphere.
- Dynamic vs Static: Use Moveable lights for realistic shadows in cinematics.
- Shadows: High-quality soft shadows via source radius.
- Global Illumination: Enable Lumen for bounce lighting.
- Volumetrics: Use Exponential Height Fog for rays/haze.

Post-Processing:
- Apply Post Process Volume for bloom, color grading, vignette; set 24fps motion blur in Movie Render Queue.
Example Setup: Nighttime MetaHuman dialogue:
- Key Light: Warm streetlamp above.
- Fill: Dim cool moonlight.
- Rim: Car headlights outlining characters.
- Camera: Close-up and over-shoulder shots with dynamic DoF (rack focus), slight sway or dolly.
- Animation: Lights (e.g., match strike) and camera focal length animated in Sequencer.
Tips:
- Split shots into Level Sequences/Sub-Sequences.
- Use cinematic viewport to pilot cameras.
- Refer to Epic’s Cinematic Lighting docs for real-world light values/HDRI tips.
Camera and lighting enhance mood and storytelling, complementing motion capture motion with emotion and focus.
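For the Cine Camera setup above, the camera can also be spawned and configured via Unreal’s Python scripting. The sketch below is illustrative; the location, focal length, and aperture values are placeholders.

```python
# Minimal sketch: spawn a Cine Camera Actor and set lens values via Unreal's Python
# scripting. Location, focal length, and aperture are placeholder values.
import unreal

camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, unreal.Vector(0.0, -300.0, 150.0)
)
cam_comp = camera.get_cine_camera_component()
cam_comp.set_editor_property("current_focal_length", 50.0)  # mm; longer for close-ups
cam_comp.set_editor_property("current_aperture", 2.8)       # lower f-stop = shallower DoF
```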
What formats and plugins are required for Move One to UE5 pipeline?
For the Move One to Unreal Engine 5 pipeline, here are the key formats and plugins:
Formats:
- FBX (.fbx): Primary Move One export format with skeletal animation; UE5 supports FBX import.
- USD (.usdc/.usdz): Move AI supports USD; UE5 imports via USD Importer plugin (enable if needed); FBX is more common.
- Blender (.blend): Move One outputs .blend; export FBX from Blender or use Send to Unreal plugin for UE5.
- Video (MP4): Preview MP4 from Move One, not needed for UE5 motion capture import.
Plugins and Tools:
- Move AI Unreal Plugin: No compiled plugin needed; uses provided .uasset IK rig files for manual retargeting.
- Unreal Engine IK Rig / IK Retargeter: Built-in UE5 tools; use Move AI’s provided IK rigs (e.g., IK_MoveOne, IK_Metahuman) in Content folder.
- MetaHuman Plugin: Enable in UE5.2+ for MetaHuman Animator; import characters via Quixel Bridge.
- Apple ARKit Face Plugin: Enable Live Link and Live Link Face plugins for live face animation.
- Control Rig: Enable plugin (default in UE5) for MetaHuman rigs.
- Alembic Groom Importer: Enable for PixelHair (Alembic hair).
- USD Importer: Enable for USD files.
- Blender Plugin (optional): Move AI’s GitHub addon for Blender retargeting.
- Python (optional): For scripting batch processes.
Character Formats:
- MetaHumans: Delivered as .uasset via Quixel Bridge, not FBX.
- Non-MetaHuman: Import FBX, ensure retargetable skeleton with defined Retarget Pose.
Pipeline Recap:
- Record in Move One -> export FBX with “_moveai” skeleton and animation.
- Add Move AI’s IK rig .uassets to UE5.
- Import FBX (create new skeleton or use _moveai).
- Use IK Retargeter (MoveOne_to_MetaHuman .uasset) to retarget animation.
- Export new animation asset for MetaHuman; use in Sequencer.
- No compiled plugins needed for offline workflow.
File Format Pitfalls:
- Ensure FBX is 2019 ASCII/binary; “Import Animations” must be on.
- You can delete or keep the “moveai avatar” skeletal mesh; the Animation Sequence is what matters.

Plugins Summary:
- Move AI: No plugin for offline; use provided assets.
- MetaHuman: Enable MetaHuman Plugin, Live Link for face, Control Rig; all free in UE5.
- Note: In UE 5.2+, the MetaHuman Animator UI processes iPhone footage into facial animation assets.
In Summary:
- Basic pipeline: iPhone -> Move One -> FBX -> UE5 -> IK Retarget -> MetaHuman.
- Required Formats: FBX (core motion capture), USD/Alembic (optional for hair/environment).
- Required Plugins: None from Move AI; use UE5’s built-in Control Rig, IK Rig, MetaHuman plugins.
- Optional: Blender plugin, Live Link for real-time.
Are there licensing restrictions when using Move One with Metahuman in commercial projects?
Move One Licensing:
- Animation data is yours to use (subscribers), licensed royalty-free for any purpose except Prohibited Use (e.g., creating competing AI, violating IP).
- Free tier may have limits (not explicitly commercial restrictions).
- Ensure performer rights if filming others.
- Safe for commercial projects (films, games); no runtime distribution issues.
MetaHuman Licensing:
- Free for Unreal Engine projects (commercial included) if rendered in Unreal.
- Can’t export to other engines or sell MetaHuman assets.
- No royalties for films/videos; 5% royalty for interactive products after $1M revenue.
- Allowed: short films, client projects. Not allowed: selling MetaHuman files.
Move One + MetaHuman Combined:
- No special restrictions; follow each license: MetaHuman in Unreal only, Move One output in any engine (Unreal here).
- Attribution not required but appreciated (e.g., “Animated with Move AI”).
- Other assets (e.g., PixelHair) have separate licenses (usually commercial use OK, no reselling).
One Gotcha:
- Move AI: Don’t use output to train AI or compete with Move AI; avoid illegal/hateful use.
- MetaHuman: Likeness rights apply if resembling real people (legal, not license issue).
Engine Royalties:
- Unreal: Free for films, no royalties for non-interactive content; 5% for games after $1M.
Summary:
- Move One: Output licensed for commercial use under a broad license (Move AI owns the algorithms, not your animations); indie-friendly for Unreal cinematics, just avoid AI training and other prohibited uses, and check free tier limits.
- MetaHuman: Free in Unreal for commercial projects; no exporting to other engines or selling the assets.
- No Fees: Beyond a Move AI subscription (if exceeding the free tier), no royalties for films/videos.

Where can I find free or affordable tools to supplement AI motion capture workflows?
- Blender: Free 3D software for fine-tuning animations (NLA/graph editors), adding physics, and customizing PixelHair assets. Offers motion capture add-ons and experimental markerless tracking.
- Plask (plask.ai): Free AI mocap web tool; upload video, get FBX, edit online. Cloud-based, limited but a Move One backup.
- DeepMotion (Animate 3D): AI motion capture with free tier (limited seconds), captures video/props, outputs FBX, comparable to Move AI.
- Radical (RADiCAL Motion): AI mocap cloud tool, once free, now limited; offers UE plugin for real-time streaming (subscription-based).
- Cascadeur (by Nekki): Free for non-commercial ($8/month indie), physics/AI animation cleanup, new 2023 Video Mocap feature for rough poses from video.
- VRoid Studio / Ready Player Me: Free cartoon avatar creators for testing motion capture; requires UE IK Rig retargeting.
- UE Marketplace: Free/cheap animations (e.g., Mixamo, Epic’s free walks) for gaps or background characters, retargetable to MetaHumans.
- FaceGood / FaceCap: Facial motion capture alternatives; FaceGood uses iPhone, FaceCap ($15) exports CSV/FBX for UE ARKit blendshapes.
- Audio2Face (NVIDIA): Free audio-to-facial animation tool, less accurate than MetaHuman Animator, mappable to ARKit curves.
- UE5 Control Rig: Free, built-in, powerful for rigging/editing in Sequencer without extra tools.
- OpenPose / AI Pose Estimation: Free libraries (Mediapipe, OpenPose) for coders to track poses via webcam, requires coding to integrate with Unreal.
- Community Resources: Cheap UE Marketplace tools (< $50) like Allright Rig, legacy IKINEMA Orion, free ALS for games, and GitHub projects (e.g., Move AI Blender plugin).
- Twinmotion/Sketchfab: Free Quixel Megascans/Sketchfab models for environments, not mocap-related.
Summary: Free/affordable tools like Blender, Plask, DeepMotion, Cascadeur, and Mixamo supplement mocap workflows, enabling a no-cost pipeline (e.g., Plask mocap, MetaHuman in UE, Blender edits). Check licenses: Blender, Mixamo, MetaHuman are commercial-free.
What are examples of short films or projects using Move One and Metahuman without motion capture suits?
Here are notable projects showcasing AI mocap + MetaHumans:
- “The Darkest Age” (JSFilmz): UE5 short film using Move.AI for body motion capture and MetaHuman Animator for faces in a medieval fight scene. High quality, with a YouTube breakdown video.
- “Gigantic Joe” (Ilya Nodiya): 2023 sci-fi test short with a MetaHuman animated via Move AI, captured by one person with an iPhone. Includes behind-the-scenes.
- “Banished” (David Stapp): ~4-min UE5.4 adventure short made solo, using MetaHumans, Move AI for body, and possibly Faceware/MetaHuman Animator for faces. BTS on LinkedIn.
- “Divine Dose” (Buse Simon): UE5.5 artistic short with MetaHumans, using Move AI’s markerless capture (Move One/Pro), showcased on Move AI’s X/Instagram.
- Astronuts (Amulet Studios): Animated series testing Move AI’s multi-camera motion capture, exploring suitless workflows.
- User Demos:
- Reddit’s LordEmil1: MetaHuman test with Move AI body and MetaHuman Animator face, showing natural gestures.
- CoPilotCo YouTube: Move One capture demo in Unreal.
- Move AI YouTube: Sample scenes (e.g., martial arts, dance).
- Hybrid workflows: Some use Rokoko suits for body and MetaHuman Animator for face, but the projects listed above are suitless.
Searchable Examples:
- “MoveAI Metahuman short film” on YouTube: Encounter (orc fight with Move AI).
- “Move AI Metahuman Cinematic”: HACKER short concept.
- Community shares on Unreal forums, YouTube, Move AI Discord.
Summary of Key Projects:
- Gigantic Joe: Sci-fi, single character.
- Banished: Action-adventure, multi-character.
- Divine Dose: Artistic, possibly dance/drama.
- Encounter: Fantasy orc battle.
- Forum tests: E.g., John Wick-style fight with Move AI and MetaHumans.
These range from tests to polished shorts, proving small teams can achieve professional animation without traditional motion capture stages, with workflows often shared online.

FAQ Questions and Answers
- Do I need an iPhone for Move One, or does Android work?
Move One’s app is iOS-only (iPhone/iPad with good camera). Android users can upload video via Move AI’s web platform. For MetaHuman Animator facial capture, an iPhone with Face ID is recommended.
- How do I retarget Move One animations to MetaHuman?
In Unreal Engine, use IK Retargeter. Import Move One FBX, load provided IK rig assets, retarget to MetaHuman in IK Retargeter, then export the new animation sequence.
- Can I use a webcam or DSLR instead of a phone?
Yes, record with any camera (e.g., DSLR) and upload via Move AI’s web app. Ensure good quality (1080p+, steady).
- What about hand/finger movements?
Move One tracks body joints (not fingers). Add finger animation manually in Unreal’s Control Rig, use Leap Motion/gloves, or skip for scenes not needing detail (e.g., running).
- How do I handle props or interactions in mocap?
Use stand-ins during recording. Animate props in Unreal to match hands. For interactions (e.g., hugging), capture separately or adjust with IK for contact.
- What’s the difference between MetaHuman Animator and Live Link Face?
MetaHuman Animator processes iPhone recordings offline for polished results. Live Link Face streams real-time face data to Unreal for live use. Use both: Live Link for preview, Animator for final.
- Why is my MetaHuman’s face expressionless despite body animation?
Body and face animations are separate. Apply facial animation via MetaHuman Animator or Anim BP in Sequencer, ensuring the Face component uses your facial asset, not default ARKit BP.
- Does Move One capture root motion?
Yes, FBX includes root motion (hip translation). Retarget with IK Retargeter to keep or ignore it, depending on whether the MetaHuman moves or stays in place.
- What are Move One’s free tier limitations?
30 lifetime credits (1 credit ≈ 1 second, so ~30 seconds total). Free tier uses Gen-1 models; Gen-2 likely paid. Possible watermark on exports; serious projects need a paid plan.
- How do I animate hair, clothing, or secondary motion with mocap?
MetaHumans have physics-enabled cloth (Chaos) and hair (groom). Enable simulations in Unreal, tweak parameters for realism (e.g., coat flaring). For precision, bake in tools like Marvelous Designer.
Conclusion
Cinematic motion capture on a budget is possible with Move One’s AI body capture and MetaHuman Animator’s facial animation in Unreal Engine 5, using just a smartphone and PC. This setup delivers film-quality character performances (running, fighting, emoting) without expensive suits or stages. The workflow includes capturing body motion with Move One, syncing it with MetaHuman facial animation, and refining via IK Retargeters, Control Rig editing, and additions like PixelHair, cameras, and lighting.
Ideal for indie filmmakers, small studios, and freelancers, it bridges the gap with traditional mocap at low/no software cost (Unreal Engine free for linear content, plus tools like Blender, Plask, Cascadeur). Issues like foot sliding or jitter can be corrected for professional results, scalable from previz to final render. Globally, creators are producing shorts, medieval epics to sci-fi, proving budget doesn’t restrict storytelling. Move One and MetaHuman enable realistic digital human animation with minimal gear, making cinematic mocap accessible for unique visions.

Sources and Citation
- CoPilot Co. – Move One allows you to do single-person motion capture using only an app and one iPhone (copilotco.io). This blog confirms the core concept of Move One’s single-camera, suitless approach.
- Puppetier Blog – Move One’s key advantage is portability, allowing mocap in virtually any location, though tracking quality doesn’t match multi-cam or Vicon. Useful for understanding Move One’s strengths and limitations.
- Move AI Pricing – details on free and paid tiers of Move One (e.g., free 30s limit, credit system).
- Epic Games – MetaHuman Animator announcement (capture high-fidelity facial animation using just an iPhone and a PC), highlighting ease of use and quality.
- Move AI EULA – outlines usage rights (Move AI grants a license to use output, with some restrictions on prohibited uses like training AI).
- Epic Forum – MetaHuman commercial use clarification (free to use in Unreal renders for commercial projects).
- Move AI LinkedIn – “No more mocap suits. Move Live eliminates need for suits, extracting natural motion from video in real time” (linkedin.com), showing the company’s stance on suitless capture.
- Move AI Comparison – (Juan Leon) stating Move One excels in portability and convenience… despite lower tracking quality vs. high-end systems.
- Unreal Engine Forums – user discussion on editing mocap with Control Rig in Unreal (“Control Rig is used mostly to do animation cleanup after you baked data onto the Controls”), confirming the post-mocap cleanup workflow in-engine.
- Reddit r/mocap – user feedback indicating current AI mocap tools (Move One, Radical) weren’t accurate enough for certain VFX matching tasks, which sets realistic expectations about needing some manual tweaking for precision.
Each of these sources adds context or specifics to the techniques and considerations in the guide, ensuring the information is up-to-date and grounded in industry practice.