Live Link Face


Free

Capture real-time facial performances on iPhone and stream them directly into Unreal Engine. Free app by Epic Games supporting MetaHuman Animator and ARKit.

About

Live Link Face is a free iOS app developed by Epic Games' Unreal Engine team that transforms an iPhone or iPad into a powerful facial performance capture tool. Leveraging Apple's ARKit and the TrueDepth front camera, it records facial expressions, blendshape data, and depth information to drive 3D characters inside Unreal Engine.

The app supports two primary workflows. For real-time animation, it streams ARKit blendshape data live over a network to an Unreal Engine instance using the MetaHuman Live Link Plugin, enabling immediate visualization of facial performances on MetaHuman or custom characters. For high-fidelity offline work, it integrates with MetaHuman Animator, capturing raw video and depth data that Unreal Engine's AI processing pipeline then converts into premium facial animation applied to any MetaHuman in just a few clicks.

Live Link Face also includes professional production features such as multi-device timecode synchronization (system clock, NTP, or Tentacle Sync hardware), remote OSC control for triggering recordings without interrupting performers, slate/take management, and on-device review of captured footage. Rest pose calibration allows fine-tuning capture to individual performers for improved animation quality.

Ideal for indie animators, game developers, film and VFX studios, and virtual production teams, Live Link Face democratizes professional-grade facial capture by removing the need for expensive dedicated hardware. It requires an iPhone 12 or later and a Windows 10/11 PC running Unreal Engine.

Key Features

  • Real-Time Facial Streaming: Streams ARKit blendshape data live over a network into Unreal Engine, driving MetaHuman or custom 3D character facial rigs instantly.
  • MetaHuman Animator Integration: Captures raw video and depth data for offline processing by MetaHuman Animator, producing high-fidelity facial animation applied to any MetaHuman with a few clicks.
  • Timecode & Multi-Device Sync: Supports system clock, NTP servers, or Tentacle Sync hardware for frame-accurate synchronization across multiple devices on set.
  • Remote OSC Control: Allows remote triggering of recordings, slate naming, and take numbering via OSC or the MetaHuman Plugin so actors can stay focused on their performance (see the sketch after this list).
  • Rest Pose Calibration: Enables per-performer tuning of capture data to improve facial animation quality and correct neutral-pose tracking drift.
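
The OSC remote control is simple enough to script. As a minimal sketch, the snippet below uses the python-osc package to start and stop a take. The `/RecordStart` and `/RecordStop` addresses and their arguments follow Epic's published Live Link Face OSC API, but the phone's IP address and listen port are placeholders you would read from the app's settings screen, and the addresses are worth verifying against the documentation for your app version.

```python
# Sketch: remote-trigger a Live Link Face recording over OSC.
# Requires python-osc (pip install python-osc) and the OSC server
# enabled in the app's settings. IP and port below are placeholders.
from pythonosc.udp_client import SimpleUDPClient

PHONE_IP = "192.168.1.50"  # shown in the app's settings (assumption)
OSC_PORT = 8000            # the app's OSC listen port (assumption)

client = SimpleUDPClient(PHONE_IP, OSC_PORT)

# Start a take: Epic's OSC API documents /RecordStart taking a slate
# name and a take number.
client.send_message("/RecordStart", ["scene01_closeup", 3])

# ... performance happens ...

client.send_message("/RecordStop", [])  # stop and save the take
```

The same messages can come from a hardware OSC controller or from the MetaHuman Plugin's capture tools, which is what lets an operator roll takes without touching the phone.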

Use Cases

  • Indie game developers animating MetaHuman characters for Unreal Engine projects without costly dedicated mocap hardware.
  • Virtual production studios capturing actor facial performances on set and streaming them live to a real-time rendered digital human.
  • Animated film and VFX teams recording high-fidelity facial takes for post-production processing with MetaHuman Animator.
  • Educators and students learning character animation and real-time rendering pipelines using accessible iPhone hardware.
  • Content creators and streamers animating digital avatars in real time for live broadcasts or pre-recorded video content.

Pros

  • Completely Free: The app is free with no subscription required, making professional facial capture accessible to indie developers and small studios.
  • Deep Unreal Engine Integration: Seamlessly connects to Unreal Engine via Live Link and the MetaHuman Plugin, supporting both real-time and high-fidelity offline animation workflows.
  • Professional Production Features: Timecode sync, OSC remote control, slate/take management, and AirDrop sharing rival dedicated hardware capture solutions.

Cons

  • Requires iPhone 12 or Later: The TrueDepth camera and A14 chip requirements limit use to relatively recent iPhone hardware, excluding older devices.
  • Windows PC Required: Both the real-time and MetaHuman Animator workflows require a Windows 10/11 desktop running Unreal Engine; these workflows are not supported in Unreal Engine on macOS.
  • Eyebrow and Subtle Expression Tracking Limitations: User reviews note that fine asymmetric expressions (e.g., independent eyebrow movement) and neutral-pose calibration can produce inaccurate results.

Frequently Asked Questions

Is Live Link Face free to use?

Yes, Live Link Face is completely free to download and use on the App Store. There are no in-app purchases or subscription fees, though you will need Unreal Engine (also free up to revenue thresholds) on a Windows PC.

What iPhone model do I need?

Live Link Face requires an iPhone 12 or later (or a compatible iPad with TrueDepth camera) to enable depth-based facial capture. Older models are not supported.

What is the difference between real-time streaming and MetaHuman Animator?

Real-time streaming sends live ARKit blendshape data to Unreal Engine over a network for immediate character animation. MetaHuman Animator captures raw video and depth data for later processing by Unreal Engine's AI pipeline, producing higher-quality animations.

Can I use Live Link Face with non-MetaHuman characters?

Yes. The app can stream raw ARKit animation data to any Unreal Engine character via the Live Link protocol, not just MetaHumans, though MetaHuman integration offers the most streamlined pipeline.
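
Conceptually, that retargeting is a rename of incoming curve values. The sketch below illustrates the idea in plain Python, assuming one frame of streamed ARKit curves arrives as a name-to-weight dictionary: the names on the left are Apple's documented ARKit blendshape locations, while the custom rig's morph target names on the right are hypothetical.

```python
# Conceptual sketch of the retargeting step performed inside Unreal
# Engine: incoming ARKit curve names are mapped to whatever morph
# targets your custom character actually has. ARKit names are real;
# the custom rig names on the right are hypothetical examples.
ARKIT_TO_CUSTOM_RIG = {
    "jawOpen":        "Mouth_Open",
    "eyeBlinkLeft":   "Blink_L",
    "eyeBlinkRight":  "Blink_R",
    "browInnerUp":    "Brows_Raise_Inner",
    "mouthSmileLeft": "Smile_L",
}

def retarget(frame: dict[str, float]) -> dict[str, float]:
    """Rename one frame of ARKit curve values (0.0..1.0) to the custom
    rig's curve names, dropping curves the rig has no target for."""
    return {ARKIT_TO_CUSTOM_RIG[name]: value
            for name, value in frame.items()
            if name in ARKIT_TO_CUSTOM_RIG}

# Example: one streamed frame of ARKit data.
print(retarget({"jawOpen": 0.42, "eyeBlinkLeft": 0.9, "cheekPuff": 0.1}))
# -> {'Mouth_Open': 0.42, 'Blink_L': 0.9}
```

Inside Unreal Engine this mapping is normally configured with a Live Link Remap Asset rather than external code; the sketch only shows what that asset does to each frame.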

How do I synchronize Live Link Face with other cameras on set?

The app supports timecode sync via the system clock, an NTP server, or a Tentacle Sync hardware device. Video reference footage is frame-accurate with embedded timecode for use in editorial.
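
When using NTP, it can be worth confirming that the capture machine agrees with the same server the phone is pointed at. A minimal sketch using the ntplib package, with an example server hostname:

```python
# Sketch: check this machine's clock offset against the NTP server
# that Live Link Face is configured to use, before rolling takes.
# Requires ntplib (pip install ntplib); the hostname is an example.
import ntplib

NTP_SERVER = "pool.ntp.org"  # use the same server configured in the app

response = ntplib.NTPClient().request(NTP_SERVER, version=3)
print(f"Clock offset vs {NTP_SERVER}: {response.offset * 1000:.1f} ms")
```

An offset larger than one frame (about 33 ms at 30 fps) suggests the clocks need attention before relying on the embedded timecode in editorial.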
