MetaHuman Animator, Epic Games’ new tool for delivering high-quality facial animation in minutes, is now available. Jump straight in and explore, or learn the ropes, talk to other MetaHuman users, and read the documentation on Epic Games’ new MetaHuman hub on the Epic Developer Community.
Since 2021, when MetaHuman Creator was first announced in early access, the makers of Unreal Engine have continued to expand the tools available to creatives across different platforms. Last March, we revealed here at PVC that a revolution was about to happen, as Epic Games announced the next step for MetaHumans. As we wrote then, “everything changes next Summer, when MetaHuman Animator allows to capture with fidelity an actor’s performance, with just an iPhone or stereo helmet-mounted camera.”
Well, Epic Games just announced that MetaHuman Animator is here. MetaHuman Animator is a new feature set that enables you to capture an actor’s performance using an iPhone or stereo head-mounted camera system (HMC) and apply it as high-fidelity facial animation on any MetaHuman character, without the need for manual intervention.
Epic Games claims that “every subtle expression, look, and emotion is accurately captured and faithfully replicated on your digital human. Even better, it’s simple and straightforward to achieve incredible results—anyone can do it.”
The company says that “if you’re new to performance capture, MetaHuman Animator is a convenient way to bring facial animation to your MetaHumans based on real-world performances. And if you already do performance capture, this new feature set will significantly improve your existing capture workflow, reduce time and effort, and give you more creative control. Just pair MetaHuman Animator with your existing vertical stereo head-mounted camera to achieve even greater visual fidelity.”
Just point the camera and press record
What would previously take a team of experts months to faithfully recreate (every nuance of the actor’s performance on a digital character) can now be done in a fraction of the time and with far less effort, thanks to MetaHuman Animator, which offers high-fidelity facial animation the easy way!
The new feature set uses a 4D solver to combine video and depth data with a MetaHuman representation of the performer. The animation is produced locally using GPU hardware, and the final animation is available in minutes. Epic Games says that it “all happens under the hood, though—for you, it’s pretty much a case of pointing the camera at the actor and pressing record. Once captured, MetaHuman Animator accurately reproduces the individuality and nuance of the actor’s performance onto any MetaHuman character.”
What’s more, the animation data is semantically correct, using the appropriate rig controls, and temporally consistent, with smooth control transitions, so it’s easy to make artistic adjustments to the animation.
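To make that concrete, here is a minimal sketch, in plain Python rather than Epic’s actual (unpublished) API, of what animation expressed as named rig-control curves can look like, and why that representation is easy to adjust. Every name and value here is hypothetical.

```python
# Toy illustration only, not Epic's code: a performance stored as curves on
# semantically named rig controls, one value per frame.
from dataclasses import dataclass, field


@dataclass
class ControlCurve:
    name: str                                    # e.g. "mouth_corner_pull_l"
    values: list = field(default_factory=list)   # one value per frame, 0..1

    def smooth(self, window: int = 3) -> None:
        """Moving average: a naive stand-in for the solver's temporal consistency."""
        half = window // 2
        old = self.values
        self.values = [
            sum(old[max(0, i - half):i + half + 1])
            / len(old[max(0, i - half):i + half + 1])
            for i in range(len(old))
        ]

    def scale(self, factor: float) -> None:
        """An artistic tweak: exaggerate or soften the captured expression."""
        self.values = [min(1.0, max(0.0, v * factor)) for v in self.values]


# A captured smile: smoothed (the real solver output is already smooth),
# then exaggerated by 20% without re-recording the actor.
smile = ControlCurve("mouth_corner_pull_l", [0.0, 0.2, 0.8, 0.75, 0.1])
smile.smooth()
smile.scale(1.2)
print(smile.values)
```

Because each curve drives a meaningful control rather than raw vertex positions, an animator can dial a single expression up or down without disturbing the rest of the performance.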
Epic Games has shared a short film, Blue Dot, created by Epic Games’ 3Lateral team in collaboration with local Serbian artists, including renowned actor Radivoje Bukvić, who delivers a monologue based on a poem by Mika Antić. The performance was filmed at Take One studio’s mocap stage with cinematographer Ivan Šijak acting as director of photography.
These nuanced results demonstrate the level of fidelity that artists and filmmakers can expect when using MetaHuman Animator with a stereo head-mounted camera system and traditional filmmaking techniques. The team was able to achieve this impressive level of animation quality with minimal interventions on top of MetaHuman Animator results.
Adapted for creative iteration on set
Epic Games says that “the facial animation you capture using MetaHuman Animator can be applied to any MetaHuman character or any character adopting the new MetaHuman facial description standard in just a few clicks.” What this means is that you can design your character the way you want, safe in the knowledge that the facial animation applied to it will work.
As Epic Games says, it all happens under the hood. But if you want to understand a bit more about the process, the company explains that it is “achievable because Mesh to MetaHuman can now create a MetaHuman Identity from just three frames of video, along with depth data captured using your iPhone or reconstructed using data from your vertical stereo head-mounted camera. This personalizes the solver to the actor, enabling MetaHuman Animator to produce animation that works on any MetaHuman character. It can even use the audio to produce convincing tongue animation.”
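As a rough mental model only (Epic hasn’t published how the identity solve actually works), you can picture the step like this: a few video frames with matching depth are fused into a personalized 3D point set for the actor. The averaging below is a deliberately naive stand-in for the real model fit, and every name is hypothetical.

```python
# Toy stand-in for "three frames of video + depth -> a MetaHuman Identity".
# In reality Epic fits a full facial model; here we just lift 2D landmarks
# into 3D using the depth channel and fuse the three frames by averaging.
from dataclasses import dataclass
from statistics import mean


@dataclass
class CaptureFrame:
    landmarks: list   # 2D facial landmarks from video: [(x, y), ...]
    depth: list       # one depth value per landmark, from iPhone or stereo HMC


def build_identity(frames):
    """Fuse exactly three frames into one 3D point per landmark."""
    assert len(frames) == 3, "the identity step needs just three frames"
    count = len(frames[0].landmarks)
    return [
        (
            mean(f.landmarks[i][0] for f in frames),
            mean(f.landmarks[i][1] for f in frames),
            mean(f.depth[i] for f in frames),
        )
        for i in range(count)
    ]


# Normally the three frames would be distinct poses, e.g. neutral plus two
# expressions; a single repeated frame keeps this sketch self-contained.
frame = CaptureFrame([(0.0, 0.0), (1.0, 0.5)], [10.0, 11.0])
identity = build_identity([frame, frame, frame])
```

It is this personalized identity that lets the solver express the performance as rig-control values, which is why the result transfers cleanly to any MetaHuman character.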
The goal with MetaHuman Animator is clear: Epic Games wants to take facial performance capture from something only experts with high-end capture systems can achieve, and turn it into something for all creators. At its simplest, MetaHuman Animator can be used with just an iPhone (12 or above) and a desktop PC. That’s possible because, Epic Games says, “we’ve updated the Live Link Face iOS app to capture raw video and depth data, which is then ingested directly from the device into Unreal Engine for processing.”
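The article doesn’t detail the capture format, but a take ingested this way can be imagined as a small bundle of synchronized assets. The sketch below is purely illustrative; the field names are assumptions, not the Live Link Face format.

```python
# Hypothetical sketch of a captured take awaiting ingest into Unreal Engine.
# None of these fields describe Epic's actual on-disk format.
from dataclasses import dataclass


@dataclass
class Take:
    slate: str          # take name, e.g. "scene01_take03"
    fps: float          # capture frame rate
    video_path: str     # raw video recorded by the Live Link Face app
    depth_path: str     # per-frame depth recorded alongside the video


take = Take("scene01_take03", 60.0, "raw_video.mov", "raw_depth.bin")
```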
MetaHuman Animator is perfectly adapted for creative iteration on set because it enables you to process and transfer facial animation onto any MetaHuman character, fast. It’s another tool in the virtual production toolset for all creators.
Follow the link to learn more about MetaHuman Animator, and dive into the documentation for all aspects of the MetaHuman framework on Epic Games’ new MetaHuman hub. Located on the Epic Developer Community—the one-stop shop to learn about Epic tools and exchange information with others—it also hosts forums where you can showcase your work or ask questions, and a tutorial section with content from both Epic and the community.