

The MetaHuman Plugin for Unreal Engine's first major feature, Mesh to MetaHuman, enables artists to take their own custom facial mesh and convert it into a MetaHuman, fully rigged and ready to animate.

Starting with a textured mesh created using scanning, sculpting, or traditional modeling tools, Mesh to MetaHuman uses automated landmark tracking in UE5 to fit the MetaHuman topology template to it, combining it with a body type of the artist's choice from the MetaHuman options. This template is then submitted to the cloud, where it is matched to a best-fit MetaHuman derived from the database. This mesh is then used to drive the facial rig, while the deltas from the original mesh come along for the ride. Artists can then further refine their character in MetaHuman Creator.
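To make the order of those stages easier to follow, here is a minimal Python sketch of the pipeline described above. It is purely illustrative: the function names (track_landmarks, fit_template, submit_to_cloud, apply_deltas) and data structures are hypothetical stand-ins, not part of the MetaHuman Plugin, which runs these steps inside the Unreal Editor rather than through a code API.

# Illustrative outline of the Mesh to MetaHuman stages described above.
# All names are hypothetical placeholders; the real plugin performs these
# steps automatically in the Unreal Editor and in the cloud.
from dataclasses import dataclass

@dataclass
class Mesh:
    """A textured facial mesh from scanning, sculpting, or modeling."""
    vertices: list
    texture: str

@dataclass
class FittedTemplate:
    """MetaHuman topology template fitted to the custom mesh."""
    mesh: Mesh
    landmarks: dict
    body_type: str

def track_landmarks(mesh: Mesh) -> dict:
    # Automated landmark tracking locates key facial features on the
    # custom mesh. Placeholder result for illustration only.
    return {"left_eye": (0, 0, 0), "right_eye": (0, 0, 0), "mouth": (0, 0, 0)}

def fit_template(mesh: Mesh, landmarks: dict, body_type: str) -> FittedTemplate:
    # Conform the standard MetaHuman topology to the tracked landmarks
    # and attach the artist's chosen body type.
    return FittedTemplate(mesh=mesh, landmarks=landmarks, body_type=body_type)

def submit_to_cloud(template: FittedTemplate) -> str:
    # The fitted template is matched server-side against the MetaHuman
    # database to find the best-fit rigged MetaHuman. Placeholder ID.
    return "best_fit_metahuman"

def apply_deltas(best_fit_id: str, template: FittedTemplate) -> str:
    # The best-fit MetaHuman drives the facial rig, while deltas from the
    # original mesh preserve the custom likeness.
    return f"{best_fit_id}+custom_deltas"

if __name__ == "__main__":
    custom_mesh = Mesh(vertices=[], texture="head_albedo.png")
    landmarks = track_landmarks(custom_mesh)
    fitted = fit_template(custom_mesh, landmarks, body_type="medium")
    best_fit = submit_to_cloud(fitted)
    rigged = apply_deltas(best_fit, fitted)
    print(f"Rigged MetaHuman ready for download or editing: {rigged}")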

While some elements such as hair and skin textures will need to be reauthored or reapplied, either in MetaHuman Creator or within a tool of choice, Mesh to MetaHuman means artists can get a starting point for a fully rigged digital human from their unique static mesh in minutes. Once they have transformed their static mesh into a MetaHuman, artists can immediately download it or open it in MetaHuman Creator, where they can play through the animations to see it instantly come to life, and continue to tweak and enhance it. View the tutorial to find out more about how it works.
