Technical Dive

Note

This page explains the internals of how groom bindings work in Unreal Engine: what the solver actually does, what can go wrong, and the reasoning behind the offset mode. It is optional reading. If you just want to get bindings created, head straight to Usage.


How Groom Bindings Work

Grooms are authored against a specific mesh

When the MetaHuman Character tool (or the Auto Assembler) exports grooms for a character, each groom's strand roots are fitted to that character's head shape and world position. The data is baked in: every strand knows exactly which point on that specific mesh it grows from.

If you take grooms exported for character A and try to use them on character B without creating a new binding, the results range from slightly off to completely broken: strands may clip through the skin, float above the surface, or (if the body heights are very different) end up nowhere near the head at all.

A groom binding solves this by retargeting the strand root positions from a source mesh's surface to a target mesh's surface, adjusting for differences in head shape and position.

How strand roots are retargeted

The binding process does not work by measuring 3D distances between meshes. Instead, it relies on two pieces of structural data shared between the source and target:

  • Polygon sections: Unreal divides a skeletal mesh into named sections, each corresponding to a material slot. Groom strand roots are anchored to a specific section index (the face skin section). The binding tool uses this index to know which part of the target mesh to project onto.
  • UV coordinates: each strand root stores a UV position within its section. During binding, the tool finds the matching UV location on the target mesh and places the root there. This is what allows the binding to correctly map a root from one head shape to a different (but topologically equivalent) head shape.
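The two pieces of data above can be sketched in a few lines. This is an illustrative reconstruction of UV-based retargeting, not Unreal's actual solver code: a root is stored as a (section index, UV) pair, we find the UV triangle on the target mesh's matching section that contains that UV, and place the root using barycentric interpolation of the triangle's 3D positions. All names here are hypothetical.

```python
def barycentric(uv, a, b, c):
    """Barycentric coordinates of point uv within the UV triangle (a, b, c)."""
    v0 = (b[0] - a[0], b[1] - a[1])
    v1 = (c[0] - a[0], c[1] - a[1])
    v2 = (uv[0] - a[0], uv[1] - a[1])
    d00 = v0[0] * v0[0] + v0[1] * v0[1]
    d01 = v0[0] * v1[0] + v0[1] * v1[1]
    d11 = v1[0] * v1[0] + v1[1] * v1[1]
    d20 = v2[0] * v0[0] + v2[1] * v0[1]
    d21 = v2[0] * v1[0] + v2[1] * v1[1]
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    return (1.0 - v - w, v, w)

def retarget_root(root, target_sections):
    """root = (section_index, (u, v)).
    target_sections[i] is a list of triangles for section i, each as
    ((p0, p1, p2), (uv0, uv1, uv2)): 3D vertex positions plus their UVs."""
    section_index, uv = root
    for positions, uvs in target_sections[section_index]:
        u, v, w = barycentric(uv, *uvs)
        if u >= -1e-6 and v >= -1e-6 and w >= -1e-6:  # uv lies inside this triangle
            p0, p1, p2 = positions
            return tuple(u * p0[i] + v * p1[i] + w * p2[i] for i in range(3))
    return None  # no triangle contains this UV: section/UV layout mismatch
```

Note that the 3D position of the source mesh never enters the lookup; only the section index and UV do. That is why differing head shapes are fine while differing topology is not.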

Topology requirements

Because the binding relies on section indices and UV layout, the source and target meshes must share the same topology:

  • Same number of polygons (or very close)
  • Same polygon section order (face skin must be the same section index on both meshes)
  • Same UV layout

The head shapes and positions can differ freely; only the topology needs to match. This is exactly the scenario the tool is designed for: an optimized face mesh built from the same base MetaHuman will always have the same topology as any other optimized face mesh built from the same base, even if the characters look completely different.

Binding between meshes with different polygon counts is generally unreliable. The one practical exception is retargeting between two LODs of the same character, where the topology diverges gradually rather than abruptly. Even then, projection quality degrades at lower LODs; the result is acceptable for background characters but not for close-up facial capture work.
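The three requirements above lend themselves to a pre-flight check before attempting a binding. The sketch below is illustrative only; the field names and the hashing of the UV layout are assumptions, not part of any Unreal API:

```python
def check_binding_compatibility(source, target, max_poly_ratio=1.05):
    """source/target are dicts with 'triangle_count' (int), 'section_order'
    (list of section names in slot order), and 'uv_hash' (any digest of the
    UV layout). Returns a list of problems; an empty list means compatible."""
    problems = []
    counts = (source["triangle_count"], target["triangle_count"])
    ratio = max(counts) / max(1, min(counts))
    if ratio > max_poly_ratio:  # "same number of polygons (or very close)"
        problems.append("polygon counts differ too much")
    if source["section_order"] != target["section_order"]:
        problems.append("polygon section order differs")
    if source["uv_hash"] != target["uv_hash"]:
        problems.append("UV layout differs")
    return problems
```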

Why section index matters for Face Optimizer meshes

In the native MetaHuman pipeline, the face skin geometry can land at any material slot depending on the pipeline type, anywhere from slot 0 to slot 7. The Face Optimizer normalizes this: the face skin is always placed at material slot 0, regardless of which pipeline produced the source character. This means the default Face Material Slot setting of 0 works correctly for all Face Optimizer meshes without any manual adjustment.
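The normalization described above amounts to a reordering of the section list. This is a hypothetical sketch of the idea, not Face Optimizer code; the section name is an assumption:

```python
def normalize_face_slot(sections, face_section_name="face_skin"):
    """Reorder a list of section names so the face skin section sits at
    slot 0, preserving the relative order of the remaining sections."""
    if face_section_name not in sections:
        raise ValueError("face skin section not found")
    others = [s for s in sections if s != face_section_name]
    return [face_section_name] + others
```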


Why the Auto Offset Mode Exists

This section is relevant if you are using the ARKit 52 Blender workflow and re-rigging your optimized face mesh to a new body skeleton.

The problem: two compounding mismatches

When you re-rig a face mesh to a different body skeleton in Blender, you typically translate the head to align it with the new rig's neck joint. The result is an optimized face mesh that lives at a different world position than the one the MetaHuman tool used when it originally generated the grooms.

The obvious fix seems to be: use the original SKM_FaceMesh (the full-resolution MetaHuman face exported by the Character tool) as the source mesh for the binding. But this runs into two problems at once:

Polygon count and topology mismatch. If you use the Face Optimizer, or the ARKit 52 Blender workflow at LOD 1 or lower, your optimized mesh will have fewer polygons than the original SKM_FaceMesh at LOD 0. As described above, the binding solver relies on matching topology, so with a significantly different polygon count the UV mapping diverges, and the solver produces garbage results or fails entirely.

Section index mismatch. Even setting aside polygon count, the native SKM_FaceMesh does not guarantee that its polygon sections are ordered the same way as the Face Optimizer output. The face skin material slot can vary depending on which pipeline produced the source character. When the section index the groom is anchored to does not match the section index the solver is looking at on the target, the projection is applied to the wrong geometry entirely.

Either problem alone would make the original face mesh a poor source. Combined with each other and with the position offset, they leave the solver no reliable foundation, and the binding will be broken.

How the offset approach solves this

Instead of reaching for an external mesh, the tool generates a source mesh directly from your optimized face mesh. It duplicates the target mesh, applies the inverse of the offset you specify (moving it back to the position where the grooms were originally authored), and uses that repositioned copy as the projection source.

Because the source is derived from the same mesh as the target, topology matches exactly: same polygon count, same section order, same UV layout. The only difference is position, which is precisely what the binding solver is designed to handle. The result is a correct binding even though the final mesh lives somewhere entirely different in world space.
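In terms of vertex data, the generated source is just the target translated back by the offset. A minimal sketch, assuming the offset is the translation you applied during re-rigging (function and parameter names are hypothetical):

```python
def make_binding_source(target_vertices, offset):
    """Duplicate the target mesh's vertices and apply the inverse of the
    re-rig offset, producing a copy positioned where the grooms were
    originally authored. target_vertices: list of (x, y, z) tuples;
    offset: the (dx, dy, dz) translation applied during re-rigging."""
    dx, dy, dz = offset
    return [(x - dx, y - dy, z - dz) for (x, y, z) in target_vertices]
```

Because only positions change, section order, polygon count, and UVs carry over from the target untouched, which is what guarantees the topology match.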

© 2026 Accent Game Dev. All rights reserved.