Integration with existing projects

Please watch the video on integrating Instant Dialogues with the Game Animation Sample Project; it gives a good overview of the necessary steps.

Steps required to make an existing character Blueprint compatible with Instant Dialogues:

Add components to character Blueprint:

  • Audio component (standard UE component)
  • ID_Character
  • AC_ID_EyeFocus

Add BPI_ID_Speaker to the Implemented Interfaces section of the character Blueprint.

Implement interface methods:

  • GetVoiceAudioComponent should return the recently added Audio component
  • GetLeaderPoseMesh should return the body mesh that drives the animations
  • GetFaceMesh should return the MetaHuman face mesh
  • GetIdCharacter should return the IDCharacter asset reference stored on the ID_Character component (NOT a duplicate variable)

Best Practice: The ID_Character component is only a holder/helper for the actual Dialogue Character asset. Do not mirror that asset in a separate Blueprint variable and risk divergence—always source the identity from the component when implementing GetIdCharacter in BPI_ID_Speaker.
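The interface contract above can be sketched in plain C++. This is only a schematic illustration, not the plugin's actual C++ API; all type and member names here are hypothetical stand-ins. The key point it demonstrates is that GetIdCharacter forwards to the component's asset reference instead of caching a duplicate variable.

```cpp
#include <string>

// Schematic stand-ins for the real Unreal types (names are illustrative only).
struct AudioComponent { };
struct SkeletalMesh { std::string Name; };
struct IdCharacterAsset { std::string DisplayName; };

// Stand-in for the ID_Character component: the single owner of the asset reference.
struct IdCharacterComponent {
    IdCharacterAsset* DialogueCharacter = nullptr;
};

// Sketch of the BPI_ID_Speaker contract implemented by a character.
struct Speaker {
    AudioComponent Voice;              // the standard Audio component added above
    SkeletalMesh BodyMesh{"Body"};     // drives the animations
    SkeletalMesh FaceMesh{"Face"};     // MetaHuman face mesh
    IdCharacterComponent IdComponent;  // holder of the Dialogue Character asset

    AudioComponent* GetVoiceAudioComponent() { return &Voice; }
    SkeletalMesh*   GetLeaderPoseMesh()      { return &BodyMesh; }
    SkeletalMesh*   GetFaceMesh()            { return &FaceMesh; }
    // Always source the identity from the component, never from a mirrored variable.
    IdCharacterAsset* GetIdCharacter() { return IdComponent.DialogueCharacter; }
};
```

Because GetIdCharacter reads through the component, reassigning the asset on the component is immediately reflected everywhere the interface is queried.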

Adjust Animation Blueprints:

For head movement, the body Animation Blueprint should implement ALI_ID_MetahumanHeadSync. Place two linked anim layers near the Output Pose:

The class implementing both layers should be ABP_ID_MetahumanHeadSync. If it's not available, you have to retarget this ABP to your skeleton (see the video above).

Similarly, the face Animation Blueprint should implement the same interface, but only one layer is needed in the Anim Graph.

The class implementing the layer should be ABP_ID_EyeFocus.

Linking Dialogue Characters to MetaHuman Actors

This section has moved to its own dedicated page for clarity and depth.

➡️ See: Linking Dialogue Characters to MetaHuman Actors

That page covers: identity contract, required components, matching flow, troubleshooting, and best practices.

Disable head movement in the original MetaHuman Face Post Process ABP

Instant Dialogues uses a different head-movement method than the original MetaHuman characters. By moving head movement to the body ABP and adding some extra logic, we keep the face in sync with the body and layer small head movements on top of where the character is looking.
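The layering idea can be illustrated with a minimal additive-blend sketch in plain C++ (all names here are hypothetical, not part of the plugin or the engine): the look-at rotation drives the head, and small animated offsets are added on top with a weight.

```cpp
// Minimal sketch of additive layering: the look-at rotation drives the head,
// and small animated head motion is blended on top with a weight.
struct Rotator { float Pitch; float Yaw; };

Rotator LayerHeadMovement(const Rotator& LookAt,
                          const Rotator& AdditiveHeadAnim,
                          float AdditiveWeight) {
    // Additive blend: base look-at plus the weighted animation delta.
    return { LookAt.Pitch + AdditiveHeadAnim.Pitch * AdditiveWeight,
             LookAt.Yaw   + AdditiveHeadAnim.Yaw   * AdditiveWeight };
}
```

With the weight at zero the character simply holds its look-at target; raising it layers the subtle animated motion back in, which is why the original MetaHuman head movement must be disabled to avoid the two systems fighting.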

To avoid conflicts with the original MetaHuman head movement, locate the Face Post Process ABP (the easiest way is to open any MetaHuman face Skeletal Mesh asset and find Post Process Animation in the Details panel). There, disable head movement by unchecking the boolean variable EnableHeadMovementIK.