Currently, when using controllers (instead of hand tracking) with the CharacterRetargeter component in the Movement SDK, the avatar's finger movements for gripping appear binary (0 or 1, fully open or fully closed). This results in a mechanical feel for the hands.
The request is to implement the functionality seen in the Interaction SDK's Synthetic Hands or Hand Visual prefabs, where the analog input from the controller triggers (e.g., index trigger, grip trigger) smoothly translates into the corresponding finger articulation on the avatar.
This feature would greatly enhance the fidelity and immersion for users utilizing controllers with the Movement SDK.
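For illustration, here is a minimal sketch of the desired behavior, assuming Unity with the Oculus integration's `OVRInput` API. The joint references, the `closedEuler` pose, and the `Curl` helper are all hypothetical; a real implementation would presumably feed these values into the CharacterRetargeter's hand pose rather than rotating joints directly:

```csharp
using UnityEngine;

// Sketch only (not Movement SDK API): read the analog trigger values each
// frame and blend finger joints between an open pose and a closed pose.
public class AnalogFingerCurl : MonoBehaviour
{
    [SerializeField] private Transform[] indexJoints; // hypothetical index-finger joint refs
    [SerializeField] private Transform[] gripJoints;  // hypothetical middle/ring/pinky joint refs
    [SerializeField] private Vector3 closedEuler = new Vector3(70f, 0f, 0f); // assumed curl axis/angle

    void Update()
    {
        // Analog values in [0, 1] from OVRInput (right controller shown).
        float index = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.RTouch);
        float grip  = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger,  OVRInput.Controller.RTouch);

        Curl(indexJoints, index);
        Curl(gripJoints, grip);
    }

    private void Curl(Transform[] joints, float t)
    {
        foreach (var joint in joints)
        {
            // Interpolate each joint from open (identity) toward the closed pose,
            // so the fingers track the trigger instead of snapping open/closed.
            joint.localRotation = Quaternion.Slerp(
                Quaternion.identity, Quaternion.Euler(closedEuler), t);
        }
    }
}
```

This is essentially what the Interaction SDK's controller-driven hands already do, just applied to the retargeted avatar hands.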
Thanks!