Like many others before me, I am trying to mirror hand movements: when a left hand is detected, a right hand appears that is driven by the left hand's movement but mirrored across the Z axis of the world.
I have managed to achieve this visually by reflecting one hand model (for example, a left hand model) at the level of the palm of another model (a normal right hand model) using Vector3.Reflect from the Unity API.
It should be noted that the mirrored hand model (the left hand model) has been modified in the Rigged Hand script: I set its handedness to right so that it would be detected as a right hand, and I inverted the palm-facing and finger-pointing directions so that the hand model was flipped 180 degrees.
Then, in my C# script, I reflected the position at the level of the right hand model's palm so that the mirrored model was no longer attached at the palm, and I rotated the left hand model 180 degrees in world space so that it had the correct orientation.
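For reference, the reflect-then-rotate step looks roughly like this. This is only a sketch, not my exact script: the class name HandMirror and the fields sourcePalm, mirrorPlaneNormal, and mirrorPlanePoint are placeholders I am using here for illustration.

```csharp
using UnityEngine;

// Hypothetical sketch of the mirroring step described above.
// sourcePalm is the tracked palm transform driving the mirror;
// mirrorPlaneNormal is the normal of the mirror plane
// (e.g. Vector3.forward to mirror across the world Z axis).
public class HandMirror : MonoBehaviour
{
    public Transform sourcePalm;                        // tracked palm of the real hand
    public Vector3 mirrorPlaneNormal = Vector3.forward; // assumed mirror plane normal
    public Vector3 mirrorPlanePoint = Vector3.zero;     // a point on the mirror plane

    void LateUpdate()
    {
        // Reflect the palm position across the mirror plane.
        Vector3 offset = sourcePalm.position - mirrorPlanePoint;
        Vector3 reflected = Vector3.Reflect(offset, mirrorPlaneNormal);
        transform.position = mirrorPlanePoint + reflected;

        // Rotate 180 degrees in world space so the mirrored hand
        // ends up with the correct orientation.
        transform.rotation =
            Quaternion.AngleAxis(180f, mirrorPlaneNormal) * sourcePalm.rotation;
    }
}
```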
However, my current dilemma is that I want the mirrored hand to be able to interact with objects that have interaction behaviour, which requires the Interaction Engine. At the moment the interaction hands are a separate GameObject from the graphical hand models I have been working with, and I have found no simple way to mirror the physical interaction hands, because their contact bones are created at runtime.
Has anyone had success on this front, or can anyone offer any help? I am at a crossroads.