Hi all!
I'm using the Leap Motion in a Unity project.
What I'm trying to achieve is to apply a coordinate transformation automatically, depending on the position of the sensor in my scene. That is: if the sensor is flipped upside down (along its height axis, not its length axis) and I also apply that rotation in my scene, I should get the same information from the sensor. Or: if I move the sensor 15 cm to the left and also translate the GameObject representing the sensor, I should get the same effect. The project then won't have any dependency on the physical sensor position.
What I've got so far: when trying the same with a RealSense camera, I applied Unity's transform.TransformPoint(Vector3 sensorInput) function (this worked well) and a self-written TransformRotation function (that didn't work as well, but that's because the RealSense rotations are messed up). When targeting the Leap Motion, I noticed that this seems to be done automatically when using the .ToVector3() method on the sensor data. Can you confirm that?
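For reference, here's roughly how I applied the transform in the RealSense case. This is just a sketch with my own names (`sensorAnchor` is a hypothetical GameObject I place and rotate in the scene to mirror the physical sensor's pose):

```csharp
using UnityEngine;

public class SensorTransformer : MonoBehaviour
{
    // GameObject positioned/rotated in the scene to match the physical sensor's pose
    public Transform sensorAnchor;

    // Map a point from sensor-local space into world space
    public Vector3 ToWorldPoint(Vector3 sensorPoint)
    {
        return sensorAnchor.TransformPoint(sensorPoint);
    }

    // Map a rotation from sensor-local space into world space
    // (this is the part that didn't behave well with the RealSense data)
    public Quaternion ToWorldRotation(Quaternion sensorRotation)
    {
        return sensorAnchor.rotation * sensorRotation;
    }
}
```
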
But: this only works for rotations along the depth axis. I can move the sensor 15 cm to the left and 30 cm up, then flip it clockwise along its depth axis, and my sensor data is still correct as long as I do the same in Unity. When I flip the sensor along its height axis instead (i.e. USB port to the right), this doesn't work. I found the "orientate tracking automatically" flag in the config window (it may be named slightly differently, I'm using a localized version) and disabled it because I thought it might interfere here. With it disabled, I don't get any hands at all when using the sensor upside down.
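To make the setup concrete: for the working case I mirror the physical flip on the anchor object in the scene, along these lines (again a sketch, `sensorAnchor` being my own name for the GameObject tracking the sensor pose):

```csharp
// Mirror the physical "flip clockwise along the depth axis" in the scene:
// rotate the anchor 180° about its local forward (depth) axis.
// This is the case that works for me; the equivalent flip about the
// height (up) axis is the one that breaks.
sensorAnchor.Rotate(Vector3.forward, 180f, Space.Self);
```
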
Now my question: can you confirm that either there is a bug when applying the sensor's transform to the data, or that it depends on the "auto orientation" flag and using the sensor upside down just doesn't work at all? Or am I simply doing something plain wrong?
Thanks