Does anyone know how to get joint angles and then reproduce the hand's joint positions from them?
My current Position to Angle solution is:
1. Get the basis vectors for the hand, and each bone in the hand in x,y,z
2. Generate rotation matrices from basis vectors
3. For each bone pair, multiply the rotation matrix of bone i by the transpose of the rotation matrix of bone i-1. This, I believe, performs a change of basis and yields the rotation matrix from one bone to the next.
4. I then use this derived matrix to generate pitch, yaw and roll angles.
As suggested by this paper.
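To make the steps concrete, here is a minimal numpy sketch of steps 1-4. The function names and the ZYX (yaw-pitch-roll) Euler convention are my own assumptions, not taken from the paper:

```python
import numpy as np

def rotation_from_basis(x_basis, y_basis, z_basis):
    # Steps 1-2: columns are the bone's basis vectors in world coordinates.
    return np.column_stack([x_basis, y_basis, z_basis])

def relative_rotation(R_prev, R_curr):
    # Step 3: R_i multiplied by the transpose of R_{i-1}, giving the
    # rotation from bone i-1 to bone i.
    return R_curr @ R_prev.T

def euler_zyx(R):
    # Step 4: yaw/pitch/roll assuming R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
    yaw = np.arctan2(R[1, 0], R[0, 0])
    pitch = np.arcsin(np.clip(-R[2, 0], -1.0, 1.0))
    roll = np.arctan2(R[2, 1], R[2, 2])
    return yaw, pitch, roll
```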
I believe this is a suitable direction, as I noticed a mention of creating quaternions from a rotation matrix in the Python docs for Bone, but I could not find an implementation. Did I miss it?
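Since I could not find a built-in conversion, here is the standard trace-based rotation-matrix-to-quaternion routine I am aware of, for reference (my own sketch, not part of the Leap API):

```python
import numpy as np

def quaternion_from_matrix(R):
    # Trace-based conversion; assumes R is a proper rotation matrix.
    # Returns (w, x, y, z), branching on the largest term for stability.
    t = np.trace(R)
    if t > 0:
        s = 2.0 * np.sqrt(1.0 + t)
        w = 0.25 * s
        x = (R[2, 1] - R[1, 2]) / s
        y = (R[0, 2] - R[2, 0]) / s
        z = (R[1, 0] - R[0, 1]) / s
    elif R[0, 0] >= R[1, 1] and R[0, 0] >= R[2, 2]:
        s = 2.0 * np.sqrt(1.0 + R[0, 0] - R[1, 1] - R[2, 2])
        w = (R[2, 1] - R[1, 2]) / s
        x = 0.25 * s
        y = (R[0, 1] + R[1, 0]) / s
        z = (R[0, 2] + R[2, 0]) / s
    elif R[1, 1] >= R[2, 2]:
        s = 2.0 * np.sqrt(1.0 + R[1, 1] - R[0, 0] - R[2, 2])
        w = (R[0, 2] - R[2, 0]) / s
        x = (R[0, 1] + R[1, 0]) / s
        y = 0.25 * s
        z = (R[1, 2] + R[2, 1]) / s
    else:
        s = 2.0 * np.sqrt(1.0 + R[2, 2] - R[0, 0] - R[1, 1])
        w = (R[1, 0] - R[0, 1]) / s
        x = (R[0, 2] + R[2, 0]) / s
        y = (R[1, 2] + R[2, 1]) / s
        z = 0.25 * s
    return np.array([w, x, y, z])
```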
From inspection, the derived angles look plausible; however, when I try to convert them back to positions, I do not get the hand I began with, just a large mess.
(Leap data plotted on left, failed attempted reconstruction is on the right)
The angles generated seem correct, giving 70 degrees for the metacarpal joint angle; however, as shown, I am struggling to convert them, together with the bone lengths, back into positions.
If anyone is interested in the code for both transformations, here is a stationary example that includes measured positions and angles. A live version is here.
The code for the above-mentioned paper is here too; however, I believe they do not convert from angles and lengths back to positions.
My solution for Euler angles (radians) to positions is:

import numpy as np

Rs = [...]  # rotation matrices for a single finger
origin = np.array([0.0, 0.0, 0.0])
joints = []  # will hold the xyz coordinates of each joint on a single finger
for R in Rs:
    bone = R.dot([0, 10, 0])   # assume each bone has length 10
    origin = origin + bone     # walk along the bone to the next joint
    joints.append(origin.copy())
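For comparison, here is a sketch of what I understand the reconstruction would need to look like if the Rs are relative bone-to-bone rotations, which would then have to be composed along the chain rather than applied independently (function name, bone direction and lengths are my own assumptions):

```python
import numpy as np

def finger_joints(Rs, lengths):
    # Rs: relative rotations, bone i expressed in the frame of bone i-1.
    # lengths: one bone length per rotation.
    R_world = np.eye(3)       # accumulated orientation along the chain
    origin = np.zeros(3)      # current joint position
    joints = [origin.copy()]
    for R, length in zip(Rs, lengths):
        R_world = R_world @ R                            # compose relative rotation
        origin = origin + R_world @ np.array([0.0, length, 0.0])
        joints.append(origin.copy())
    return np.array(joints)
```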
Initially, I tried getting the angles using the dot product; however, I realised this gives only one angle, the angle defined between the planes made by the points in 3D space, not pitch, yaw and roll.
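For concreteness, the dot-product approach amounts to something like this sketch, which returns only a magnitude with no axis or pitch/yaw/roll split:

```python
import numpy as np

def angle_between(u, v):
    # Single angle between two bone vectors via the dot product.
    u, v = np.asarray(u, float), np.asarray(v, float)
    cos_theta = u.dot(v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
```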
I then tried projecting each 3D point onto front-on, side and top-down planes and computing three angles, but I believe these angles would suffer when the hand is rotated. I have also considered using spherical polar coordinates.
I am unsure about the suitability of pitch, yaw and roll with respect to gimbal lock versus quaternions, but if the rotation matrices are right, everything else can be derived.
I am currently using the v3 Python bindings generated as a SWIG wrapper; if there is a better way, please let me know.
Thank you in advance!