I'm surprised this idea didn't get more traction. Imho this is essential for a good VR experience. If you could track arbitrary shapes, you could not only have a cheap joystick/gamepad-like controller (like the Sixense STEM controller or the Valve VR controller), you could also track the headset. Or attach the Leap Motion to the headset and track your hands and the headset by having a "root object" on your desktop (e.g. your monitor or your keyboard). That way you could track your hands and your head in sync, but could also render a VR representation of your keyboard to use. Or even your mouse. But I'm getting off topic.
The main thing is to combine multiple fingers to control a pointer or controller more accurately AND to have multiple clicky or analog buttons on a controller. Critical for games. But while holding the controller you could still extend your thumb and index finger and use them for gestures like scaling, turning or hitting a virtual button. So tracking tools would be a game changer for the Leap Motion. I bet that is what Nimble VR are doing now at Oculus.
I have a notion of how the hand tracking in the Leap Motion works based on a paper I once read (matching a synthetic image against the observed one using a quasi-Newton NLP optimization algorithm). Afaik it should in fact be rather easy to track arbitrary rigid shapes like a keyboard or a VR headset the same way. The developer could supply a textured 3D model of the real-world object to track, and voilà.
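To make the idea concrete, here's a toy sketch of that analysis-by-synthesis approach: a known 3D model is "rendered" under a candidate pose and a quasi-Newton optimizer (BFGS) adjusts the pose until it matches the observation. This is just my guess at the general technique, heavily simplified (point correspondences are assumed known, rotation is yaw-only, no actual image rendering), not Leap Motion's real pipeline:

```python
import numpy as np
from scipy.optimize import minimize

def transform(points, params):
    """Apply a yaw rotation and xyz translation; params = [yaw, tx, ty, tz]."""
    yaw, tx, ty, tz = params
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return points @ R.T + np.array([tx, ty, tz])

def track(model, observed, init=np.zeros(4)):
    """Fit the pose by minimizing synthetic-vs-observed error with BFGS (quasi-Newton)."""
    cost = lambda p: np.sum((transform(model, p) - observed) ** 2)
    return minimize(cost, init, method="BFGS").x

# Model: corners of a keyboard-sized plate (cm); the "observation" is the
# same model seen under an unknown pose that the tracker has to recover.
model = np.array([[0, 0, 0], [45, 0, 0], [45, 15, 0], [0, 15, 0.0]])
true_pose = np.array([0.3, 5.0, -2.0, 1.0])   # yaw (rad), tx, ty, tz
observed = transform(model, true_pose)
est = track(model, observed)
```

A real implementation would render the textured model into a synthetic camera image and compare pixels/depth instead of pre-matched points, but the optimization loop stays the same shape.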
I'm not very active on these forums; are there any plans or discussions about this topic?