This is just a follow-up post for anyone interested in knowing how we solved these issues:
First of all, there is an infrared tracking system that tracks the position of the primary user and feeds that position into the running project. This system appeared to interfere somewhat with the Leap Motion sensor.
- This was not actually the case. While there is some interference, it is minimal. It causes a slight drop in the Leap Motion sensor's performance, but the sensor is still perfectly usable in this application.
Second, the Leap Motion's infrared emitters were interfering with the infrared glasses used to view the project in stereoscopic 3D.
- This was a pretty simple fix. Since we had multiple Leap Motion controllers in the lab, we removed the infrared emitters from one, which will now be used exclusively in the CAVE.
Third and finally, we created a menu system based on event callbacks, very similar to the Leap Motion UI example. Our system works perfectly with a VR headset; however, it does not correctly calculate collisions with the menu system in the CAVE.
- This was a fix that caused tears. We were using the Leap Motion event system built on the UI module. That module is now deprecated and has been replaced by the Interaction Engine. The underlying issue was that we had to rotate the Leap Motion controller by 180 degrees on the z-axis in our Unity project. This caused the event system to track the Leap Motion hands at the wrong positions in the scene: when the hands we could see touched or passed through a button on our menu, no interaction fired. We have replaced everything with the Interaction Engine, and this has solved our issues. The menus we are developing for the CAVE now respond to interaction correctly, and our project can move forward. The interaction is not as intuitive as a real-world interaction, and with current technology it cannot be, but it works for what we need.
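For anyone curious about the mechanics of the collision bug, here is a minimal sketch in Python (not our actual Unity/C# code; the function names and numbers are made up for illustration). The idea is that the hands were rendered in a frame rotated 180 degrees about the z-axis, while the old event system tested collisions against the unrotated sensor positions, so the hand you saw touching a button was on the opposite side as far as the collision test was concerned:

```python
import math

def rotate_z(point, degrees):
    """Rotate a 3D point about the z-axis by the given angle."""
    rad = math.radians(degrees)
    c, s = math.cos(rad), math.sin(rad)
    x, y, z = point
    return (c * x - s * y, s * x + c * y, z)

def hits_button(hand_pos, button_pos, radius=0.05):
    """Toy sphere test: does the hand position overlap the button?"""
    return sum((h - b) ** 2 for h, b in zip(hand_pos, button_pos)) <= radius ** 2

button = (0.2, 0.1, 0.5)       # button position in world space (example values)
raw_hand = (-0.2, -0.1, 0.5)   # hand position as reported by the sensor

# What the user sees: the rendered hand (rotated 180° about z) is on the button.
rendered_hand = rotate_z(raw_hand, 180)
print(hits_button(rendered_hand, button))  # True: visually touching

# What the old event system tested: the raw, unrotated position misses entirely.
print(hits_button(raw_hand, button))       # False: so no interaction fired
```

The Interaction Engine resolved this for us because its interaction objects work from the hands' actual transforms in the scene rather than a separately maintained event-system position.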