We are working with Orion 3.2.1 on Windows with the Unity Core Assets 4.3.3. We build our applications in Unity. Since we still want to use Unity's built-in UI system with the Leap Hands, we are also using the UIInput package, v1.2.1.
It seems that in the latest version of Unity, when building for VR, the camera frustum is no longer updated while VR is enabled, which causes all UI interaction to fail. This directly affects the LeapInputModule script, which relies on the camera viewport for raycasting. When VR is disabled in the XRSettings, the camera viewport updates normally. We have found that if the application is quickly toggled out of VR and then back into VR, without moving the camera, the viewport updates and UI interaction works again. This is obviously not a viable solution.
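For reference, the toggle we described can be automated at startup as a stopgap. This is only a sketch (the class name is ours, and we have not confirmed it reliably reproduces the fix); it assumes `XRSettings.enabled` can be flipped at runtime in your Unity version:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR;

// Hypothetical stopgap: disable VR for one frame, then re-enable it,
// mimicking the manual toggle that makes the camera viewport update.
public class VRViewportRefresh : MonoBehaviour
{
    IEnumerator Start()
    {
        XRSettings.enabled = false;
        // Wait one frame so the camera viewport is recalculated without VR.
        yield return null;
        XRSettings.enabled = true;
    }
}
```

Attaching this to any object in the scene would perform the toggle once on load, but we would much rather understand the underlying viewport issue than ship this.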
Has anyone found a fix for this?
We have experimented with the Interaction Engine, but because we need both a VR and a non-VR mode, we do not want to maintain two completely separate user interfaces. There are also limitations to some UI effects with the Interaction Engine, such as sprite swapping for hover states, clicks, etc.
Any help on this matter would be greatly appreciated!