@AlexColgan I tested again in Blocks, disabling the "Reliable tracking for high luminosity" setting and enabling VR mode in the Diagnostic Visualizer (I wasn't sure whether this helps, or whether the application sets it by itself).
In the video you can see the various elements I find hard to manage.
UNEXPECTED BEHAVIOR
* The last, smaller fingers seem to be poorly tracked, so they are often shown bent instead of straight
* There is jitter, so the fingers shake, which gives an unnatural feeling
* Perhaps because the small fingers are poorly tracked, it's harder to grab smaller parts or parts with a thin edge
* Releasing an object without throwing it is sometimes unreliable
* Rotating the hands can cause even the index finger to be shown bent toward the palm, even when in reality it's straight
* Finger tracking with the hands in profile, rather than facing the camera, is unreliable even when starting from a perfectly clear tracked state
* Throwing an object by bringing it behind my head (as you would in the real world) causes tracking loss (expected), and I can't complete the throw even once tracking is restored
* You need careful, slow movements; it really doesn't feel natural
* The right hand seems to be consistently tracked worse than the left
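On the jitter point above: a common mitigation (not something the Leap SDK does for you, just a generic technique) is to low-pass filter the tracked positions with exponential smoothing. A minimal sketch, with illustrative names only:

```python
# Exponential smoothing of a tracked 3D position to reduce jitter.
# Lower alpha = smoother but laggier; higher alpha = more responsive.
def smooth(prev, new, alpha=0.3):
    """Blend the previous smoothed position toward the new raw sample."""
    return tuple(p + alpha * (n - p) for p, n in zip(prev, new))

# Feed raw per-frame samples through the filter, keeping the running state.
pos = (0.0, 0.0, 0.0)
raw_samples = [(1.0, 0.0, 0.0), (0.9, 0.1, 0.0), (1.1, -0.1, 0.0)]
for sample in raw_samples:
    pos = smooth(pos, sample)
```

The trade-off is latency: too much smoothing and the virtual hand visibly lags the real one, which can feel just as unnatural as the shaking.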
UX IMPROVEMENTS THAT MAY BE NEEDED
* Grabbing objects works better when you don't fully close your fingers but leave them slightly apart. This is physically correct, since in reality your fingers can't penetrate the object, but without any physical feedback grabbing would work better even with the fingers fully closed
* You need to hold your hands at an unnatural and somewhat tiring height. It would be better to visualize the tracking cone so you know where they need to be
* Without physical feedback it's difficult to understand where a throw will go. Throwing far away is more a matter of suddenly opening your hand than of giving a push. A smoothed indicator showing your palm direction (similar to the controller "laser" guide) would help
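The palm-direction indicator suggested above boils down to casting a ray from the palm along its facing direction and drawing it. A minimal sketch of the math, assuming you already get a palm position and direction vector from the tracking data (the function name and parameters here are hypothetical, not Leap SDK API):

```python
import math

def palm_ray_target(palm_pos, palm_dir, distance=2.0):
    """Return the point 'distance' units along the normalized palm
    direction, starting from the palm position. Drawing a line from
    palm_pos to this point gives a 'laser'-style aim indicator."""
    norm = math.sqrt(sum(c * c for c in palm_dir))
    return tuple(p + distance * d / norm for p, d in zip(palm_pos, palm_dir))
```

In practice you would also smooth the direction over a few frames, as suggested, so the indicator doesn't inherit the fingers' jitter.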
Above I tried to separate the elements that may be real problems from those that could perhaps be resolved through custom development for our project.
None of this prevented ME from enjoying the novelty of using my hands, but I'm concerned about people who aren't interested in the tech or the experience at all and simply need to follow a training program, where friction in the interaction can break a carefully crafted, realistic hazard-environment simulation.
Is there any LM-supported application you would suggest for testing opening doors, picking up objects, and activating switches? Should I try recalibrating the LM? Or am I just expecting too much from things that are still a research topic?