I'd like to share a project focused on exploring novel interactions with medical images in an operating room (a sterile environment). By novel, I mean interactions that go beyond simply viewing 2D/3D patient data or mapping gestures to mouse/keyboard input. I also wanted to enhance the visualization itself rather than simply introduce a touchless interface. This was a collaborative research project between computer science folks and neurosurgeons, so the emphasis is on viewing MRI/CT data during surgery.
Below is a demo video that illustrates some of the ideas I've mentioned. The work on this project was done in late 2013 / early 2014, but I couldn't share it until recently. I have not kept up with the latest advancements to the SDK or with other projects in this space, but I hope this is still interesting to the community!