Hi there!
I am a computational neuroscience student at the University of British Columbia. Although it isn't exactly my thesis, my supervisor and a coworker are interested in seeing whether your impressive Orion software can be adapted to work with an auto-regressive Hidden Markov Model to track mouse behaviour at sub-second resolution. We are specifically interested in tracking paw movement at this time, using the approach developed by the group linked to.
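For context, here is roughly the kind of model we have in mind: a discrete "behavioural syllable" sequence with per-state linear dynamics on the paw coordinates. This is just a minimal illustrative sketch with made-up parameters, not the Harvard group's actual implementation:

```python
import numpy as np

# Minimal AR-HMM sketch (hypothetical parameters, for illustration only):
# a hidden state sequence z_t selects which linear dynamics drive the
# observed paw coordinates y_t at each timestep.
rng = np.random.default_rng(0)

K, D, T = 2, 2, 200          # hidden states, observation dim (paw x/y), timesteps
P = np.array([[0.95, 0.05],  # state transition probabilities
              [0.10, 0.90]])
# Per-state autoregressive dynamics, kept near-stable so trajectories don't blow up
A = 0.9 * np.eye(D) + rng.normal(scale=0.05, size=(K, D, D))
b = rng.normal(scale=0.05, size=(K, D))  # per-state drift

z = np.zeros(T, dtype=int)   # hidden behavioural "syllable" sequence
y = np.zeros((T, D))         # observed paw coordinates
for t in range(1, T):
    z[t] = rng.choice(K, p=P[z[t - 1]])
    # Auto-regressive emission: the next observation depends on the previous one
    y[t] = A[z[t]] @ y[t - 1] + b[z[t]] + rng.normal(scale=0.01, size=D)
```

The question is essentially whether Orion could supply the per-frame coordinates `y` for a mouse paw rather than a human hand.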
We are aware the Harvard group used a Kinect, so could anyone tell us how feasible it would be to modify the Leap's tracking skeleton to that of a mouse paw, so that the paw can be tracked as the mouse grabs and pulls a lever? Please let us know if you have any concerns or time-saving tips regarding this exploration.
Edit: Never mind — I read the more recent replies, and it looks like multi-Leap support is not going to be a thing, and the current workaround will probably kill me (or was taken offline, since it came from the old forum).
Secondly, we are interested in tracking behaviour longitudinally, i.e., mice are left in their living enclosure and tracked continuously. For this we think we could surround their cage with Leaps and hook them all up, largely eliminating occlusion issues. Is this even possible? I realize there are threads saying you don't support it, but those threads are from 2014, so I am wondering if anything has changed and/or whether you are working on multi-Leap support. I have a similar question for human hands. Couldn't you eliminate occlusion problems (at least for desktop apps) by having a second Leap hanging from the ceiling? It sounds silly, but for a lab experiment that is totally not a problem to set up. The question is whether it's possible to have two Leaps talking to each other to give a single view of the hands.
Rest assured that, given how impressed we are with the Orion Blocks demo, we may be interested in purchasing many more Leaps if multi-Leap support becomes a thing!