Hi everyone,
I’d like to take a moment to talk about a series of developments we’ve been working on specifically for virtual reality. The first involves our existing peripheral device and the new things developers can do with it starting today. The second is a hint of the next-generation hardware and software efforts we’re currently building from the ground up for this new and exciting space.
If virtual reality is to be anything like actual reality, we believe that fast, accurate, and robust hand tracking will be absolutely essential. We see a place for specialized controllers as well, but our hands themselves are the fundamental and universal human input device.
One of the most exciting things to us about virtual reality is that our technology can be more than just your hands -- it can be your eyes as well. We’ve recently hinted at this with the release of a new API which opens up raw infrared imagery straight from our sensors. When mounted directly onto a head-worn display, these images become stereoscopic windows into the world around you. What it sees, you see.
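For developers who want to experiment right away, here’s a rough sketch of pulling those raw infrared images in C++. It assumes the Controller and Image classes from our current beta SDK; treat the exact policy and method names as illustrative, and check the documentation for your SDK version.

```cpp
// Minimal sketch: reading the raw stereo infrared images.
// Assumes the Leap.h header and the beta SDK's image policy flag.
#include <cstdio>
#include "Leap.h"

int main() {
    Leap::Controller controller;

    // Image access is opt-in: request the images policy first.
    controller.setPolicy(Leap::Controller::POLICY_IMAGES);

    // Simple poll for the device; a real app would use a Listener callback.
    while (!controller.isConnected()) {}

    Leap::Frame frame = controller.frame();
    Leap::ImageList images = frame.images();

    // The two sensors yield a stereo pair: images[0] and images[1].
    for (int i = 0; i < images.count(); ++i) {
        Leap::Image image = images[i];
        // data() is an 8-bit brightness buffer, width() x height() pixels.
        const unsigned char* brightness = image.data();
        std::printf("camera %d: %dx%d, first pixel %d\n",
                    i, image.width(), image.height(), (int)brightness[0]);
    }
    return 0;
}
```

Each frame carries that stereo pair from the left and right sensors, which is what makes the pass-through view possible on a head-worn display.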
This expands the tracking space to cover any direction you’re facing. You can reach forward, turn around, look up and down, and the tracking follows you wherever you go. Because our device’s field of view exceeds that of existing VR displays, you’ll find it can start tracking your hands before you even see them.
To help people explore this paradigm with us, today we’re releasing a VR Developer Mount, which lets you easily and consistently attach your Leap Motion Controller to a VR headset and remove it again. We’re also releasing a software update for our beta SDK, which includes a massively improved ‘top-down tracking’ mode, as well as Unity and C++ examples. These show how to use both the image overlays and the tracking data from a head-mounted position, and go on to demonstrate more sophisticated 3D interactions.
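To give a flavor of what the C++ examples cover, here’s a minimal sketch of head-mounted tracking. It assumes the POLICY_OPTIMIZE_HMD flag from our beta SDK as the switch into ‘top-down tracking’ mode; the names are illustrative, so refer to the shipped examples for canonical usage.

```cpp
// Minimal sketch: hand tracking from a head-mounted position.
// Assumes the beta SDK's HMD optimization policy flag.
#include <cstdio>
#include "Leap.h"

int main() {
    Leap::Controller controller;

    // Tell the tracking software the device is mounted on a headset,
    // so it expects hands entering the view from behind and below.
    controller.setPolicy(Leap::Controller::POLICY_OPTIMIZE_HMD);

    while (!controller.isConnected()) {}  // simple poll; use a Listener in real apps

    Leap::Frame frame = controller.frame();
    for (int h = 0; h < frame.hands().count(); ++h) {
        Leap::Hand hand = frame.hands()[h];
        // Palm position is in millimeters, relative to the device itself.
        Leap::Vector palm = hand.palmPosition();
        std::printf("%s hand palm at (%.1f, %.1f, %.1f)\n",
                    hand.isLeft() ? "left" : "right", palm.x, palm.y, palm.z);
    }
    return 0;
}
```

Note that the policy only tells the tracking software to expect hands seen from above; mapping the resulting device-relative positions into your head-mounted coordinate frame is what the Unity and C++ examples walk through.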
We think that this path fundamentally leads to a place where the digital and physical mediums of today blur together in deep and profound ways. Where bits and bytes become concepts and feelings and substances -- giving rise to digital interactions that are physically indistinguishable from real life.
To this end, I’d like to also give a hint of what we’re working on for the future. One prototype sensor that we’re beginning to show today (and will share more about in the future) is codenamed “Dragonfly.” It possesses greater-than-HD image resolution, color and infrared imagery, and a significantly larger field of view.
With next-generation “mega-sensors” like this, a Leap Motion device can literally become your eyes into the digital and physical realms -- allowing you to seamlessly mix and mash, fade and blend, between virtual environments and the sharpness of the real world.
Beyond the hardware, it’s important to stress that all forms of tracking are themselves software -- in our case, a deeply complex and rapidly evolving intelligence for tracking hands. We hope you will appreciate its growth as we continue to release updates every few weeks.
We’ve always been in awe of people’s deep and abiding passion for the dream of a digital and physical convergence. It’s this energy which drives us to work tirelessly until it is a reality, and we hope you continue to travel with us (or join us now) in this quest.
David