Jun 2017

Hey everyone,

Our mission at Leap Motion is to make it seamless for people to interact with the digital world. Last year, we released an early access beta of the Leap Motion Interaction Engine, a layer that exists between the Unity game engine and real-world hand physics. Today, we're excited to release a major upgrade to the engine, which features fundamental object interactions, user interface design tools (including support for wearable interfaces!), and more.

We've also made major improvements to cross-platform developer workflows: the engine now works seamlessly with hands and PC handheld controllers alike. This means if you're already developing for the Oculus Touch or Vive controllers, you can easily support a range of input solutions.

You can learn more about this release on our blog, download the new Core Assets and Modules, and check out our new Unity documentation. To celebrate, we're also offering 15% off the Leap Motion VR Developer Kit for one week in our web store.

We see the Interaction Engine (and newly introduced Graphic Renderer) as fundamental tools for all of VR that enable you to create natural and compelling interactions in a reliable and performant package. Now that it’s in your hands, we can’t wait to see what you build with it.


This is awesome! There's so much here, and it makes a bunch of my present and future work obsolete, freeing me up to work on other things. I'm not looking forward to the tearing out & retrofitting part but it will be well worth it.

I am intensely curious though - how does the curved space system handle collisions? Distorting meshes isn't too hard but one of the things that always prevented me from even trying something like this was figuring out how to get collisions to work in a way that wasn't woefully infeasible.

@JCorvinus The physical representations of the objects still exist in rectilinear space. The hands check for nearby objects in uncurved space and in all of the nearby curved spaces. If the "closest object" is in a curved space, the hand's representation will "unwarp" into the object's rectilinear space, allowing for seamless physical interactions using PhysX.

This also allows for fun tricks, like picking up an object in a curved space, throwing it, and watching it curve all the way back around towards you. In reality, a "reverse-distorted" hand was throwing that object in rectilinear space while (just the mesh of) that object is forward-distorted into user space for viewing.

The Button Builder scene contains an example of how to set this up; it works almost entirely transparently to the developer (as long as the objects you desire to be curved are rendered using the Leap Graphic Renderer and exist as children of a "LeapSpace" component).

You may find the definitions for the forward and inverse spatial transformations inside of LeapMotion/Core/Scripts/Space/Definitions.
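To make the warp/unwarp pair concrete, here is a minimal Python sketch of one plausible cylindrical curve of this kind. The radius, function names, and exact mapping are assumptions for illustration only; the real definitions live in the Definitions folder mentioned above and will differ in detail.

```python
import math

RADIUS = 1.0  # assumed curvature radius of the cylindrical space (meters)

def warp(x, y, z, radius=RADIUS):
    """Rectilinear -> curved (user) space: x is reinterpreted as arc
    length around a cylinder whose axis sits `radius` behind the origin."""
    angle = x / radius
    r = radius + z
    return (r * math.sin(angle), y, r * math.cos(angle) - radius)

def unwarp(wx, wy, wz, radius=RADIUS):
    """Curved -> rectilinear space: the inverse of warp(). This is the
    direction the hand's representation travels so it can touch the
    object's physics body where PhysX actually simulates it."""
    angle = math.atan2(wx, wz + radius)
    r = math.hypot(wx, wz + radius)
    return (angle * radius, wy, r - radius)

# A thrown object flies in a straight line in rectilinear space...
straight_path = [(0.4 * t, 0.0, 0.0) for t in range(8)]
# ...but its rendered mesh, warped into user space, curves around you.
curved_path = [warp(*p) for p in straight_path]
```

Because `unwarp(warp(p))` returns `p` (within the cylinder's valid angular range), the hand and the object always agree on where the "real" physics interaction happens.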

Installed LeapDeveloperKit_3.2.0+45899_win.zip to get the Leap setup.

Then created a new Unity 5.6.1f1 project.

Imported:
SteamVR
Leap_Motion_CoreAssets_4.2.0.unitypackage
Leap_Motion_Interaction_Engine_1.0.1.unitypackage

Set the Fixed Timestep to 0.011111 (i.e. 1/90 s, matching a 90 Hz headset).

Trying to run a demo scene now gets me

"INPUT AXIS NOT SET UP. Go to your Input Manager and add a definition for RightVRTriggerAxis on the 10th Joystick Axis.
UnityEngine.Debug:LogError(Object)"

Is there a ready-to-go "InputManager.asset" file? I don't want to import the Oculus package; all I want to use is the Vive and its controllers. And, for experiments, the Leap in a desk setting.

Thanks!

PS:

Some general things I would love to see ...

A VR demo scene that assumes a desk-mounted Leap. It might be counterintuitive to have this - but the reality is that not that many VR devs with a Leap have the VR mount ready to go.

A demo scene of the Interaction Engine that really targets Vive controller users, so we get to check out what we can do without having to dive into the documentation first. First let us see the awesome so we know what we get if we do the work. ;)

A ready-made InputManager.asset can be found here: https://github.com/leapmotion/UnityModules/blob/master/ProjectSettings/InputManager.asset

Unity won't let us access the trigger inputs coming off of the Vive/Touch controllers natively without an input mapping being set up. We chose not to overwrite the project's input mappings because we understand that devs often spend a significant amount of time setting those up in their existing projects. The new Input System that is coming should help with this little hiccup down the road...
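For context, the error above is Unity reporting that no axis named `RightVRTriggerAxis` exists in the project's Input Manager. An axis entry in a text-serialized `InputManager.asset` looks roughly like the sketch below; the field values here are illustrative (the linked asset file has the exact ones). Note that Unity stores axes zero-indexed, so the "10th Joystick Axis" is serialized as `axis: 9`:

```yaml
  - serializedVersion: 3
    m_Name: RightVRTriggerAxis
    descriptiveName:
    negativeButton:
    positiveButton:
    gravity: 0
    dead: 0.19       # dead zone below which the trigger reads as zero
    sensitivity: 1
    snap: 0
    invert: 0
    type: 2          # 2 = Joystick Axis
    axis: 9          # zero-indexed, i.e. the "10th axis" in the UI
    joyNum: 0        # 0 = read from all connected joysticks
```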

Desk-Leap setups are tricky because each dev would have to calibrate/register the position of the peripheral inside of their VR Headset's coordinate space; we could ship tools that would assist with this, but that doesn't solve your initial issue of devs having the tools ready to go.
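To illustrate what that calibration/registration step involves, here is a minimal Python sketch: once a dev has measured (or solved for) the rigid pose of the desk-mounted sensor in the headset's tracking space, every sensor-space point is simply pushed through that one transform. The pose values below are made-up placeholders, not anything shipped by Leap Motion.

```python
import math

# Hypothetical calibrated pose of a desk-mounted sensor in headset space:
# tilted 30 degrees up toward the user, 0.4 m in front of and 0.3 m below
# the headset's tracking origin.
_t = math.radians(30.0)
SENSOR_TO_HEADSET = [
    [1.0, 0.0,           0.0,          0.0],
    [0.0, math.cos(_t), -math.sin(_t), -0.3],
    [0.0, math.sin(_t),  math.cos(_t),  0.4],
    [0.0, 0.0,           0.0,           1.0],
]

def to_headset_space(point, pose=SENSOR_TO_HEADSET):
    """Apply the 4x4 rigid transform to a 3-D point (homogeneous coords)."""
    x, y, z = point
    v = (x, y, z, 1.0)
    return tuple(sum(row[i] * v[i] for i in range(4)) for row in pose[:3])
```

A tool that "assists with this" would essentially just solve for `SENSOR_TO_HEADSET`, e.g. by having the user touch a few known points with a tracked controller.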

That is fine - but you should consider shipping one in the zip, with a mention in the readme.txt.

I asked a Touch dev for one so I wouldn't have to start exploring what Oculus stuff I need. It certainly isn't a great situation to run into this problem when you actually just wanted to give it a quick spin!
