Apr 2014

Oi!

Seems pretty clear people are interested in hacking the Leap and creating a scanner along the lines of Skanect or similar. I bought my unit with just this goal in mind. I firmly believe this implementation is possible, though I have only just come onto the Leap scene and have much work to do before I understand the SDK and APIs.

Just a bit about me: I am a seasoned fabricator with some coding skills and some CNC knowledge - looking for math nerds and code junkies to collaborate on the software end. I'll be posting some Sketchup drawings for what I have in mind in the coming days. And since this is possibly a large scale endeavor, I figure the scannable volume should be human-sized - full body.

At its heart, this machine is simply a motorized Lazy Susan. The mechanical end should be very straightforward - a few gear calculations and some trigonometry make up the math, and a stepper motor and an XL timing belt are relatively cheap and plug-and-play. I'd love a unit that spun around a body like an airport scanner, but we should stick with the simplest workable implementation before we decide to get fancy.
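To give a feel for how simple the gear math really is, here's a back-of-the-envelope sketch. Every number in it (step angle, microstepping, pulley tooth counts) is a placeholder assumption, not a finished design:

```python
# Back-of-the-envelope turntable math for a stepper-driven Lazy Susan.
# All constants below are illustrative assumptions, not a real parts list.

MOTOR_STEPS_PER_REV = 200      # typical 1.8-degree stepper
MICROSTEPS = 16                # driver microstepping setting
PULLEY_RATIO = 60 / 20         # XL belt: 60-tooth platter pulley / 20-tooth motor pulley

steps_per_platter_rev = MOTOR_STEPS_PER_REV * MICROSTEPS * PULLEY_RATIO

def steps_for_angle(degrees):
    """Motor steps needed to rotate the platter by the given angle."""
    return round(steps_per_platter_rev * degrees / 360)

# e.g. capture a frame every 5 degrees -> 72 stops per revolution
print(steps_per_platter_rev)   # 9600.0
print(steps_for_angle(5))      # 133
```

With numbers like these, angular resolution is a fraction of a degree per microstep, so the mechanical side really is the easy part.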

So, is anyone interested? Open Source 3-D Leap Hack?

We should be interested in nailing down the best code language approach in the beginning - whatever works best to process the raw data and convert it to a usable 3-D image. Python and Blender seem appropriate, but I'd love to hear from the community. It also depends on final use for the 3-D image - will it be a model? Used online or in a game? 3-D printed or otherwise CNC'd into existence?

If there's interest, I'll put together a Github page to keep track of progress.

Cheers to everyone who wants to lend a hand to see this created.

Disclaimer: This project is not (currently) endorsed by Leap Motion in any official manner.

created Apr '14 · last reply Sep '21 · 9 replies · 30.6k views · 8 users · 6 likes · 20 links

This is a fundamental misunderstanding of how the Leap software works. See here:

https://community.leapmotion.com/t/leap-patent-application-and-other-references/717

specifically patent application 20130182079. The LMC hardware is just two cameras with a Cypress FX3 USB3 controller, read as a simple USB camera:

https://community.leapmotion.com/t/protocole-usb-for-robotics-apps/731/2

If you want to build a 3D scanner from scratch, the Leap Motion hardware is not the place to start. If you want to use the Leap Motion approach of reconstructing edges from edge pairs in scanlines in stereoscopic images, the Leap SDK will provide absolutely zero assistance here. You would also be restricted to scanning convex objects, or - as with the human hand - objects that can be represented as ellipses or circles arranged in a manner that avoids obstruction/occlusion.

There are no depth maps, no point clouds etc. Here is an image that illustrates the "capture" that the Leap Motion (according to the patents) generates:

https://patentimages.storage.googleapis.com/US8638989B2/US08638989-20140128-D00014.png

If the Leap SDK was reasonably layered at the lower level - with interfaces to (a) controller setup and image acquisition, (b) contrast enhancement and edge detection, (c) ellipse reconstruction - then you could use a Leap (or, if multiple Leaps would be supported at the lower level, several) and a rotating base (and an IR-absorbent enclosure) to generate your own polygon slice/mesh reconstruction from silhouettes, but there has been no indication that Leap is considering - or even able to deliver - any such refactoring of the SDK.
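The slice/mesh-reconstruction-from-silhouettes idea above can be illustrated with a toy example. This is not Leap's algorithm: it assumes orthographic cameras and a known silhouette extent per turntable view, and carves a single horizontal slice of the volume on a coarse grid:

```python
# Toy 2D "space carving" of one horizontal slice from orthographic
# silhouettes taken at several turntable angles. Purely illustrative;
# real camera models are perspective, not orthographic.
import math

def carve_slice(silhouettes, grid_range=1.0, n=50):
    """silhouettes: list of (angle_rad, umin, umax) silhouette intervals.
    Returns the grid points consistent with every silhouette."""
    pts = []
    step = 2 * grid_range / n
    for i in range(n + 1):
        for j in range(n + 1):
            x = -grid_range + i * step
            y = -grid_range + j * step
            ok = True
            for theta, lo, hi in silhouettes:
                # orthographic projection of (x, y) onto the view axis
                u = x * math.cos(theta) + y * math.sin(theta)
                if not (lo <= u <= hi):
                    ok = False
                    break
            if ok:
                pts.append((x, y))
    return pts

# A centered disc of radius 0.5 projects to [-0.5, 0.5] from every angle.
views = [(math.radians(a), -0.5, 0.5) for a in range(0, 180, 15)]
slice_pts = carve_slice(views)
# surviving points approximate the disc cross-section
```

Note this visual-hull approach shares exactly the limitation described above: it can only ever recover the convex silhouette envelope, so concavities are lost.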

As the Irish say, if you want to go there, you shouldn't start from here, but if you are really determined, a USB capture tool and OpenCV are your likely starting points. To my knowledge, nobody has reverse engineered and documented the controller setup USB protocol so far.

It's rather fundamentally misleading, if we look at LM's introduction video ... the relevant part starts at 00:51.

More a fundamental misunderstanding of how much the Leap relies on its software to perform its little magic.

I don't see any reason that raw stereoscopic image sets can't be algorithmically resolved into a usable 3D image - in fact, the patent documents claim it can be done. Then again, I haven't seen the raw data - has anyone really?

From the patent document:

"As another example, locations of points on an object's surface in a particular slice can be determined directly (e.g., using a time-of-flight camera), and the position and shape of a cross-section of the object in the slice can be approximated by fitting an ellipse or other simple closed curve to the points. Positions and cross-sections can be correlated to construct a 3-D model of the object."

That image you provided from the patent documents doesn't look like raw data to me. All that mathematics cited in the same document represents how the software resolves the camera information into an easily trackable hand-like object, if I'm reading it correctly.

So the question seems to be: is anyone willing/able to hack down past the SDK and find that raw data, so we can use it to generate usable meshes and objects? As stated before, I am not the man for that job; this post is intended to ferret out those who might be.

Leap has already made it clear they have little inclination to develop this at present. But I'm not yet convinced it's a no-go.

I haven't seen the raw data - has anyone really?

For your convenience:
http://www.youtube.com/watch?v=oIG5ceez2_E
http://www.youtube.com/watch?v=QQMGvWaFhuo

The video streams of the two cameras are pixel-wise interlaced:
http://tinypic.com/view.php?pic=53pvl5&s=6
https://forums.leapmotion.com/forum/support/community-support/linux/1140-started-linux-hacking-effort/page6
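Given a raw frame, splitting out the two camera streams is a one-liner each. The even/odd-column convention below is an assumption based on the linked hacking threads, not anything officially documented:

```python
# De-interlace a raw Leap frame in which the two camera streams are
# pixel-wise interleaved. The even/odd-column assignment is an assumption
# from community reverse engineering, not official documentation.
import numpy as np

def split_cameras(raw_frame):
    """raw_frame: 2-D grayscale numpy array. Returns (cam_a, cam_b)."""
    cam_a = raw_frame[:, 0::2]   # even columns -> one sensor
    cam_b = raw_frame[:, 1::2]   # odd columns  -> the other sensor
    return cam_a, cam_b

# synthetic frame: columns alternate between value 10 (A) and 200 (B)
frame = np.empty((4, 8), dtype=np.uint8)
frame[:, 0::2] = 10
frame[:, 1::2] = 200
a, b = split_cameras(frame)
# a is uniformly 10, b uniformly 200, each half the original width
```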

9 months ago by David Holz LM: "We're working on providing lower level data for applications like 3D scanning, but right now there aren't any computations in our software/hardware that would be useful for this."
https://developer.leapmotion.com/forums/forums/6/topics/raw-data--2154
http://www.mtbs3d.com/phpBB/viewtopic.php?p=87757#p87757

It is amazing that two years later the insistence that the "stereoscopic point-cloud visualization" has anything to do with the raw data has not abated at all in the forums. The Leap is a straightforward stereo camera. The Leap software attempts to reconstruct fingers from silhouette edges. If you are hoping for some other "raw data", you are reading too much into the void between the [scan]lines.

If anybody wants to implement their own software stack to process the "raw" image data, this still appears to be the place to start:
https://github.com/openleap/OpenLeap

There will be an official image API at some point, but it will probably take a lot of work to make a good 3D scanning experience.

The Leap has size and cost advantages over other hardware that could theoretically do 3D scanning, but it doesn't have a color camera for object texturing, and its wide FoV makes it impractical to scan things at medium-to-long distances (unless you do monocular SLAM, maybe, but then you might as well use a color camera).

14 days later

If you want a full body scanner, one has been successfully built using multiple Raspberry Pis with camera modules. A spherical framework is set up with the cameras, and they simultaneously capture multiple images rather than using a rotating platform.
http://www.pi3dscan.com
Then, of course, connect it to a 3D printer and have an army of Mini-Mes.

4 years later

Sorry for restarting a five-year-old thread. I've been using OpenCV's stereoscopic reconstruction functions to generate a point cloud from the raw images, and the results have been okayish so far, but I think the accuracy can improve with better calibration. Has anyone here had better luck with the intrinsic/extrinsic camera parameters? I've had to make do with the values I found in this paper.
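For anyone following along, the core of the reconstruction is just the disparity-to-depth triangulation, which only needs the focal length and baseline. A minimal sketch; in a real pipeline the disparity map would come from something like OpenCV's StereoSGBM on a rectified pair, and the focal length and baseline below are placeholder values, not calibrated Leap parameters:

```python
# Minimal disparity -> 3D point conversion for a rectified stereo pair.
# A tiny synthetic disparity map stands in for real matcher output.
# FOCAL_PX and BASELINE_M are placeholders, NOT calibrated Leap values.
import numpy as np

FOCAL_PX = 400.0    # focal length in pixels (assumed)
BASELINE_M = 0.04   # distance between the two cameras in metres (assumed)

def disparity_to_points(disparity, cx, cy):
    """Triangulate: Z = f*B/d, X = (u-cx)*Z/f, Y = (v-cy)*Z/f."""
    v, u = np.indices(disparity.shape)
    valid = disparity > 0                      # disparity 0 = no match
    z = FOCAL_PX * BASELINE_M / disparity[valid]
    x = (u[valid] - cx) * z / FOCAL_PX
    y = (v[valid] - cy) * z / FOCAL_PX
    return np.column_stack([x, y, z])

# a flat 4x4 disparity map of 8 px -> a plane at Z = 400*0.04/8 = 0.5*4 = 2.0 m
pts = disparity_to_points(np.full((4, 4), 8.0), cx=2.0, cy=2.0)
# all 16 reconstructed points lie at depth 2.0 m
```

Since Z is inversely proportional to disparity, small errors in the intrinsics blow up quickly at range, which is presumably why the calibration quality matters so much here.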

You can find my project here.

2 years later

Thanks for the update and the quick reply. I'll be sure to keep an eye on this thread; I'm looking for the same information.