Apr 2015

Thanks for the response, bPatrik0521, but can I control it with the Leap Motion? That is the plan. Is it possible to set up the controller to control any program?

Uhm, not by default. But combined with an OS control app it works nicely. I've just tried it on one of my models with GameWAVE and it's fairly good, in my opinion. I haven't seen something like that with built-in Leap control yet.

I guess I am looking for an app like Cyber Science 3D (Leap Motion) where I can upload content, as that would work well (though that particular app does not seem very stable on PC; it's better on Mac, of course).

Solid Edge, isn't it?

Let me ask you some more: what kind of interaction capability are you looking at for your visitors at meetings/exhibitions? I.e. just model exploration (rotate, translate, zoom), or a more realistic hand interaction with your products' CAD models, perhaps based on physics (dynamics, gravity, constraints, friction, collision response, etc.)?

In the first case perhaps you could look for some external Leap Motion-based mouse or keyboard emulation (the "OS control app" suggested by bPatrik0521), thus mapping hand interaction to the view parameters of your favourite CAD viewer. Pro: no CAD model conversion pain, good visual quality, possibly playback of 3D animations as far as the modeller has put them in the model. Cons: no real advantage from hand interaction, apart from the "Minority Report" special effect (which only impresses grandmothers by this time...), and a lack of behavioural realism. Cost (time, software): low.
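
To make that concrete, here is a minimal sketch (illustrative only, in Python) of mapping palm motion onto turntable-view parameters; get_palm_delta() is a hypothetical callback standing in for whatever the Leap SDK or the OS control app reports each frame:

```python
# Minimal sketch: drive a turntable camera from per-frame palm displacements.
# get_palm_delta() is a hypothetical stand-in for the tracking source.
import math


class TurntableView:
    """Yaw/pitch around the model plus a zoom distance."""

    def __init__(self):
        self.yaw_deg = 0.0
        self.pitch_deg = 0.0
        self.distance = 1.0  # arbitrary scene units

    def apply_palm_delta(self, dx_mm, dy_mm, dz_mm,
                         rotate_gain=0.5, zoom_gain=0.005):
        # Sideways/vertical palm motion rotates the model; depth motion zooms.
        self.yaw_deg += rotate_gain * dx_mm
        self.pitch_deg = max(-89.0, min(89.0, self.pitch_deg + rotate_gain * dy_mm))
        self.distance = max(0.05, self.distance * math.exp(-zoom_gain * dz_mm))


def update_view(view, get_palm_delta):
    """Call once per tracking frame; frames with no hand are ignored."""
    delta = get_palm_delta()          # (dx, dy, dz) in millimetres, or None
    if delta is not None:
        view.apply_palm_delta(*delta)
```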

In the second case you need a specialized application which acts as a viewer yet runs a real-time rigid-body dynamics simulation, in which a virtual hand is allowed to interact with the 3D models in a physics-enabled world; the hand motion captured by the Leap Motion fully drives the virtual hand's articulation. Pro: very realistic, physically credible interaction. Cons: some model conversion work, plus the setup of the physics world (which parts are bodies, whether there are constraints such as hinges, pistons, etc.). Cost: more time to be spent.
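
To illustrate the second case, here is a rough sketch using pybullet purely as an example rigid-body engine (an assumption for illustration, not the engine behind any application mentioned here): the CAD parts become dynamic bodies and a kinematic "hand proxy" is driven each frame from the tracked palm position.

```python
# Sketch: a physics-enabled world where a kinematically driven "hand proxy"
# pushes dynamic parts around. Placeholder shapes stand in for converted CAD meshes.
import pybullet as p
import pybullet_data

p.connect(p.DIRECT)                       # use p.GUI for an interactive window
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)
p.loadURDF("plane.urdf")                  # the workbench

# A CAD part, approximated here by a box; a converted mesh would be loaded with
# p.createCollisionShape(p.GEOM_MESH, fileName="part.obj") instead.
part_shape = p.createCollisionShape(p.GEOM_BOX, halfExtents=[0.05, 0.05, 0.05])
part = p.createMultiBody(baseMass=0.2, baseCollisionShapeIndex=part_shape,
                         basePosition=[0.0, 0.0, 0.2])

# Hand proxy: mass 0, so the solver never moves it, but contacts with it still
# push the dynamic parts.
hand_shape = p.createCollisionShape(p.GEOM_SPHERE, radius=0.04)
hand = p.createMultiBody(baseMass=0.0, baseCollisionShapeIndex=hand_shape,
                         basePosition=[0.3, 0.0, 0.2])


def step(palm_xyz):
    """Advance the world one tick, placing the hand proxy at the tracked palm."""
    p.resetBasePositionAndOrientation(hand, palm_xyz, [0, 0, 0, 1])
    p.stepSimulation()


for i in range(240):                      # fake a palm sweeping toward the part
    step([0.3 - 0.002 * i, 0.0, 0.2])
```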

Over the years I have developed such a VR application for physically-simulated first-person interaction with 3D models, which I have almost finished interfacing with the Leap Motion (by the 80/20 rule I did 80% of the work very quickly; the remaining 20% of finalization is taking 80% of the time...). If you are interested in exploring the possibilities of such an approach I can send you a video, and/or you could send me (leoncinip@gmail.com) one of your 3D models with some explanation of its articulated behaviour; I could import it into my app and then send you the app for you to play with yourself.

Just let me know. Greetings,

Paolo

Hello again,

I may need to pass you on to one of our engineering team, as I am unsure of the answers to your questions. We work in hip and knee replacement and are keen to enable surgeons to interact with models of our implant elements, which work together to create a replacement joint. In the hip there are usually four components: a stem, a head, a cup and a liner. In the knee there is a femoral component, a tibial base plate, a tibial insert and, on occasion, a patella replacement, so 3-4 components.

On the hip front, the head sits on the stem, the liner on the head, and the cup on top of the liner (which is fully constrained in the cup).

For the knee the situation is more complex, as the joint is less constrained. The femoral component sits on top of the liner and the liner on top of the tibial base plate, with the liner constrained by the tibial base plate. The femoral component can rotate on top of the liner internally and externally to a certain limit before it causes issues. There is another femoral design that has greater constraint between the femoral component and the tibial insert, but we may be getting ahead of ourselves.

At this stage we are looking to keep it quite simple: select your components from a menu, have a look at them, rotate in all axes, and be able to zoom in and out. We are looking to dip our toes in the Leap Motion waters but are not sure how to proceed. Is there a toolkit that would allow us to see if we can work this through ourselves, based on some simple building blocks?

If you have an example of your work, I would be interested in seeing it.

Thanks

Brian

Hi Brian,

Here are two videos of example hand interaction with a CAD object - a two-part, hinge-articulated object which could somewhat resemble parts like yours. The interaction is carried out on a workbench by the right hand only.

hand interaction - overview
hand interaction - action

I'm currently working on improving the hand grasp, since at present the Leap Motion driver doesn't allow me to fully and reliably follow finger movements when the fingers bend to close into the palm (the "full hand skeleton" version is warmly awaited from LM). The approach I'm investigating is to use the "contained sphere" information instead in nearly-closed-hand situations, and to seamlessly mix it with the active finger information.
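
One hedged guess at how such a mix could look (illustrative Python, not the actual implementation; it assumes the Leap frame supplies a contained-sphere radius, e.g. hand.sphere_radius, plus a finger-based grasp aperture):

```python
# Sketch: blend finger-based and sphere-based grasp estimates, leaning on the
# contained sphere as the hand closes and per-finger tracking becomes unreliable.

def closedness(sphere_radius_mm, open_radius=120.0, closed_radius=40.0):
    """0.0 for a fully open hand, 1.0 for a fist, estimated from the sphere radius."""
    t = (open_radius - sphere_radius_mm) / (open_radius - closed_radius)
    return max(0.0, min(1.0, t))


def grasp_aperture(finger_aperture_mm, sphere_radius_mm):
    """Mix the two sources: trust the fingers while the hand is open,
    fall back onto the contained-sphere diameter as it approaches a fist."""
    w = closedness(sphere_radius_mm)
    sphere_aperture_mm = 2.0 * sphere_radius_mm
    return (1.0 - w) * finger_aperture_mm + w * sphere_aperture_mm
```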

I confirm my availability to try to build a sample application with your data, in order to give you a flavour of what it is possible to get out of such an approach (articulated bodies, placed in a real-time rigid-body dynamics simulation environment which governs both object-object and hand-object interaction, plus friction and collision detection and response, all driven through the Leap Motion, etc.). Feel free to have your engineering team contact me and we will agree on the technical details; it will be a nice first experience for me in a different domain (I have mostly worked in VR for aerospace).

Paolo

2 months later

Paolo, I would be very interested in trying this app as well. I would like to manipulate CAD models at my exhibit booth at trade shows. If you have some ideas, I would like to discuss them. Your videos are very cool, particularly the second. The sphere concept is what I think will work best. Thoughts?

Mark

Hi Brian,

I worked on a similar project with CAD models of pedicle screws. I saved the files as .stl and then converted them to .x3d files in a program called ParaView. X3D files work nicely in 3D modelling software like Blender. From there you can get the models ready for an interactive platform like Unity.
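
If you want to batch the Blender leg of that workflow, something along these lines should work (a sketch to run with `blender --background --python convert.py`; it assumes the stock X3D import and FBX export add-ons are enabled, and the file names are placeholders):

```python
# Sketch: import a ParaView-exported .x3d and re-export it as .fbx for Unity.
import bpy

# Start from an empty scene.
bpy.ops.wm.read_factory_settings(use_empty=True)

# Bring in the converted model.
bpy.ops.import_scene.x3d(filepath="pedicle_screw.x3d")

# Any clean-up (scale, origin, materials) would go here.

# Write an FBX that Unity imports directly.
bpy.ops.export_scene.fbx(filepath="pedicle_screw.fbx")
```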

Leap just released some nice camera control Unity examples that will probably accomplish what you need: https://developer.leapmotion.com/gallery

This is the closest thing to a toolkit/workflow you can use on your own. Best of luck

Mark,

I'd suggest you send me (leoncinip@gmail.com) a CAD (Solid Edge, IGES, STEP) or polygonal (VRML, 3DS, FBX, OBJ, or whatever) model of the prosthesis you'd like to manage, and I'll arrange the data/config files for the app, which I can then send back to you for testing.
The hip is surely a ball (spherical) joint; the knee is perhaps a hinge, to a first approximation, in the rigid-body dynamics world - see the sketch below.
For the model formats, just send me what you have at present - I'll try to convert it for input to the VR program, and if there are problems we'll iterate. Just describe the materials for the visual appearance and where the ball joint centre is located (or I'll figure it out).
Do you already have a Leap Motion device? Otherwise we can refine the setup without it and introduce it at a later stage.
Then, if you like, I can also point you to some new, attractive display approaches for booths and visitors' stands at exhibitions and venues.
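
As a rough illustration of that joint setup, here is a sketch using pybullet as a stand-in rigid-body engine (an assumption for the example, not the actual VR program), with placeholder shapes instead of the real prosthesis meshes:

```python
# Sketch: pin a "head" body inside a fixed "cup" with a ball (point-to-point)
# constraint, the first approximation of the hip joint discussed above.
import pybullet as p

p.connect(p.DIRECT)
p.setGravity(0, 0, -9.81)

head_shape = p.createCollisionShape(p.GEOM_SPHERE, radius=0.014)
cup_shape = p.createCollisionShape(p.GEOM_BOX, halfExtents=[0.03, 0.03, 0.01])

head = p.createMultiBody(baseMass=0.1, baseCollisionShapeIndex=head_shape,
                         basePosition=[0.0, 0.0, 0.10])
cup = p.createMultiBody(baseMass=0.0, baseCollisionShapeIndex=cup_shape,
                        basePosition=[0.0, 0.0, 0.12])   # mass 0: fixed in space

# Hip: ball joint pinning the head centre to a point just below the cup.
p.createConstraint(parentBodyUniqueId=cup, parentLinkIndex=-1,
                   childBodyUniqueId=head, childLinkIndex=-1,
                   jointType=p.JOINT_POINT2POINT, jointAxis=[0, 0, 0],
                   parentFramePosition=[0.0, 0.0, -0.02],
                   childFramePosition=[0.0, 0.0, 0.0])

# A knee hinge with angular limits is more naturally expressed as a revolute
# joint in the loaded model description (e.g. a URDF) than as a runtime constraint.

for _ in range(240):
    p.stepSimulation()
```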

I think we can continue the conversation offline, possibly coming back to the LM forum when some nice LM-relevant ideas arise.

Paolo

9 months later

I know of an iPad app that can do all these things - a 3D surgical planning app with the ability to upload CAD models and interact with them in 3D - see www.conceptualiz.com

It is also available to use with Leap Motion.

Please email me (richard@conceptualiz.com) if you would like to see a web demo.