Hello,
I'm working on a simple project: I'm studying the SDK V5 documentation and testing the provided samples, which seem to work correctly.
Since V4 (Orion), only the C language is supported (no longer JavaScript, C#, Java, Python, or Objective-C as in earlier versions), and of course the function names and structures in the C++ documentation for V2 and V3 differ from those in V4 and V5. But when I look for the function that handles gestures (e.g. rotating one finger clockwise => scroll the page), I can't find any evidence of this kind of functionality in the SDK V5 or its documentation.
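For example, my rough idea (a minimal, untested sketch, assuming I have to detect gestures myself from LeapC tracking frames because V5 has no gesture API like V2/V3 had) is to poll the connection and derive a trigger from values such as pinch_strength:

```c
/* Minimal sketch (not tested): poll LeapC tracking frames and derive a
 * "gesture" myself, since I cannot find a gesture API in V5. Here a strong
 * pinch is used as an example trigger. */
#include <stdio.h>
#include <stdint.h>
#include "LeapC.h"

int main(void)
{
    LEAP_CONNECTION connection;
    if (LeapCreateConnection(NULL, &connection) != eLeapRS_Success ||
        LeapOpenConnection(connection) != eLeapRS_Success) {
        printf("Could not open a connection to the tracking service\n");
        return 1;
    }

    for (;;) {
        LEAP_CONNECTION_MESSAGE msg;
        /* Wait up to 1000 ms for the next event from the service. */
        if (LeapPollConnection(connection, 1000, &msg) != eLeapRS_Success)
            continue;

        if (msg.type == eLeapEventType_Tracking) {
            const LEAP_TRACKING_EVENT *frame = msg.tracking_event;
            for (uint32_t h = 0; h < frame->nHands; ++h) {
                const LEAP_HAND *hand = &frame->pHands[h];
                /* My own "gesture": a strong pinch could mean "scroll". */
                if (hand->pinch_strength > 0.9f) {
                    printf("Pinch detected, palm y velocity = %f\n",
                           hand->palm.velocity.y);
                    /* ...here I would send a scroll event to the focused app... */
                }
            }
        }
    }
    /* Never reached in this sketch. */
    LeapCloseConnection(connection);
    LeapDestroyConnection(connection);
    return 0;
}
```

Is this the intended approach in V5, or is there a higher-level gesture function I have missed?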
My goal is simply a program that opens a document (for example in Word or an Adobe application) and uses the Leap Motion Controller to control scrolling up/down, zooming in/out, and other basic actions with hand gestures, only in those specific applications.
Do you know which functions manage hand gestures and how to interact with Windows applications (like Word or Adobe), or is there a piece of example code I could start from (see my sketch below)?
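For the Windows side, this is how I imagine it (again an untested sketch and just my assumption: check the foreground window title and inject a wheel event with the Win32 SendInput API when my gesture fires):

```c
/* Rough sketch (untested): send a mouse-wheel scroll to the active window,
 * but only when a specific application (e.g. Word) has focus. The title
 * check is my own assumption of how to restrict this to one application. */
#include <windows.h>
#include <stdio.h>
#include <string.h>

/* Send one mouse-wheel "notch" to whatever window currently has focus. */
static void send_scroll(int direction)
{
    INPUT input;
    ZeroMemory(&input, sizeof(input));
    input.type = INPUT_MOUSE;
    input.mi.dwFlags = MOUSEEVENTF_WHEEL;
    input.mi.mouseData = (DWORD)(direction * WHEEL_DELTA); /* + = up, - = down */
    SendInput(1, &input, sizeof(INPUT));
}

/* Only act when the foreground window title contains "Word". */
static int word_has_focus(void)
{
    char title[256] = {0};
    HWND hwnd = GetForegroundWindow();
    if (hwnd && GetWindowTextA(hwnd, title, sizeof(title)))
        return strstr(title, "Word") != NULL;
    return 0;
}

int main(void)
{
    /* In the real program this would be driven by the gesture detection
     * from the LeapC loop above, instead of a single test scroll. */
    if (word_has_focus())
        send_scroll(-1); /* scroll down one notch */
    else
        printf("Word is not the active window\n");
    return 0;
}
```

Is this combination (LeapC polling + SendInput) the right direction, or is there a recommended way to hook the tracking into a specific desktop application?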
The online V4 and V5 documentation only shows guides for building projects with Unity and Unreal, not for interacting with Windows or Linux applications (like Word, Adobe, LibreOffice, and so on).
Could you please offer some suggestions? I would like to avoid TouchFree because it acts "globally" on the whole OS environment; I want to keep this as a minimal project with a few gestures and one specific application.
PS: I would prefer to work with V5 (the "Preview" release) so I can test the improved hand tracking sensitivity and the other preview features.
Thank you