"Demo of a Lego NXT robot with proportional steering via LabVIEW over Bluetooth. The Leap SDK data is piped from a C++ app, using RTI Connext DDS, to a Java app running on MacOS, which sends the steering and power data back to LabVIEW via the RTI toolkit.
No comments on the driver's technique, please."
I'm still working on the control loop; the data needs a bit more smoothing over in the Java app.
Laptop 1: (Core i5, Win 7)
A Leap SDK C++ app converts the raw Leap frames into DDS instances on the topic "Leap::Hand" (I'm disregarding pointables at this point). This happens at the Leap frame rate.
Laptop 2: (MBP, MacOS 10.9)
A Java application subscribes to Leap::Hand and processes the data in real time, transforming the Leap info into "SteeringControl" topic instances of this type:
struct DuoMotorSteerNXT {
    string<128> id; //@key
    float steer;    // -100 full left, 0 no steer, 100 full right
    float power;    // -100 full reverse, 0 no move, 100 full forward
};
I'm using the stabilized palm Y reading and the normalized palm X reading to control power and steering, respectively. Delivery is constrained by a time-based filter: the Leap data is published as fast as possible, but the subscriber receives it at a steady 60Hz.
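In RTI Connext, that kind of rate limiting is typically done with the TIME_BASED_FILTER QoS on the DataReader. A minimal sketch of what the XML QoS profile might look like (the library and profile names here are my own placeholders, not from the original demo):

```xml
<dds>
  <qos_library name="LeapDemoLib">
    <qos_profile name="SteeringProfile">
      <datareader_qos>
        <!-- Deliver at most one sample per ~16.67 ms per instance (~60 Hz) -->
        <time_based_filter>
          <minimum_separation>
            <sec>0</sec>
            <nanosec>16666667</nanosec>
          </minimum_separation>
        </time_based_filter>
      </datareader_qos>
    </qos_profile>
  </qos_library>
</dds>
```

The nice part is that the publisher keeps writing at full Leap frame rate; the filter is purely a subscriber-side contract.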
The DuoMotorSteerNXT instances are only published if the data has changed above a certain threshold.
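The change-threshold gate can be sketched like this (a minimal standalone version; the class name, threshold value, and exact comparison are my assumptions, not the demo's actual code):

```java
// Sketch: suppress publication of DuoMotorSteerNXT samples unless steer or
// power has moved more than `threshold` since the last published sample.
public final class ChangeFilter {
    private final float threshold;
    private float lastSteer = Float.NaN; // NaN marks "nothing published yet"
    private float lastPower = Float.NaN;

    public ChangeFilter(float threshold) {
        this.threshold = threshold;
    }

    /** True if (steer, power) differ enough from the last published pair. */
    public boolean shouldPublish(float steer, float power) {
        if (Float.isNaN(lastSteer)
                || Math.abs(steer - lastSteer) > threshold
                || Math.abs(power - lastPower) > threshold) {
            lastSteer = steer;
            lastPower = power;
            return true;
        }
        return false;
    }
}
```

The Java app would call `shouldPublish` per Leap frame and only write to the SteeringControl topic when it returns true, which keeps jittery hand readings from flooding Bluetooth downstream.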
NOTE: As soon as I get around to writing a MacOS-based Leap frame-to-DDS publisher, I'll be able to move the Leap over to that machine (or any other machine; with DDS you're not locked to a single box).
Laptop 1:
LabVIEW + NXT Toolkit + RTI Connext DDS Toolkit. The LabVIEW graph accepts data via DDS on the "SteeringControl" topic, as generated by the Java app on the other device. The SteeringControl instances are converted from steering/power values into an individual power setting for each track. For example, if steering is set to -75 (3/4 left) and power is at 70, the left track spins at approximately -35 power and the right track at 70.
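The mixing is done in LabVIEW, but the arithmetic can be sketched in a few lines. This is my reconstruction of a differential ("tank") mix consistent with the -75/70 example above; the actual LabVIEW graph may use a different curve:

```java
// Sketch of steer/power -> per-track power mixing, reconstructed from the
// worked example: steer -75, power 70 -> left ~ -35, right 70.
public final class TrackMixer {
    /** Left track power, for steer and power each in [-100, 100]. */
    public static float left(float steer, float power) {
        // Steering left (negative) first slows, then reverses, the left track.
        return steer <= 0 ? power * (1 + steer / 50f) : power;
    }

    /** Right track power; mirror image of left(). */
    public static float right(float steer, float power) {
        return steer >= 0 ? power * (1 - steer / 50f) : power;
    }
}
```

With this mapping, steer = 0 drives both tracks at `power`, and steer = ±100 spins the inner track fully backward for a pivot turn.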
The power levels are sent over Bluetooth.
The thing to remember about this demo is that there are three applications: the Leap C++ SDK app, LabVIEW, and the Java app that bridges the data from Leap::Hand to SteeringControl. This is effectively a real-time SOA, since those three applications could be running on any suitable devices.
Next step is for my son (in New York) to drive the robot (in Maryland) over teh Internetz. He suggested I mount a camera to the front of the robot so he can see what he's doing.
Kids these days.