My friend just finished a Leap-to-OSC app using the Python SDK examples and pyOSC (http://gitorious.org/pyosc). We now have Leap data streaming into Touch.
A very alpha repo is here: https://github.com/topher515/leapyosc
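For anyone curious what the OSC side of a bridge like this looks like on the wire, here's a minimal sketch that hand-encodes an OSC 1.0 message and sends it over UDP. The `/leap/finger/0/tip` address and the port are made-up examples, not what leapyosc actually uses — in practice pyOSC's `OSCClient`/`OSCMessage` classes handle the encoding for you.

```python
import socket
import struct

def osc_string(s):
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b = s.encode('ascii') + b'\x00'
    return b + b'\x00' * (-len(b) % 4)

def osc_message(address, *floats):
    """Build a raw OSC packet: padded address, type tag string,
    then each argument as a big-endian 32-bit float."""
    tags = ',' + 'f' * len(floats)
    args = b''.join(struct.pack('>f', f) for f in floats)
    return osc_string(address) + osc_string(tags) + args

# Hypothetical address layout -- leapyosc may name things differently.
packet = osc_message('/leap/finger/0/tip', 120.5, 310.0, -45.2)

# Point this at an OSC In CHOP in Touch (port 10000 is just an example):
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ('127.0.0.1', 10000))
```

Dropping one of these per finger per frame into an OSC In CHOP gets you channels you can wire straight into a network.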
I'm going to work on porting it to a CHOP that runs natively inside Touch, which will be my first real 088 Python project.
My impressions so far (based on the developer unit) have been really positive. Very accurate, very fast hand and finger tracking. Trackable distance is about 2ft above the device, but it has a very wide FoV. It's also very tiny.

The value of the device is really in the software, though. Right now developers lack access to the raw depth data: no point clouds, object recognition, or facial tracking out of the box, and access to a depth map is the most requested feature. I believe that limitation has to do with the stereo cameras and the way their software interprets the data, as opposed to the way a Kinect operates with an IR projector and camera. They plan to support multiple Leaps for an increased field of view, which I hope means including some calibration tools in the SDK.
Will post some videos in the coming weeks.