I've been working on two projects recently that required gestures with Kinect, so I built a simple component that takes OSC data from OSCeleton and uses it for gestures. Figured whilst I was rendering I'd release it to you guys, since it's just been sitting in my palette doing nothing. I'll probably upload a Vimeo video showing some examples, but there are other things to sort out first.
This requires OSCeleton which can be downloaded here: https://github.com/Sensebloom/OSCeleton
It basically tracks your hands and maps their position relative to your torso.
I've also exposed the difference between the two hands, which allows for things like moulding 3D objects and using sliders.
So the channels you get are:
r[xyz] - Right hand x y and z value away from torso
l[xyz] - Left hand x y and z value away from torso
[xyz]_hand_diff - Difference between hands in x y and z
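To make the channel maths concrete, here's a minimal Python sketch of how those values could be derived from joint positions. The joint coordinates are made-up sample data, and taking the diff as right minus left is my assumption about the sign convention, not something the component guarantees:

```python
# Joint positions as (x, y, z) tuples -- made-up sample values,
# in the same units OSCeleton reports.
torso  = (0.0, 0.2, 2.0)
r_hand = (0.4, 0.5, 1.6)
l_hand = (-0.3, 0.1, 1.8)

def relative_to_torso(joint, torso):
    """Offset of a joint from the torso, one value per axis."""
    return tuple(j - t for j, t in zip(joint, torso))

# r[xyz] and l[xyz]: hand position away from the torso
rx, ry, rz = relative_to_torso(r_hand, torso)
lx, ly, lz = relative_to_torso(l_hand, torso)

# [xyz]_hand_diff: per-axis difference between the hands
# (assumed here to be right minus left)
x_hand_diff, y_hand_diff, z_hand_diff = (
    r - l for r, l in zip(r_hand, l_hand)
)
```

So with the sample data above, the right hand reads 0.4 to the right of the torso and 0.4 closer to the camera, and the hands are 0.7 apart on x.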
It listens on port 7110, which OSCeleton uses by default.
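For anyone curious what actually arrives on that port: OSCeleton sends each joint as an OSC /joint message carrying the joint name, a user id, and x/y/z floats. Here's a rough stdlib-only Python sketch of parsing one such datagram; the ",sifff" type-tag layout is my reading of OSCeleton's message format, so treat it as an assumption:

```python
import struct

def _read_padded_string(data, offset):
    # OSC strings are null-terminated and padded to a 4-byte boundary
    end = data.index(b"\0", offset)
    text = data[offset:end].decode("ascii")
    return text, (end + 4) & ~3

def parse_joint_message(data):
    """Parse one OSCeleton-style /joint message from a UDP datagram.

    Returns (joint_name, user_id, x, y, z), or None for other addresses.
    """
    address, offset = _read_padded_string(data, 0)
    if address != "/joint":
        return None
    tags, offset = _read_padded_string(data, offset)
    if tags != ",sifff":          # string, int32, three float32s
        return None
    name, offset = _read_padded_string(data, offset)
    user = struct.unpack_from(">i", data, offset)[0]
    x, y, z = struct.unpack_from(">fff", data, offset + 4)
    return name, user, x, y, z

# Hand-built example datagram, the same shape OSCeleton would send:
example = (
    b"/joint\0\0" + b",sifff\0\0" + b"r_hand\0\0"
    + struct.pack(">ifff", 1, 0.5, 0.25, 2.0)
)
joint = parse_joint_message(example)
```

In real use you'd bind a UDP socket on 7110 and feed every received datagram through `parse_joint_message`, keeping the latest position per joint name.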
This is for OpenNI drivers, not the official Microsoft SDK.