Microsoft Kinect Learns to Read Hand Gestures, Minority Report-Style Interface Now Possible

Not only is the Microsoft Research Cambridge team finally releasing its 3D modeling API, Kinect Fusion; it's also bringing you gesture control, complete with mouse-style clicks and multi-touch, pinch-to-zoom interactions.

Current Kinect sensors can track the joints of your body, but not finer gestures, particularly hand motions. That's about to change: using machine learning, Kinect can now recognize open and closed hands. In the video below, Jamie Shotton, the researcher behind Kinect's skeletal modeling, shows how users can navigate a map or draw in a painting program with their hands. [Read More]
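For a rough sense of what "machine learning to recognize open and closed hands" can mean, here is a minimal, purely illustrative sketch: a randomized decision forest (the same family of classifier Shotton's team used for Kinect's body-part recognition) trained to label tiny synthetic depth patches as open or closed hands. The synthetic data generator, feature layout, and class labels are all hypothetical stand-ins, not Microsoft's actual pipeline.

```python
# Illustrative sketch only: a tiny decision-forest classifier for
# open- vs. closed-hand depth patches. The data is synthetic and the
# feature design is hypothetical; this is not Microsoft's implementation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def synthetic_hand_patch(closed, size=8):
    """Fake 8x8 depth patch (values in mm): a closed fist reads as a
    compact near blob, an open hand as a wider, sparser silhouette."""
    patch = np.full((size, size), 2000.0)          # far background
    if closed:
        patch[2:6, 2:6] = 800.0 + rng.normal(0, 10, (4, 4))
    else:
        patch[1:7, ::2] = 800.0 + rng.normal(0, 10, (6, 4))
    return patch.ravel()

# Build a small labeled training set: 1 = closed ("click"), 0 = open.
X = np.array([synthetic_hand_patch(c) for c in [True] * 100 + [False] * 100])
y = np.array([1] * 100 + [0] * 100)

# A forest of shallow randomized trees learns the open/closed distinction.
clf = RandomForestClassifier(n_estimators=20, random_state=0).fit(X, y)
print(clf.score(X, y))
```

In a real system the classifier would run per frame on depth pixels around the tracked hand joint, turning a closed-to-open transition into a "release" and the reverse into a "grab" or click.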