Kinect
Shortly after the release of the first Microsoft Kinect, Matthew Davis used OpenNI and Max/MSP to build an interface that converts Kinect tracking data to MIDI. The results of this experiment opened the door to a new world of live, computer-vision-based audio-visual controllers.

Instruments have a history of acting like reverse moldings of our bodies. The most effective ones take advantage of the body's natural motions, giving the player a set of easily accessible controls. With time and practice, muscle memory takes over and control movements become second nature.
Gaming technology has always set a standard for human-machine interaction, and the Microsoft Kinect is no exception. Using a combination of image processing and computer vision methods to track body motion, the Kinect opens up enormous possibilities for controlling machines intuitively.
Here’s a look at the Max/MSP interface, which takes OSC from OpenNI and allows for real-time monitoring, data scaling, and MIDI mapping:

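For readers who want to experiment with the same OSC-to-MIDI idea outside Max/MSP, here is a minimal sketch in Python. It assumes the python-osc and mido libraries (with a backend such as python-rtmidi), and an OSCeleton-style `/joint` message carrying a joint name, a user ID, and x/y/z coordinates, which is one common way to get OpenNI skeleton data out as OSC. The joint name, port number, and coordinate range are assumptions; adjust them to match your own OSC bridge.

```python
# Sketch: receive Kinect joint positions over OSC and map them to MIDI CCs.
# Assumes: python-osc, mido (+ python-rtmidi backend), and an OSCeleton-style
# /joint address format (name, user_id, x, y, z). Not a definitive implementation.
import mido
from pythonosc import dispatcher, osc_server

midi_out = mido.open_output()  # open the system's default MIDI output port

def scale(value, lo, hi):
    """Clamp a float to [lo, hi] and scale it to the 0-127 MIDI range."""
    value = max(lo, min(hi, value))
    return int((value - lo) / (hi - lo) * 127)

def on_joint(address, name, user_id, x, y, z):
    # Map the right hand's horizontal and vertical position to two CC lanes.
    # "r_hand" is OSCeleton's name for the right-hand joint; yours may differ.
    if name == "r_hand":
        midi_out.send(mido.Message("control_change", control=1,
                                   value=scale(x, 0.0, 1.0)))
        midi_out.send(mido.Message("control_change", control=2,
                                   value=scale(y, 0.0, 1.0)))

disp = dispatcher.Dispatcher()
disp.map("/joint", on_joint)

# 7110 is OSCeleton's default port; change it to match your OSC source.
server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 7110), disp)
server.serve_forever()
```

The scaling step is the heart of the patch described above: raw tracking coordinates rarely arrive in a musically useful range, so each joint axis gets clamped and rescaled to 0-127 before being mapped to a controller number.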