Concept: Place users at the center of a musical interaction that live-edits a film projection-mapped onto the instrument itself.
A simple setup: one microphone, an overhead projector, and a xylophone. The mic listens for each of the 9 notes on my xylophone (actually called a "Freenotes," made by the man who builds these in Colorado). I tuned biquad EQ filters in Max to narrow frequency bands centered on each note's fundamental, so that when Max registers that a note has been struck above a certain volume threshold, it lets that column's video pass through for as long as the note rings out above the threshold.
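The detection logic can be sketched outside of Max as well. Below is a rough Python illustration of the idea: a narrow bandpass biquad (coefficients from the standard RBJ audio EQ cookbook formulas) centered on one note's fundamental, followed by a simple envelope follower and threshold. The class name, Q value, threshold, and envelope decay are my assumptions for illustration, not values from the actual patch.

```python
import math

def bandpass_biquad(fs, f0, q):
    """RBJ cookbook bandpass (0 dB peak gain) biquad coefficients."""
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b0, b1, b2 = alpha, 0.0, -alpha
    a0, a1, a2 = 1 + alpha, -2 * math.cos(w0), 1 - alpha
    # normalize so the feedback coefficient a0 becomes 1
    return (b0 / a0, b1 / a0, b2 / a0, a1 / a0, a2 / a0)

class NoteGate:
    """Passes the mic signal through a narrow bandpass centered on one
    note's fundamental; the gate reads 'open' while the filtered level
    stays above a volume threshold (hypothetical parameter values)."""

    def __init__(self, fs, f0, q=30.0, threshold=0.05):
        self.b0, self.b1, self.b2, self.a1, self.a2 = bandpass_biquad(fs, f0, q)
        self.x1 = self.x2 = self.y1 = self.y2 = 0.0
        self.env = 0.0
        self.threshold = threshold

    def process(self, x):
        # direct-form-I biquad filter step
        y = (self.b0 * x + self.b1 * self.x1 + self.b2 * self.x2
             - self.a1 * self.y1 - self.a2 * self.y2)
        self.x2, self.x1 = self.x1, x
        self.y2, self.y1 = self.y1, y
        # peak-hold envelope follower with slow decay
        self.env = max(abs(y), self.env * 0.999)
        return self.env > self.threshold  # True -> reveal this column
```

One gate per pad, each tuned to a different fundamental, gives the 9 independent note detectors; a mallet strike near a gate's center frequency opens it, while energy at the other notes' frequencies is attenuated by the narrow band and leaves it closed.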
The video plays in the background on loop, waiting for its 9 column divisions to be revealed by the user's mallet hits. The source video in the first clip is the epic 1992 film Baraka, and in the bottom user-test clip it's the equally grand 1982 film Koyaanisqatsi (sped up to 5x and edited for preference).
The 9 columns of video are projection-mapped onto the 9 pads of the xylophone. All you see is the xylophone, the mallets, and the microphone. The whole piece came together quickly and wouldn't have been possible without the help of Professor Luke Dubois and the valuable info and user feedback from classmate Or Fleisher and ITP Research Resident Matt Romein. I am indebted to these great humans for their help!