Shortly after posting my "Getting Stuff Done" Kinect experiment to Vimeo, I got a couple of comments from people saying it would make for a nice Aphex Twin video.
The project makes use of the Kinect to obtain depth information as well as the infrared video feed. By saving previous frames to an array, I was able to create a time echo effect where your movements are repeated by additional copies of you.
Because the video feed is mapped onto geometry built from the depth map, the effect goes a step beyond standard video frame overlays. You can walk backwards and hide in the shadow of an earlier copy of yourself. It made for some amusing evenings of flailing my arms about in my apartment with the lights off.
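The frame-array echo described above can be sketched roughly like this. This is a minimal stand-in, not the actual app: the class name, the stride parameter, and the brightness-max blend are my assumptions, and the real Cinder app mapped each stored frame onto depth-based geometry rather than flat-compositing them.

```python
from collections import deque
import numpy as np

class TimeEcho:
    """Keeps a rolling buffer of past frames and composites them
    into one image, so earlier copies of you echo behind you.

    Hypothetical sketch: the original used depth-mapped geometry,
    not the flat max-blend used here.
    """

    def __init__(self, echoes=8, stride=4):
        # Store enough history for `echoes` copies spaced `stride` frames apart.
        self.frames = deque(maxlen=echoes * stride)
        self.stride = stride

    def push(self, frame):
        self.frames.append(frame)

    def composite(self):
        # Take the newest frame plus every `stride`-th older frame,
        # then blend with per-pixel max so bright bodies show through.
        selected = list(self.frames)[::-self.stride]
        out = np.zeros_like(selected[0])
        for f in selected:
            out = np.maximum(out, f)
        return out
```

With a per-pixel max blend, a brighter (closer, IR-lit) echo naturally occludes a dimmer one, which loosely mimics hiding in an earlier copy's shadow.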
Surprisingly, I received an email a short while later from WeirdCore, who was working on live visuals for Aphex Twin's New Year's Eve show in Rome. He had shown some of the videos to Aphex and they were positively received.
I ended up creating a Cinder app with 10 preset modes and various parameter controls that WeirdCore could manipulate in real time during the show. The modes ranged from simple 3D point clouds to variations of Body Dysmorphia.
WeirdCore did a fantastic job integrating the Kinect content with the other feeds consisting of visuals created with QC, MaxMSP/Jitter, VDMX, and v002. Here is some footage of the last bit of his set.
Many thanks to Richard James (Aphex Twin) and Nicky Smith (WeirdCore) for the opportunity.