I got psyched about live music visuals after seeing Amon Tobin’s ISAM show, and knew I had to dabble in it at some point. My friend and I got our feet wet in TD about a month ago, doing visuals for a party at my apartment with a Kinect, a projector, and a MIDI keyboard. Here is my favorite thing from that night, some fire I faked with feedback loops: http://www.youtube.com/watch?v=iDC7-jFUw08
After the party, we got a gig with a great band, Wildlife Control. Their tunes are so catchy that it was easy to justify spending a large chunk of my free time building content for their show: less than two weeks of working a few hours a day. Check out their incredible time-lapse music video shot on Ocean Beach in San Francisco: http://www.youtube.com/watch?v=boGyFAYomBo
The limited time frame, and this being our first exposure to live visuals, forced us to keep the scope of the project small, which luckily fit Wildlife Control's lo-fi, 8-bit aesthetic. Here is an example of something they made with the SoundCloud API: http://analogordigital.wildlifectrl.com/
We wanted to keep the extremely pixelated, reduced-color-palette look, but introduce more reactive elements. Luckily, they had very concrete, simple ideas built around four inputs read off the stage: a mic on the kick drum, one on the snare, an 88-key MIDI keyboard, and one other audio source such as vocals. Each of these controls a separate element in the visualization. Here are the four visual elements working together in a rehearsal: http://www.youtube.com/watch?v=WfM3p0xRWbc
The visuals live on a 44x24 grid, which roughly matches the 720p aspect ratio and gives one rectangle for every two keys on the keyboard. Each rectangle in the grid is rendered with some Phong shading and takes its color from a TOP.
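The two-keys-per-column mapping can be sketched outside TD. This is a minimal illustration, not the actual network; it assumes the standard 88-key MIDI range (note numbers 21–108, A0–C8):

```python
GRID_COLS = 44  # one column for every two keys on an 88-key keyboard
GRID_ROWS = 24
LOWEST_NOTE = 21  # MIDI note number of A0 on a standard 88-key keyboard

def note_to_column(note):
    """Map a MIDI note number (21-108) to a grid column (0-43)."""
    return min((note - LOWEST_NOTE) // 2, GRID_COLS - 1)

note_to_column(21)   # A0 -> column 0
note_to_column(108)  # C8 -> column 43
```

In TD this kind of lookup would more likely live in a Math CHOP or a short Python expression, but the arithmetic is the same.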
The kick drum gives the entire grid a shake with a quick restitution, so it feels punchy. The snare drum perturbs the normals of the vertices on each rectangular patch with noise, so it looks shimmery. These two elements were designed to reflect the audio qualities of the drums.
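One way to get that punchy, quick-restitution feel is a fast oscillation under a steep exponential decay. This is a hedged sketch of the idea, not Kyle's actual look dev; the amplitude, decay, and frequency values are made up:

```python
import math

def shake_offset(t, amplitude=1.0, decay=12.0, freq=40.0):
    """Grid offset t seconds after a kick hit at t=0.

    A fast sine wobble multiplied by a steep exponential envelope:
    big displacement immediately, almost nothing after ~0.3s.
    """
    return amplitude * math.exp(-decay * t) * math.sin(2 * math.pi * freq * t)
```

The snare's shimmer is analogous: add a small random vector to each vertex normal and renormalize, with the noise amount following a similar decaying envelope.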
Next, we had a basic spectrum analyzer, implemented in CHOPs, driven by the vocal mic. It picked up a fair amount of bleed from the stage, which gave the visualization a nice ambient feel.
Finally, there was the “rain,” driven by the keyboard. We played a lot with the colors, and settled on a simple palette of five colors derived from the design of the band's tickets. Each time a key was pressed, a new “droplet” in a random palette color was released from the top of the screen. This was implemented entirely in TOPs, with heavy use of feedback loops and GLSL shaders.
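The feedback-loop trick boils down to: each frame, read back last frame's image shifted down one row, then stamp new droplets into the top row. A toy grid version of that logic, with placeholder color values standing in for the actual ticket-design palette:

```python
import random

GRID_COLS, GRID_ROWS = 44, 24
PALETTE = [1, 2, 3, 4, 5]  # stand-ins for the five ticket-design colors

def step(grid, pressed_columns):
    """One frame of rain. Shifting every row down by one is the role
    the TOP feedback loop plays; the GLSL shader would then stamp a
    droplet of a random palette color for each key pressed this frame."""
    new_grid = [[0] * GRID_COLS] + grid[:-1]  # everything falls one row
    for col in pressed_columns:
        new_grid[0][col] = random.choice(PALETTE)
    return new_grid

grid = [[0] * GRID_COLS for _ in range(GRID_ROWS)]
grid = step(grid, [10])  # key press spawns a droplet in column 10
grid = step(grid, [])    # next frame: the droplet has fallen one row
```

In the real network this runs entirely on the GPU, so the full 44x24 grid updates every frame for free.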
We had a lot of worries about cabling and calibrating microphones on the day of the concert, but it all got sorted out efficiently. I had a nice little stand near the stage where I controlled the relative opacity of the different elements from my fader board, which usually meant turning the spectrum analyzer off when the piano solos started. Here is everything together, shot by my roommate: http://www.youtube.com/watch?v=YG-5C6Kkp1Q&feature=plcp
It was a great first experience using TouchDesigner in a live setting, and I’m excited to keep working with the band. Huge thanks to my buddy Kyle, who did the look dev on the drum responses and a bunch of the signal processing.
Quick plug: you should check out Wildlife Control on Facebook! We have big ideas for the visuals in the future... http://www.facebook.com/wildlifecontrol