Ableton & TouchDesigner - Audio controlling visuals

Hi everyone,

I’m still pretty new at Touch and I’m having a bit of a hard time figuring out how to achieve what I want. I’ve got a basic understanding of creating a particle system as well as getting this to react to some sort of input, such as the mouse moving or an audio stream.

Here’s an example using Pan Sonic’s “Askel” from the album “A”. This is basically what I’m trying to achieve: Ableton generating the audio input and Touch reading it and creating the visual output: vimeo.com/2613364

The main thing I need is for Touch to read each input from Ableton separately. So, for example, reading the bass, drums and guitar as separate values, which I could then assign to whatever I want (i.e. the particle turbulence etc.). But I have no idea how to read these values into Touch or what I should do next.

Any help will be greatly appreciated.

Thanks

Hi,

Just a quick update. I still haven’t found a way for Touch to break up the audio it receives from Ableton, so I’ve tried two separate approaches to achieve what I want. They work… kind of.

  1. I am using an Audio In CHOP to pick up what is being sent from Ableton (as instructed on the Ableton Live and TouchDesigner wiki), then running it through several pass filters and Parametric EQs to pick out different frequency ranges. This lets Touch (roughly) distinguish between the bass, drums etc. A rough sketch of mapping the resulting band levels onto parameters follows this list.

  2. I have also resorted to exporting the different parts of the Ableton composition as separate wav files and reading them in as three different audio streams. This lets me work with each part’s parameters separately, but if I make changes in Ableton they obviously don’t register in Touch, because the audio is being read from the wav files and not from Ableton itself.
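
For reference, here’s a minimal Python sketch of the first idea, the kind of expression or script that could sit after the filter chains. All of the operator and parameter names (‘analyze_bass’, ‘particle1’, turbx and so on) are placeholders for whatever is in your own network, not anything the wiki prescribes:

    # Assumes Analyze CHOPs (e.g. set to RMS Power) named 'analyze_bass',
    # 'analyze_mid' and 'analyze_high' sitting after each filter/EQ chain,
    # and a particle operator called 'particle1'. All names are placeholders.
    bass = op('analyze_bass')['chan1'].eval()   # current bass level
    mid  = op('analyze_mid')['chan1'].eval()    # current mid level
    high = op('analyze_high')['chan1'].eval()   # current high level

    # Map each band onto whichever parameters you want to drive; check the
    # particle operator's parameter pages for the real parameter names.
    op('particle1').par.turbx = bass * 4
    op('particle1').par.turby = mid * 4
    op('particle1').par.turbz = high * 4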

So basically I’m nowhere closer to figuring it out; I’m having to find ways to cheat the program. If anyone has any suggestions on how to do this better, I’d really appreciate it.

Our approach here is to do all the discrete sound isolation and filtering within Ableton. I then created a Max4Live effect in Ableton that translates audio stream amplitudes into an OSC stream. This lets you isolate any audio track, or a selective audio bandwidth, convert it to an OSC stream and send it to Touch. It takes the audio-processing load off Touch, letting Ableton handle the audio while Touch handles strictly the visuals.
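
On the Touch side, picking up that OSC stream is just an OSC In CHOP plus a little mapping. Here’s a minimal sketch using a CHOP Execute DAT; the channel names (‘bass’, ‘drums’), the OSC In name and the target operator/parameters are my own assumptions, since the actual addresses depend on how you set up the Max4Live device:

    # CHOP Execute DAT attached to an OSC In CHOP (here called 'oscin1').
    # Channel names follow the OSC addresses the Max4Live device sends;
    # 'bass', 'drums' and 'particle1' below are assumed examples only.
    def onValueChange(channel, sampleIndex, val, prev):
        if channel.name == 'bass':
            op('particle1').par.turbx = val * 4   # bass amplitude -> turbulence
        elif channel.name == 'drums':
            op('particle1').par.turby = val * 4   # drum amplitude -> another axis
        return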

I would give you my Max4Live object, but I don’t think you can use it without owning a copy of Max4Live yourself.

I know this post is old, but I wanted to let you know about the Able Sync environment, which is now built into recent builds of TouchDesigner (since 077 went gold in November).

It requires Max4Live, but it comes with all the necessary example files (for TouchDesigner and Live) to get started. This is based on the system we used for the Plastikman world tour. You can read about it here in the wiki:
derivative.ca/wiki077/index. … le=Ableton

Cheers