It features emerging patterns from a Slime Mold simulation, along with other agent-based behaviors.
Other than editing the footage and extracting motion vectors in Nuke, the entire project was done in TouchDesigner, which was fun and let me tweak and fine-tune things in real time.
So interesting!
I would be curious to understand how it works, since I’m doing some R&D around liquids (as you can see on my Instagram [url]https://www.instagram.com/hello_im_flo/[/url]) and would like to recreate it in TD…
Hey,
At the bottom of the Vimeo description I have a bunch of references and links to papers that describe the logic behind the slime mold simulation.
Unfortunately, the project file of the music video depends heavily on a bunch of external files that are too big to upload and share. But a while ago I uploaded a simple TD slime mold example to the Facebook TouchDesigner Help Group. Search for GLSL_compute_slimeMold.toe.
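For anyone curious before digging into the papers: the agent logic they describe can be sketched in plain Python/NumPy. All parameter names and values below are illustrative, not taken from the .toe file, which does the same thing on the GPU in a GLSL compute shader:

```python
import numpy as np

# Minimal slime mold (Physarum) step: sense -> steer -> move -> deposit -> decay.
# Grid size, agent count, and all constants are made-up illustrative values.
W = H = 64
N = 256
rng = np.random.default_rng(0)

pos = rng.uniform(0, W, size=(N, 2))      # agent x, y positions
ang = rng.uniform(0, 2 * np.pi, size=N)   # agent headings
trail = np.zeros((H, W))                  # pheromone / trail map

SENSE_DIST = 4.0    # how far ahead each agent samples the trail
SENSE_ANGLE = 0.4   # angular offset of the left/right sensors (radians)
TURN = 0.3          # steering step per frame
SPEED = 1.0
DEPOSIT = 1.0
DECAY = 0.95        # (real implementations usually also diffuse the trail)

def sample(p):
    """Read trail values at (toroidally wrapped) positions p, shape (N, 2)."""
    x = np.mod(p[:, 0].astype(int), W)
    y = np.mod(p[:, 1].astype(int), H)
    return trail[y, x]

def step():
    global pos, ang, trail
    # 1. Sense: three samples ahead of each agent (forward, left, right).
    def ahead(offset):
        a = ang + offset
        return pos + SENSE_DIST * np.stack([np.cos(a), np.sin(a)], axis=1)
    f = sample(ahead(0.0))
    l = sample(ahead(+SENSE_ANGLE))
    r = sample(ahead(-SENSE_ANGLE))
    # 2. Steer toward the strongest trail reading.
    ang = np.where((f >= l) & (f >= r), ang,
                   np.where(l > r, ang + TURN, ang - TURN))
    # 3. Move, wrapping around the edges.
    pos = np.mod(pos + SPEED * np.stack([np.cos(ang), np.sin(ang)], axis=1),
                 [W, H])
    # 4. Deposit onto the trail map, then decay it.
    x = pos[:, 0].astype(int)
    y = pos[:, 1].astype(int)
    np.add.at(trail, (y, x), DEPOSIT)
    trail *= DECAY

for _ in range(100):
    step()
```

The emergent network patterns come entirely from the feedback loop between the agents and the trail map they read and write.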
Do you have any good resources you’ve used to learn more about syncing things in TouchDesigner with music? I’m a Touch beginner and there are two areas I’m unsure about:
Triggering different things based on time. I have a rough idea of how to do this: based on a trigger, you could fade between different containers in Touch, or trigger different effects.
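The scheduling logic behind that kind of cue-based triggering can be sketched in plain Python (the cue times and names here are made up; in TD the resulting index/blend would drive something like a Switch TOP and a Cross TOP):

```python
# Cue list: (start time in seconds, name of the container/effect to show).
# Hypothetical values for illustration only.
CUES = [(0.0, 'intro'), (8.0, 'slime'), (20.0, 'outro')]
FADE = 2.0  # crossfade length in seconds after each cue point

def active_blend(t):
    """Return (from_name, to_name, blend 0..1) for timeline time t."""
    idx = 0
    for i, (start, _) in enumerate(CUES):
        if t >= start:
            idx = i
    name = CUES[idx][1]
    # Within FADE seconds after a cue, fade in from the previous container.
    if idx > 0 and t - CUES[idx][0] < FADE:
        blend = (t - CUES[idx][0]) / FADE
        return CUES[idx - 1][1], name, blend
    return name, name, 1.0
```

The key point is that everything is a pure function of one timeline clock, so scrubbing the timeline always lands you in a consistent state.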
A bigger problem is syncing visuals to audio. In Touch I’m having trouble “scrubbing” through audio: for example, if I pause the timeline and step back a frame, the audio still plays forwards. That has made it hard to work in isolation on an animation tied to a specific part of a track.
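On the scrubbing issue: what usually fixes this in TD (if I remember right) is an Audio File In CHOP with its play mode locked to the timeline, so the audio position follows the timeline instead of free-running. The underlying arithmetic is just mapping timeline frames to audio sample indices, which is easy to sketch in plain Python (the sample rate and FPS below are assumptions):

```python
# Assumed project settings for illustration.
SAMPLE_RATE = 44100  # audio samples per second
FPS = 60             # timeline frames per second

def audio_window(frame, frames_per_chunk=1):
    """Sample range of the audio belonging to a given timeline frame.

    Because the window is derived from the frame number, stepping the
    timeline backwards moves the audio window backwards too - which is
    exactly the scrubbing behavior you want."""
    start = round(frame * SAMPLE_RATE / FPS)
    end = round((frame + frames_per_chunk) * SAMPLE_RATE / FPS)
    return start, end
```

With that mapping, frame 60 always corresponds to exactly one second into the track, no matter which direction you arrived from.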