Michela: This is our third production shipped using TouchDesigner but the first with 088. The inclusion of Python has been a game-changer, allowing us to re-use parts of our production toolchain and web apps.
The inception of the project was a week of research I did to look at new distribution opportunities for the ACO. We started prototyping the concept with the Unity game engine and moved to TouchDesigner as it became clear we needed more video playback capabilities than Unity provided. We were fortunate to land a couple of smaller gigs in the lead-up to ACO VIRTUAL which gave us the opportunity to evaluate the OPs (Operators) we would need for this show. I'd been tracking TouchDesigner for many years looking for the right moment to dive into it.
One of the biggest challenges of the production was maintaining sync between the performances recorded on a gigantic sound stage at Fox Studios and the real-time performance engine. For the former, sound designer Simon Lear (bsound) brilliantly mastered a set of recordings. He had to contend with the speed of sound - the on-set clap was recorded at noticeably different points depending on which mic you listened to. He then built a real-time audio processing system that can be tuned for each venue and respond to remix cues passed from our tablet 'controller stand' to TouchDesigner and on to Plogue Bidule via OSC.
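The OSC hand-off in that cue chain can be sketched in a few lines of Python. This is a minimal, hand-rolled OSC message encoder for illustration only - the `/remix/solo` address, the host name, and the port are invented placeholders, and a real show would more likely use TouchDesigner's built-in OSC operators or a library such as python-osc:

```python
import socket
import struct

def _osc_string(s):
    """OSC strings are NUL-terminated and padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def osc_message(address, *args):
    """Encode an OSC message with int, float, or string arguments."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        else:
            tags += "s"
            payload += _osc_string(str(a))
    return _osc_string(address) + _osc_string(tags) + payload

# Hypothetical cue: solo musician 3 at 80% level, bound for the audio machine.
msg = osc_message("/remix/solo", 3, 0.8)
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(msg, ("audio-machine.local", 9000))  # placeholder host/port
```

Because OSC rides on plain UDP datagrams like this, each venue's audio machine can listen for cues without any coupling to the video cluster beyond the network.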
With invaluable support and guidance from Ben Voigt at Derivative, I spent almost six months refining the show, starting with an extremely rudimentary experience prototype that could manage 2FPS and ending up with a cluster of three TouchDesigner instances that made heavy use of the Pro-only Sync CHOP to deliver an interactive video running at around 30FPS, which our orchestra partners were happy with as a representation of their performance.
Having multiple TouchDesigner instances on a show working in tandem to deliver large-scale visuals and high-end audio was a revelation. The project involved learning many new features of TouchDesigner and feeding back beta test experiences to the Derivative team. Months were spent performance tuning. To hit our target frame rate of 25FPS with up to 39 videos composited at a time (on the Roger Smalley piece - 2 videos for each of 13 musicians, plus 13 scrolling scores) we relied on the GPU Affinity feature to bind a separate TouchDesigner instance to each graphics card and very careful mapping of cables to displays to ensure that load was spread equally across the two Quadros.
Musicians and Scores
The project goal of producing a tourable show made everything more challenging. So far we've staged ACO VIRTUAL in four venues, so developing effective bump-in, remote support, and bump-out processes has been critical. We made heavy use of git (version control) to manage not just the source code and project files but also wiki documentation recording troubleshooting and show setup procedures, which has so far been a lifesaver on some of the finer points of how we've used TouchDesigner - from "don't simply drag and drop Window displays without reference" to "which graphics cards do what"!
To streamline and de-risk TouchDesigner development we opted to push a number of functions outside of that environment. The bulk of the aesthetic rendering decisions that could potentially have been made in real time in TouchDesigner with shaders were instead handled by a fairly traditional non-real-time VFX render pipeline running in parallel (Fusion & Lightwave for the musicians, After Effects for the animated score). We also shunted all audio DSP processing to a separate machine so that the audio crew could work in parallel, tuning the mixes to each venue. This helped get the end-to-end system up and running on schedule, leaving us room to refine and polish the experience before launch.
I was acutely aware that our specification had not been tested before (e.g. the number of synced audio and video files), so we planned for worst-case performance, and it was a huge relief when we exceeded the required benchmarks. No doubt version two will take further advantage of real-time capabilities.
The new Python scripting workflow has been of great value. I was pleasantly surprised by how straightforward it was to incorporate third-party Python modules, which sped up our workflow in several areas. For example, the ACO VIRTUAL data model, including the mappings between each musician, the musical parts they play, and their position on screen, was made available to the team via a web CMS. The TouchDesigner instances, the tablet controller, and the mobile companion apps for iOS and Android all had API access to the same show-control web app running on our production cloud platform, Rack&Pin. This made editorial updates and layout configuration a doddle. It can be fun zooming around TouchDesigner networks with a scroll wheel, but auto-configuration via JSON feeds no doubt saved me some repetitive strain injury. As of February 2014 we will have two ACO VIRTUAL shows on tour at the same time, all supported by our platform.
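The JSON-driven configuration pattern can be sketched as follows. The feed shape and field names below are invented for illustration (the actual Rack&Pin schema isn't shown here), but the idea is the same: fetch the show's data model once and map musicians and parts to on-screen positions, rather than hand-editing parameters across the network.

```python
import json

# Invented feed shape standing in for the real Rack&Pin CMS response.
feed = json.loads("""
{
  "piece": "Smalley",
  "musicians": [
    {"name": "violin1", "part": "Violin I", "screen": {"x": 0.10, "y": 0.50}},
    {"name": "cello1",  "part": "Cello",    "screen": {"x": 0.70, "y": 0.50}}
  ]
}
""")

# Build a single lookup of musician -> screen position from the feed.
layout = {m["name"]: (m["screen"]["x"], m["screen"]["y"])
          for m in feed["musicians"]}

# Inside TouchDesigner the mapping would then drive operator parameters,
# e.g. one transform per musician (operator names here are hypothetical):
# for name, (x, y) in layout.items():
#     op(name + "_xform").par.tx = x
#     op(name + "_xform").par.ty = y
```

Because every client (TouchDesigner, the tablet controller, the companion apps) reads the same feed, an editorial change in the CMS propagates everywhere without touching the networks by hand.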
All in all it was a great experience working with TouchDesigner, and I couldn't recommend the support we received from Derivative more highly. Having shipped our virtual tour system, we're now looking forward to creating new interactive experiences and storytelling opportunities that you can mess with on the fly.