Sep.21.15 An Interactive Sonic Play-Space in a Maze of Netting
"When you walk in, you'll see a series of netting, mesh material… We want you to touch it. It's all very hands-on. There's six sections within this maze of netting that people can go up to each station. Within those stations my song has been, essentially, dissected and spread out across 44 speakers going around the rectangle of the space. So if you're in section one, and you are interacting with the netting and the material, the Microsoft Kinect cameras are essentially following your movements. Whatever you're doing with your arms and your hands, if you're pushing on the netting, the depth is being translated to data that goes into the master mainframe computer, which is running my song in real time on Ableton Live." Matthew Dear Discusses His Installation at the New Museum, Pitchfork
photo Drew Reynolds
DELQA is a collaboration between Matthew Dear and a talented Brooklyn-based creative team that puts visitors "inside Dear's music," letting them change and affect the composition by interacting with arrangements of netting in a gallery space. It is a playful and very clever interactive piece, conceived and produced in part by Gabe Liberti and Dave Rife of Dave and Gabe, multimedia artists who design and build interactive sound and light installations.
The musical composition is multi-layered and dynamic such that the arrangement and sound synthesis can be transformed on the fly. In an interview with Microsoft, Dear explained he "wanted to compose a sonic dreamscape, fully realizing that dreams are weird and not always black and white. The music rests between peaceful and chaotic states, allowing the listener to come up with their own story while engaging with the piece."
We had the pleasure of meeting Gabe and Dave at a recent TouchIn NYC meetup organized by Kamil Nawratil of VolvoxLabs, where Gabe presented on the project. Afterwards, we had a chance to learn more about how the project came about and how the team combined skills in architecture, acoustics, interaction design, and music to create an experience that made the audience participants in this musical realization.
Gabe Liberti: It was the most immersive experience we've been able to create yet and leveraged all of our skills from spatial sound design to real-time interactive lighting. The crowd reaction was magical because they were listening, physically exploring, and also performing the music and visuals happening all around them. It was an opportunity for us to design and install a 44-channel spatial audio system that transformed the gallery into a living world of sound.
How the project came about
Dave and I were approached by Steve Milton at the music and marketing agency Listen in early April. They had been working with Microsoft to come up with a project that would allow a musician to use the Kinect in a unique new way for a live music performance. Originally, we were contracted as consultants to think through how this could be achieved. Soon after discussing the project in greater depth, the concept evolved from a stage show to an installation experience. As this shift occurred, we saw the potential to implement a large-scale, interactive spatial audio system, something Dave and I had been wanting to do together for more than a year. It was also a unique opportunity to collaborate with an established musician to create a music-based experience from the ground up.
We put together a team made up of our friends and collaborators at New Inc: Yotam Mann for the interactive music development; The Principals for environment design and fabrication; Charlie Whitney for the C++ development with the Kinect; and Phil Sierzega for the visual design.
The search for a venue was also still ongoing. Dave and I knew there was a pop-up gallery space at the New Museum where New Inc events had been hosted in the past, so we introduced the project to New Inc director Julia Kaganskiy to see if she thought it would be a good fit. It turned out to be an ideal place to host the exhibition.
The project’s success is a tribute to Microsoft’s trust in the creative team as well as the New Museum’s willingness to engage with the shifting tides involved in producing large-scale, technology-driven work.
On Working with TouchDesigner
We've been using TouchDesigner for only a few months! Earlier this spring, we first had the chance to collaborate with the brilliant Vincent Houzé on MODULAR, and shortly after with Gabriel Pulecio on DOPPLR. It was incredibly inspiring watching both of these artists work with the system and make so many tweaks to their designs in realtime. We've used Touch on a few of our own projects since then, including Smoke Machine and the Barkbox Firework. It's already become an invaluable tool and an essential component of our future projects.
Inspirations for the project and some of the technical underpinnings
As music interaction developers, Dave Rife, Yotam Mann (creator of the web audio framework Tone.js), and I worked with Dear directly to transform his composition from a static piece into an interactive one. We began by organizing the parts of his arrangement into clips of varying complexity. This allowed us to transition from playing simpler parts when no interaction was taking place to parts of increasing density for more active inputs.
Clips organized in order of rhythmic complexity in the Ableton Live session
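The mapping from interaction intensity to clip choice can be sketched as a simple lookup over a complexity-ordered list. This is a hypothetical illustration, not the installation's actual code; the clip names and the idea of a single normalized activity value are assumptions for the sake of the example.

```python
# Hypothetical sketch: pick a clip by interaction intensity.
# Clip names are illustrative, ordered from sparse to rhythmically dense.
CLIPS = ["sparse", "medium", "dense", "frenetic"]

def clip_for_activity(activity: float) -> str:
    """Map a normalized activity level (0.0-1.0) to a clip name."""
    # Scale into the list and clamp so activity == 1.0 stays in range.
    idx = min(int(activity * len(CLIPS)), len(CLIPS) - 1)
    return CLIPS[idx]
```

With no interaction (`activity = 0.0`) the sparsest clip plays; as visitors push harder on the netting, progressively denser clips are triggered.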
We also tweaked synth parameters so they could transform with a given input, such as increasing sustain on the arpeggios and changing filter settings in the drum machine VST. In addition, we created various 3D trajectories in Max/MSP to alter the placement and spatialization of the sounds inside the array of speakers. So even with a simple interaction, the audience could create complex, multidimensional modifications to each of the sounds.
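A 3D trajectory of the kind described above is, at its simplest, a stream of coordinates fed to the spatializer. As a minimal sketch (not the team's Max/MSP patch), here is a generator for a horizontal circular path around the listener; the radius and height values are illustrative assumptions.

```python
import math

def circular_trajectory(steps: int, radius: float = 2.0, height: float = 1.5):
    """Yield (x, y, z) points tracing a horizontal circle around the listener.

    Each point could be sent to a spatial audio engine as the moving
    position of a sound source. Units are arbitrary (e.g. meters).
    """
    for i in range(steps):
        theta = 2 * math.pi * i / steps  # evenly spaced angles around the circle
        yield (radius * math.cos(theta), radius * math.sin(theta), height)
```

More elaborate paths (spirals, darting "insect" motions) follow the same pattern: a function of time emitting positions for the panner to chase.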
The environment, designed and fabricated by experimental architecture studio The Principals, consists of a series of stretched mesh and net structures within a clean aluminum frame. As attendees climb and push on these forms, their movements and presses are tracked by an array of eight Microsoft Kinects via a Cinder application written by Charlie Whitney. The positional data from these interactions was networked together and sent via OSC to the Max + Ableton software to drive the musical and spatial sound outputs, and also to the TouchDesigner app I wrote to control the reactive lighting.
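OSC, the protocol carrying that positional data, is a simple binary format: a null-padded address string, a type-tag string, then the arguments in big-endian form. As a rough sketch of what one depth reading might look like on the wire (the `/zone/depth` address is a hypothetical name, not the installation's actual address space), a single-float message can be encoded with the standard library alone:

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a one-float OSC message: padded address, ",f" type tag, big-endian float."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# The resulting bytes would be sent over UDP, e.g. with
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, (host, port)).
msg = osc_message("/zone/depth", 0.5)
```

In practice a library such as python-osc (or Max's and TouchDesigner's built-in OSC objects, as used here) handles this encoding for you.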
The 44-channel audio system layout
photo Drew Reynolds
This first-person walkthrough of the DELQA installation, recorded with binaural microphones, gives quite a good idea of the experience. Important: headphones required!
The tracks from Ableton are routed through an internal audio bus via JACK to Max/MSP for spatial processing, then piped out over the 44-channel loudspeaker system via two MOTU 828x audio interfaces. The sounds were carefully mixed with regard to their trajectories, levels, and acoustic reverberance in the room, leveraging IRCAM's SPAT object configured for Vector Base Amplitude Panning (VBAP). Some sounds become larger and more reverberant as the mesh is pressed; others are focused and follow users' positions in space. A portion of the composition travels quickly and sporadically throughout the gallery, giving listeners the feeling of an insect flying by their ear.
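The core idea of VBAP is that a virtual source direction is reproduced as a weighted sum of the nearest loudspeakers' direction vectors, with the gains then normalized for constant power. SPAT handles this across the full 44-channel rig; as a minimal 2-D sketch of the underlying math (two speakers, angles in degrees, all values illustrative):

```python
import math

def vbap_pair_gains(src_deg: float, spk1_deg: float, spk2_deg: float):
    """2-D VBAP: gains for a speaker pair so their weighted sum points at the source."""
    p = (math.cos(math.radians(src_deg)), math.sin(math.radians(src_deg)))
    l1 = (math.cos(math.radians(spk1_deg)), math.sin(math.radians(spk1_deg)))
    l2 = (math.cos(math.radians(spk2_deg)), math.sin(math.radians(spk2_deg)))
    # Solve p = g1*l1 + g2*l2 by inverting the 2x2 speaker-vector matrix.
    det = l1[0] * l2[1] - l1[1] * l2[0]
    g1 = (l2[1] * p[0] - l2[0] * p[1]) / det
    g2 = (l1[0] * p[1] - l1[1] * p[0]) / det
    norm = math.hypot(g1, g2)  # normalize so perceived loudness stays constant
    return g1 / norm, g2 / norm
```

A source exactly between two speakers gets equal gains; a source at a speaker's position gets that speaker alone. Full 3-D VBAP extends this to speaker triplets with a 3x3 matrix inverse.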
The TouchDesigner light animations reinforced the musical interactions and the overall progression of the song. Twelve DMX-controlled LED bar fixtures and four wash lights, controlled via Arduino with Firmata, were hung above the various interactive areas. The colors, brightness, and size of each of the lights were choreographed to shift in intensity when its corresponding zone was activated. The design of this responsive visual system was done in collaboration with Phil Sierzega.
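For lighting like this, zone activity is typically smoothed before being written to a DMX channel, so fixtures ramp rather than snap between states. A hypothetical sketch of that pattern (the smoothing constant and the one-pole approach are assumptions, not details from the installation's TouchDesigner network):

```python
def smooth_intensity(prev: float, target: float, alpha: float = 0.2) -> float:
    """One-pole smoothing: ease the current intensity toward the zone's target."""
    return prev + alpha * (target - prev)

def to_dmx(intensity: float) -> int:
    """Clamp a 0.0-1.0 intensity into an 8-bit DMX channel value (0-255)."""
    return max(0, min(255, round(intensity * 255)))
```

Each frame, the zone's activation sets the target, `smooth_intensity` eases toward it, and `to_dmx` produces the byte written to that fixture's channel.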
TOP to CHOPs
Above Diagrams Designed by Phil Sierzega
The installation ran from August 6th to 9th and was just the beginning of what's possible when combining realtime animation with 3D audio for interactive experiences. I hope many more musicians will be inspired to compose for experiences like this one!
About Dave and Gabe
Gabe is a multimedia artist and experience designer. With Dave Rife, he creates interactive sculptures and immersive worlds using sound and light. As an audio engineer, he produced and mastered recordings for artists Anamanaguchi, Nullsleep, and George & Jonathan. For the Criterion Collection, he restored soundtracks from the greatest films from around the world.
Dave is a multimedia artist, designer, and engineer with a background in architectural acoustics, physics, and music. In a previous life, Dave spent a number of years designing performing arts centers and cultural buildings across the globe with the building engineering design firm Arup. He's opened three concert halls, collaborated with a range of artists including Lou Reed, Ai Weiwei, and Maya Lin, and has spent the last year building a portfolio of interactive projects with Gabe Liberti.