
Gravity: TouchDesigner Works in Zero-G

"We relied heavily on TouchDesigner after an initial test period in Leavsden Studios in the fall of 2010, and developed a very flexible and powerful realtime pipeline to drive some of the major aspects of shooting the film. Think ICT's Lightstage, but real-time, interactive, fast and stable - that should give you a hint! I worked with both director Alfonso Cuaron and DoP Emmanuel "Chivo" Lubezki with this system, as well as with our overall show supervisor Tim Webber. We found TouchDesigner to be extremly flexible, and pushed it further into the other stages too. By the end of a 5 month long shoot we were using it every day."
-- Theodor Groeneboom, Framestore

"We were using TouchDesigner for lots of different things, we were master control for the entire stage with the lightbox driven by the robot's timeline. Theo lit the movie using TouchDesigner - 70% of the film's live action was lit through TouchDesigner.

The movie was run through TouchDesigner as well. Our real-time visualization system, where we could superimpose graphics over the lens view to make sure everything was lining up, was all realised through TouchDesigner. We had on-set repositioning, on-set color correction - a whole visualization system run through TouchDesigner. There was script supervision for the Assistant Director: all the notes called out during the shot - for example "grab left", "look right", "spaceship coming in" - were on a TouchDesigner timeline that we controlled.

All of the audio playback was slaved to our TouchDesigner system as well so we would be sending out MIDI triggers through TouchDesigner to the sound department.

We designed and built an entire set-wide ecosystem of production around TouchDesigner."

-- Jeff Linnell, Bot & Dolly

One of Gravity's remarkable achievements is that it sets a new milestone in the "suspension of disbelief" - possibly the greatest factor that continues to drive people to the silver screen. It does so with the kind of ground-breaking innovation that will not only transform the way films are made, but also liberate our ability to tell stories that were hitherto thought impossible to translate from imagination to moving images.
It is now part of the film's folklore that director David Fincher had counselled Cuaron that the technology necessary to make Gravity didn't yet exist and advised the filmmaker to wait another five years. Cuaron forged ahead and it did indeed take 4.5 years to make the film.
With respect to the technology central to the filmmaking process, what differentiates Gravity from any other film is that an unprecedented amount of digital production and visual effects was integrated on-set in real-time (rather than in post-production) through the ingenious combination and use of these newly devised and appropriated technologies.
The making of Gravity is an epic story of its own, and TouchDesigner plays a diverse role in it, as we learn here in a technically detailed account from Framestore's Theodor Groeneboom, on-set technical director and sequence lead compositor on Gravity.

Early Days > Pre-vis

Back in early 2010, Alfonso Cuaron approached Framestore's VFX supervisor Tim Webber about an ambitious sci-fi project named Gravity. The two had previously worked together on Harry Potter and the Prisoner of Azkaban and Children of Men. But this time Alfonso was keen on keeping visual effects involved from the very beginning. The two would discuss ideas on how to solve some of the more complicated issues with the script, namely very long shots and the lack of gravity in the film.

We set up an office for Alfonso and producer David Heyman at Framestore as we knew the only way this movie could be made was with a very close collaboration between the film-makers and the visual effects artists. Framestore was involved in every aspect of the film, from planning, to pre-production, to the film's shoot and the final post-production.

At the beginning of the project, Alfonso would sit down with our pre-vis team and work out the cinematic narrative of the film - the length of shots, the pacing of the story. We developed a virtual camera system that Alfonso could use on a virtual mo-cap stage. He'd then move around and capture the performance for our animators, giving Tim and the team an insight into how he wanted to present the film. The pre-vis would be worked up to a very high quality in terms of animation, camera movements and storytelling. This was a very important step to get right, as we needed to move the data into the next stage.

Pre-vis > Pre-light

We moved the pre-vis into a stage called pre-light, a process where we would light the pre-vis using a physically accurate renderer in order to get an idea of the kind of lights and look we would have to come up with once we started principal photography. Alfonso brought his long-time collaborator and cinematographer Emmanuel "Chivo" Lubezki on board to light the film, and he was heavily involved with our CG lighting. He would spend hours at Framestore working with our lighters and teaching them about traditional lighting and cinematography techniques. The quality of the pre-light gave us a blueprint for how the lights should look on set.

Towards a Working Methodology

Parallel to the pre-light stage, people at Framestore had been busy thinking about ways to shoot the film and create a working methodology, from spinning Tim around in an office chair, to using robots and motion control set-ups and traditional wire rigs. NASA's parabolic flights were also tested, but deemed unfit and too expensive.

The major challenge for the film seemed to centre around a few issues: "how do we spin actors around at the speeds required by the pre-vis and pre-light, without compromising safety or quality?" and "how do we spin lights and cameras around them at the same time?"

The solution seemed quite simple: "we don't".

Instead, we opted for an idea where we would spin the world around the actors and let them be stationary on stage. Tim and Chivo had the idea to use a large array of LED lights: with the lights from the pre-light projected onto the actors, we would be able to simulate the high speeds of their movements and their relationship to the sun and the earth.

Prototyping 'the lightbox'

In the autumn of 2010 we went to Leavesden Studios and built an early prototype of the LED array, dubbed "the lightbox". Patrick Lowry and I (Theodor Groeneboom) were tasked with setting up the lightbox and finding ways to drive the data from the pre-light into the lightbox.

We decided we wanted to drive the lightbox with something that was compatible with the data coming from Maya, had compositing controls, and, most importantly, would work in real-time in order to accommodate changes from the DoP and VFX supervisor during the shoot.

The animation data from Maya was then fed into TouchDesigner, where we would set up textures and geometry to mimic the light from the sun and the bounce from the earth. This would then be rendered in real-time and projected as light in the lightbox.
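
The exact network Framestore built isn't detailed here, but as a rough sketch of what that kind of hookup looks like in TouchDesigner's Python, the example below reads imported per-frame animation channels and applies them to a stand-in sun geometry and light on every frame. All operator and channel names are hypothetical.

```python
# Sketch only: hypothetical operator/channel names, not Framestore's actual setup.
# Assumes the Maya shot animation has been imported as CHOP channels (read by a
# CHOP named 'shot_anim'), and that a Geometry COMP 'sun_card' and a Light COMP
# 'sun_light' feed the Render TOP mapped onto the lightbox panels.
# Intended to live in an Execute DAT's onFrameStart callback.

def onFrameStart(frame):
    anim = op('shot_anim')          # imported per-frame shot animation

    # Position the card/texture that stands in for the sun in the lightbox render.
    sun = op('sun_card')
    sun.par.tx = anim['sun_tx'].eval()
    sun.par.ty = anim['sun_ty'].eval()
    sun.par.tz = anim['sun_tz'].eval()

    # Drive the matching light's brightness and colour from the pre-light data.
    light = op('sun_light')
    light.par.dimmer = anim['sun_intensity'].eval()
    light.par.cr = anim['sun_r'].eval()
    light.par.cg = anim['sun_g'].eval()
    light.par.cb = anim['sun_b'].eval()
    return
```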

The early tests proved very fruitful and it was established early on that this was a technique that would work for the majority of the shots in the film where the actor is in a spacesuit, as we were only interested in the lighting and performance of their faces.

We still needed motion-controlled cameras to be in sync with the lighting and collaborated with the San Francisco-based Bot & Dolly for all of our robot needs. They were using modified car-manufacturing robots for a motion control set-up, and had developed tools that let them run Maya animation on the robots via TouchDesigner.

The last piece of the puzzle was to create special effects rigs that would puppeteer and move the actors around in sync with the lights and the robots. The movements would not have to be very large, as most were compensated by the camera. We worked together with Neil Corbould and Manex Effram on creating various SFX rigs.

Bringing it all Together

By early 2011 we had worked out the methodology for shooting the film, and we moved our Framestore shoot crew to Shepperton, where we set up the stages and opened a small "back office". Jeff Linnell and Asa Hammond from Bot & Dolly joined us on set together with their three big robots.

Live Recording System

TouchDesigner was also used to create a playback and recording system that would sync our pre-light QuickTime with our tech-viz QuickTime and the actual in-camera footage, live. We could then always refer back to this recording and see whether the data lined up, or whether the actress's eye lines were doing the right thing.
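
Framestore's actual recording setup isn't described in detail, but the general idea of slaving reference QuickTimes to a master shot clock can be sketched in a few lines of TouchDesigner Python. Operator names below are hypothetical, and the Movie File In TOPs are assumed to have their Play Mode set to "Specify Index" so they can be scrubbed from one clock.

```python
# Sketch only: hypothetical operator names; each reference movie is assumed to be
# loaded in a Movie File In TOP set to "Specify Index", so it can be driven from
# one master shot clock alongside the live camera feed.
# Intended to live in an Execute DAT's onFrameStart callback.

def onFrameStart(frame):
    shot_start = op('shot_info')['start_frame'].eval()   # hypothetical shot metadata CHOP
    shot_frame = max(0, frame - shot_start)

    # Keep the pre-light and tech-viz QuickTimes locked to the same shot frame.
    for name in ('prelight_qt', 'techviz_qt'):
        op(name).par.index = shot_frame
    return
```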

In the weeks prior to principal photography we prepared all the shots, making them ready to be put in the lightbox and on the robots. We would receive a "shoot package" from our on-set tech-viz team containing all the data needed to set up a specific shot: the animation locators and data for the lightbox, the Maya animation for the robots, and the pre-light for reference, along with any specific notes.

Pre-Test and Safety

We would then run validation moves on each one of the shots to make sure they were safe to run with actors, and that all the elements would match the pre-vis and pre-light as expected.

More Specifics about the Lightbox

The lightbox was an array of 5x5x5 LED panels arranged in a cubic form; each panel consisted of 64x64 RGB light diodes, resulting in a whopping 1.5 million lights in total. The lightbox featured some panels that could be opened to let the camera go in and out of the box when needed. We created special shaders in TouchDesigner that "squish" our lights around the open doors, to avoid losing too much light if the sun or the earth was covering a door.
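
Those shaders themselves aren't reproduced here, but the idea of "squishing" light around a gap can be illustrated with a small standalone Python/NumPy sketch: the lighting image for one wall is resampled so that the light that would have landed on an open door is compressed onto the neighbouring panels instead of being dropped.

```python
# Standalone illustration of the "squish" concept, not Framestore's GLSL shader.
import numpy as np

def squish_around_door(src, door_start, door_end):
    """src: (H, W, 3) linear lighting image for one wall of the lightbox.
    door_start/door_end: column range of the open door (no LEDs there).
    Returns an image where the door columns are black and the remaining
    columns resample the *entire* source width, so no light is discarded."""
    h, w, _ = src.shape
    out = np.zeros_like(src)

    # Lit columns, in order, skipping the door gap.
    lit_cols = np.r_[0:door_start, door_end:w]

    # Spread the full source width across only the lit columns.
    sample_x = np.linspace(0, w - 1, len(lit_cols)).round().astype(int)
    out[:, lit_cols] = src[:, sample_x]
    return out
```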

On-Set Lighting of Actors' Faces - "Reverse Image-Based Lighting"

Even though everything was pre-planned and pre-programmed in the pre-vis, pre-light and tech-viz, we knew we had to build something really flexible to accommodate changes happening on set. The lighting might be perfect in the pre-light, but the light from the lightbox would look slightly different on real human skin, and the DoP wanted full control in order to get the best possible lighting on the actors while making sure it still respected the pre-light and the preplanned set-up.

To accommodate this, the lightbox ran at 60 frames per second with a full linear color pipeline and a special look-up table at the output stage that compensated for both the LED and the camera response curves. This way we could accurately adjust lights using the DoP's language. He would ask the TouchDesigner operator to take a certain light up or down in f-stops, change the color temperature or size of lights, or even add new lights to the scene, animate them on or off, or create virtual black flags to block other lights.
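
The measured LUTs and colour pipeline were Framestore's own, but the arithmetic behind a request like "take that light down half a stop" is simple to sketch: each f-stop is a doubling or halving of linear intensity, followed by an output transform that compensates for the panels' response. The Python below is only an illustration, with a placeholder gamma standing in for the real measured LED/camera curves.

```python
# Sketch only: the response compensation here is a placeholder gamma, standing in
# for the measured look-up table described above.
import numpy as np

def adjust_stops(linear_rgb, stops):
    """Scale a linear-light RGB value by a number of f-stops
    (+1 stop doubles the intensity, -1 stop halves it)."""
    return linear_rgb * (2.0 ** stops)

def output_lut(linear_rgb, led_gamma=2.2):
    """Placeholder output transform: compensate the panels' response so the
    requested linear values produce the intended light on the actors."""
    return np.clip(linear_rgb, 0.0, 1.0) ** (1.0 / led_gamma)

sun = np.array([1.0, 0.95, 0.85])       # hypothetical linear light colour
sun = adjust_stops(sun, -0.5)           # "take the sun down half a stop"
panel_value = output_lut(sun)           # value actually sent to the LED panels
```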

On a few specific shots, we rendered 360-degree CG renders of the ISS interiors and brought the QuickTimes into TouchDesigner. These were sometimes used as ambient light, and we would add our own TouchDesigner lights on top of this for real-time adjustments. We ported some of our re-lighting tools from Nuke over to TouchDesigner using GLSL, so we had some control over the pre-rendered lighting and could do basic re-lighting when needed.
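
The ported Nuke tools aren't documented here, but the kind of basic re-lighting a normals pass makes possible can be sketched outside TouchDesigner as well. The Python/NumPy below adds a hypothetical Lambertian light on top of a pre-rendered linear frame; it illustrates the technique, not Framestore's GLSL code.

```python
# Sketch only: basic Lambertian re-lighting from a pre-rendered normals pass.
import numpy as np

def relight(beauty, normals, light_dir, light_rgb, gain=1.0):
    """beauty: (H, W, 3) pre-rendered linear frame.
    normals: (H, W, 3) world-space normals pass in [-1, 1].
    light_dir: direction *towards* the new light; light_rgb: its linear colour."""
    l = np.asarray(light_dir, dtype=np.float64)
    l /= np.linalg.norm(l)

    # Lambert term per pixel, clamped so back-facing surfaces get no light.
    ndotl = np.clip((normals * l).sum(axis=-1, keepdims=True), 0.0, None)
    return beauty + gain * ndotl * np.asarray(light_rgb)
```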

Almost a full year (and countless accolades) after the release of Gravity, we still feel the same 'pinch me', edge-of-our-seats excitement over the phenomenal achievements and innovations of the filmmakers, and over the part TouchDesigner played in them. We could not be more thrilled!

It is with massive thanks to Framestore's Theo Groeneboom for his enthusiasm and eloquence in articulating this story and to VFX Supervisor Tim Webber for his generosity and willingness to share the studio's hard-earned knowledge that we are able to publish this now.

Once again, everyone at Derivative thanks and congratulates the team and is immensely appreciative to be able to share this work with our community!

All images unless otherwise noted copyright 2013 Warner Bros Pictures. All rights reserved.
