
WONDERS OF NATURE | GASOMETER OBERHAUSEN

"Wonders of Nature" is a current exhibition at the spectacular Gasometer Oberhausen where the highlight of the show is our blue planet itself 'brought to life'. Suspended in the lofty 100-meter high air space of the Gasometer the 20 metre globe floats as animated, high-resolution satellite images (8-10 K) are seamlessly projected onto its surface.

Visitors are thus able to experience, from a perspective usually reserved for astronauts in space, the fascinating phenomena of the changing atmosphere, the alternation of night and day, and the changing seasons on the planet we call home. The video above provides a glimpse into the process that enabled the team to construct this stunning representation (making it look easy, of course!) and below we unpack some of the more granular details of the 'making of' with Intermediate Engineering's Nicholas Braren.

Conceived in partnership with the German Aerospace Center (DLR) and curated by Prof. Peter Pachnicke, Wonders of Nature runs until December 30, 2016. Experience it if you can.

Derivative: Tell us a bit about the data you are using.

Intermediate Engineering: It is made up of about 1.5 million satellite photos of the earth stitched together by the Earth Observation Center at the DLR (the German Aerospace Center). They rendered image sequences for each of our 12 projectors, which we encoded with the HapQ codec and which are projected onto a sphere 20 metres in diameter. So it is a huge model of the earth hanging inside an even larger gas tank. At each projector there is a PC running an instance of TouchDesigner, with which we synchronize and manipulate the geometry of the videos. The projectors are between 5 and 70 metres away from the sphere.
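
The interview doesn't say which tool produced the HapQ files; as one hypothetical route, ffmpeg's Hap encoder can write the HapQ variant, called here from Python with made-up file names:

```python
import subprocess

# Hypothetical re-encode step for one projector's rendered sequence;
# the actual encoder and file names used on the project are not stated.
subprocess.run([
    "ffmpeg", "-i", "projector_07_render.mov",   # rendered sequence for one projector
    "-c:v", "hap", "-format", "hap_q",           # Hap codec, HapQ variant
    "projector_07_hapq.mov",
], check=True)
```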

Derivative: Can you explain the process of getting data from its source into images on the screen?

Intermediate Engineering: With the multiple-projector setup we required synchronized playback across all players, high-quality video, warping and, last but not least, flexible soft-edge blending. We had been disappointed with the performance of some of the usual suspects in the media server scene, and after a lengthy evaluation period and a test run, we decided to use TouchDesigner.

We used the Movie File In TOP to get the video data into TouchDesigner and then sent it through a custom-made Warping COMP. We had our programmer Lukas get into the nuts and bolts of the Stoner and set it up so we could warp over the network with UltraVNC. We made the output of TouchDesigner twice the width so that the right-hand side of the VNC window was used for the Stoner window, while the left was the content being shown by the projector.

The output of the Warping COMP is multiplied by the output of another Stoner, with which we warped the soft-edge masks. Our projection areas were either oval or rounded-rectangle shapes, so we had to create similarly shaped soft-edge masks and be able to adjust their borders accordingly. Straight-edged soft-edges were not an option.
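
As a rough illustration of that chain (not Intermediate Engineering's actual network), here is how such a hookup might be scripted with TouchDesigner's Python API; 'warp1' and 'maskwarp1' are stand-in names for the custom Warping COMP and the mask Stoner, assumed to expose TOP connectors:

```python
# Sketch only: wire movie playback through a warp COMP, then multiply
# the warped content by a warped soft-edge mask. All names and paths
# are hypothetical.
movie = op('/project1').create(moviefileinTOP, 'movie1')
movie.par.file = 'C:/content/projector_07_hapq.mov'   # hypothetical clip

warp = op('/project1/warp1')                 # stand-in for the Warping COMP
warp.inputConnectors[0].connect(movie)

mask = op('/project1/maskwarp1')             # stand-in for the mask Stoner
mult = op('/project1').create(multiplyTOP, 'mult1')
mult.inputConnectors[0].connect(warp)        # warped content ...
mult.inputConnectors[1].connect(mask)        # ... multiplied by the soft-edge mask
```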

The next step in the chain is a COMP of operators that changes the geometry of the data again in order to automatically correct the projections.

We have 52 light sensors with which we calculate the position of the projection. We are able to locate the position of a sensor in relation to the pixels, and if it differs from the position it had just after warping, the projection is corrected. The sphere can sometimes move due to temperature changes in the Gasometer, but with this calibration we can compensate without having to be onsite.

[Meaning: the projections are warped dynamically. If the sphere and mountings shift due to temperature, the sensors detect this and the projection is automatically re-warped to adapt.]
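
A minimal sketch of the correction idea, with entirely made-up sensor IDs and positions: each sensor's pixel position captured right after manual warping serves as the reference, and any drift becomes an offset fed back into the warp.

```python
# Hypothetical reference table: sensor id -> pixel position recorded
# just after warping. The real project uses 52 sensors.
REFERENCE = {7: (512.0, 380.0), 8: (1540.0, 402.0)}

def correction_offsets(detected):
    """detected: sensor id -> currently measured pixel position.
    Returns per-sensor (dx, dy) offsets to apply to nearby warp points."""
    offsets = {}
    for sid, (rx, ry) in REFERENCE.items():
        if sid in detected:
            cx, cy = detected[sid]
            offsets[sid] = (rx - cx, ry - cy)
    return offsets
```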

The video data then goes to the output. We used the Window COMP to organize the output with and without warping and soft-edge editing, and used keyboard shortcuts to switch between modes when connected over VNC.
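
A toggle like the one below, bound to a keyboard shortcut (for example via a Keyboard In DAT), could switch between the processed and raw outputs; the Switch TOP name is our assumption, not one from the project:

```python
# Hypothetical helper: flips the projector output between the fully
# processed chain and the raw, unwarped content while connected over VNC.
# 'output_switch' is an assumed Switch TOP feeding the Window COMP.
def toggle_output_mode():
    sw = op('output_switch')
    sw.par.index = 0 if sw.par.index.eval() else 1
```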

We have a show-control PC running a master script that sends play, pause, stop and sync commands to the clients. It tells them which video to play, or to display one of the test grids. With the sync commands, the frame ID is handled like a time stamp so the players can adjust if they drift out of sync. If a player differs by 1-3 frames, playback is sped up or slowed down. A frame difference of 4 or more causes the player to jump to the correct frame.
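
That drift rule translates into a few lines of Python; this sketch uses a 5% speed nudge of our own choosing, since the interview gives no value:

```python
def sync_player(local_frame, master_frame):
    """Apply the drift rule described above. Returns (speed, seek_frame):
    speed is a playback-rate multiplier, seek_frame is a frame to jump to
    (or None if no jump is needed)."""
    drift = master_frame - local_frame
    if abs(drift) >= 4:
        return 1.0, master_frame     # 4+ frames off: jump to the master frame
    if drift > 0:
        return 1.05, None            # 1-3 frames behind: speed up slightly
    if drift < 0:
        return 0.95, None            # 1-3 frames ahead: slow down slightly
    return 1.0, None                 # in sync: play at normal speed
```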

Derivative: Was there an aspect of TouchDesigner that was particularly useful in this production?

Intermediate Engineering: The game-changer was definitely the Stoner. Not only were we able to be very flexible with warping (which can be very tricky with a 360° projection onto a sphere), but we could also put our soft-edge masks into the Stoner and define exactly where we wanted our projections to overlap. Usually you can't have rounded soft-edge borders, so this was great for our application.

HapQ playback was another feature we really appreciated. Other systems are restricted to H.264 and MPEG-2, which, although they have better compression ratios, just don't cut it for picture quality. The colors are much more brilliant and we had a hard time finding any compression artifacts whatsoever. This really shone through with the content we had, with many of the satellite images containing large areas of clouds and vapour, where video compression can be very noticeable. Python scripting turned out to be a very useful tool, as TouchDesigner is remote-controlled by a self-written network service that continuously sends signals for sync, play, pause, fading and selecting clips. It also monitors TouchDesigner's health status and restarts the software in case the Python scripts are not answering.
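
A stripped-down watchdog along those lines might look like this, running outside TouchDesigner; the executable path, project file and health-check port are all assumptions, not values from the project:

```python
import socket
import subprocess
import time

# Assumed paths and port, for illustration only:
TD_EXE  = r"C:\Program Files\Derivative\TouchDesigner\bin\TouchDesigner.exe"
PROJECT = r"C:\shows\globe\player.toe"
HEALTH  = ("127.0.0.1", 9000)   # port where a script inside the project answers pings

def touchdesigner_alive(timeout=2.0):
    """True if the Python service inside TouchDesigner answers a ping."""
    try:
        with socket.create_connection(HEALTH, timeout=timeout) as s:
            s.sendall(b"ping\n")
            return s.recv(16).startswith(b"pong")
    except OSError:
        return False

def watchdog():
    proc = subprocess.Popen([TD_EXE, PROJECT])
    while True:
        time.sleep(10)
        if proc.poll() is None and touchdesigner_alive():
            continue                      # process up and scripts answering
        if proc.poll() is None:
            proc.kill()                   # running but unresponsive
            proc.wait()
        proc = subprocess.Popen([TD_EXE, PROJECT])   # restart the player
```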

Derivative: How do people use this system?

Intermediate Engineering: "People" don't use it. People don't even know it's there. All they see are the beautiful images rotating on the sphere. All the funky looking stuff is hidden in the background doing its job 8 hours a day, 7 days a week.

Derivative: What's it like working with spherical information?

Intermediate Engineering: It's complicated, but we learnt a lot from this project. The projectors project squares onto the sphere and we have to stitch those squares together to make it look like a sphere. So we had to take matters into our own hands and design a system that let us manipulate the video data the way we needed. And we didn't need to buy expensive servers and software tools to achieve this.

Derivative: Did you face limitations?

Intermediate Engineering: We had a hard time juggling licenses to set up the 12 players. The devices do not have an internet connection, so we had to attach and swap the dongles manually. Some computers are positioned at a height of 100 metres, others are on the floor. That was frustrating. The internet option is a good idea, but we'd really appreciate a peer-to-peer license-transfer option.

Derivative: Did this experience lead to new discoveries, ideas, or processes?

Intermediate Engineering: Sometimes, if the usual tools don't provide the features you need, you have to build them yourself. This project showed once again that when you build your own infrastructure, you have almost complete control. It can be really satisfying too! We look forward to future projects using TouchDesigner.
