We were recently taken by surprise by something rather out of the ordinary that challenged our most liberal interpretation of what can be categorized as ‘real’, and certainly as ‘real-time’. The projected image seemed not to sit on the surface of the object but to be embedded into it: a formation of suspended blocks, bigger than a breadbox and smaller than Amon Tobin's ISAM, moving in space with surface imagery sticking to it, giving the impression of a single unified artifact that couldn't really exist by any definition of what is known. But there it was.
A little investigation led us to White Kanga, a talented crew from Poland who have been expertly building their own custom applications with TouchDesigner to pull off the otherworldly and, it could be said, unprecedented fabrication seen here.
Following the video below, which illustrates what we’re talking about, is an interesting and highly inspiring interview with White Kanga that details their experience working with TouchDesigner from a seasoned VFX background (they are Houdini pros) and the making of the Modeling Projection System (MPS) project.
Derivative: With the MPS you’ve produced a very ‘advanced’ body of work in a very specific direction, and it seems to have come together really quickly. Can you tell us what the process of learning and adopting TouchDesigner has been like from the standpoint of VFX designers with Houdini expertise?
White Kanga: Rafal Bielski has been a Houdini user since 2008, when he was looking for tools to help realize complicated tasks on the Polish animation "Switez" at Human Ark Studios, where he was employed as pipeline engineer and CG supervisor. Arek Rekita, on the other hand, is all about procedural solutions, so Houdini is his natural habitat. It’s where he developed techniques for the flora in James Cameron's Avatar.
So our decision to use TouchDesigner came naturally, given the similarities between Houdini’s and TouchDesigner’s philosophies.
Rafal B was the first to be hooked by TouchDesigner and used it to produce a prototype of an interactive music game, with fully working gameplay, 3D visuals and sound.
Rafal Bielski and TouchDesigner
Derivative: You're using TouchDesigner in one of the ways it is best suited: to build custom applications that don’t yet exist. Can you tell us a bit about that experience, please?
White Kanga: Having a Houdini background, we immediately started to miss OTLs (OTL versioning, parameter promoting, etc.), so there were challenges to overcome at the beginning, but also astonishing surprises, like the mixing of contexts and the visual representation of every node's result.
Since we come from postproduction, TouchDesigner’s interface was not a big surprise, and that is a good thing compared to what programming environments usually look like.
What fascinated us was the flexibility of the workflow: CHOPs, SOPs and DATs, all coming together when you need them, with possibilities that just keep inspiring you more and more. Still, we had problems with memory management, and we still hope for 64-bit to push more data through. [stay tuned]
Derivative: Many motion graphics studios are being approached by agencies to produce immersive/projection mapping pieces, so TouchDesigner is a natural tool for people like yourselves from the VFX world to extend your capabilities, and to fulfill and also shape a growing industry.
White Kanga: In my opinion TouchDesigner can be very useful in VFX: from Kinect and mocap recording, through tools for data processing (CHOPs, DATs), in-house presentation systems and procedural modeling, to a full virtual camera set like the one used in Avatar. From a technical director's point of view, the possibilities for building tools in TouchDesigner are endless.
TouchDesigner is like a sports car. You can start a project extremely fast with great results, and have great fun doing it. Sharing components is great: even without a full-fledged SDK you can have a whole team working on a project and manage incoming assets.
The idea of achieving a goal in a number of ways really encourages playful behaviour. It would be even better if an optional scripting language could be involved.
Mixing between 3D, digital compositing, scripting, signal processing and other disciplines lets us draw on our whole experience from years in the VFX industry and extend it into interactivity.
Derivative: Tell us about the making of the Modeling Projection System and the tools you created in the process.
White Kanga: We all know that mapping is a very impressive and effective form of visual communication that provides the artists with a rich tapestry to work with and the viewers with extensive emotional experiences. We are aware that many artists use this tool in their presentations, which makes mapping an increasingly familiar solution these days.
So we started thinking what would be the next step in the development of mapping and this is how we came up with the MPS Project. It combines mapping, robotics and extended reality – all delivered in a highly interactive way. Real objects perfectly matched with images that viewers can influence create an even greater sense of illusion and hence produce a stronger emotional sensation. After a few months of intensive work, we developed the first version of MPS.
TouchDesigner was the main development environment for this project. Its flexibility allowed us to develop the in-house tools needed to achieve the desired effect. We managed to instantly match the position of the projectors with the 3D stage, while simultaneously synchronizing the robotics with the projected image and spatial audio.
White Kanga: Here is a brief description of several tools which we developed in TouchDesigner while working on the MPS project.
WK Projection Tools
A tool for semi-automatic location of the projector position in 3D space. It now takes only a few minutes per projector to sync the projected image with the actual object. An additional advantage of this tool is the possibility of calculating the position of objects, which in turn enables instant modification (within a few to several seconds) of the space that we are projecting onto. This tool also supports projector stacking.
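White Kanga doesn't describe the mathematics behind their calibration, but a common way to locate a projector from known correspondences between 3D stage points and the 2D pixels that hit them is the Direct Linear Transform. The sketch below (names and setup are our own illustration, not White Kanga's code) recovers a 3x4 projection matrix from six or more non-coplanar correspondences:

```python
import numpy as np

def calibrate_projector_dlt(points_3d, points_2d):
    """Estimate a 3x4 projection matrix from >= 6 known,
    non-coplanar 3D stage points and their measured 2D
    projector-pixel positions (Direct Linear Transform)."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    # The solution (up to scale) is the right singular vector
    # belonging to the smallest singular value of A.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 4)

def project(P, point_3d):
    """Apply the projection matrix to a 3D point, returning pixels."""
    x = P @ np.append(point_3d, 1.0)
    return x[:2] / x[2]
```

Once the matrix is known, any virtual geometry rendered through it lands on the physical object, which is what makes re-syncing after a stage change a matter of seconds rather than a full manual alignment.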
This is a combination of an old-school tracker (for instance Pro Tracker on the Amiga) with a state-of-the-art timeline editor. It allows us to arrange the programme/impulses over time while maintaining full interaction of the elements activated by the sequencer. We can create any number of channels and send both text and numerical data. This allows us to plan events over time and to record a live performance with input devices such as the Behringer BCF2000. All of it is fully synchronized with music or another tempo source.
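The core of such a sequencer is small: named channels holding timestamped events (numeric or text), queried once per frame for whatever falls inside that frame's time window. Here is a minimal sketch of that idea in plain Python (our own illustration; the class and method names are assumptions, and the real tool runs inside TouchDesigner):

```python
from collections import defaultdict

class StepSequencer:
    """Minimal tracker-style sequencer: named channels hold
    (time, value) events, where values may be numeric or text.
    events_between() returns whatever fires inside one frame's
    time window, which a render loop would dispatch each tick."""

    def __init__(self):
        self._channels = defaultdict(list)  # name -> [(time, value)], kept sorted

    def record(self, channel, time, value):
        """Add an event, e.g. from a live fader move or a planned cue."""
        events = self._channels[channel]
        events.append((time, value))
        events.sort(key=lambda e: e[0])

    def events_between(self, t0, t1):
        """All events with t0 <= time < t1, across all channels,
        ordered by time."""
        fired = [(name, t, v)
                 for name, events in self._channels.items()
                 for t, v in events
                 if t0 <= t < t1]
        return sorted(fired, key=lambda e: e[1])
```

Driving `t0`/`t1` from a musical clock rather than wall time is what keeps everything locked to the tempo source.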
This tool was inspired by SideFX Houdini's takes. A take, in other words, is the state of one or more parameters saved under a given name. The tool allows us to smoothly transition from one state to another along any defined curve, and makes it easier to work with multi-parameter animations.
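In essence, a take is a dictionary of parameter values, and a transition is an interpolation between two such dictionaries shaped by a curve. A minimal sketch of that mechanic (our own illustration, not White Kanga's implementation):

```python
def smoothstep(x):
    """A classic ease-in/out curve on [0, 1]."""
    return x * x * (3 - 2 * x)

def blend_takes(take_a, take_b, t, curve=lambda x: x):
    """Interpolate every shared parameter between two takes.
    t runs 0..1; curve remaps it (e.g. ease-in/out) before the
    linear mix, so any response shape can drive the transition."""
    s = curve(t)
    return {name: (1 - s) * take_a[name] + s * take_b[name]
            for name in take_a.keys() & take_b.keys()}
```

Because the curve is just a function, the same blend can be driven by an S-curve, a bounce, or a hand-drawn ramp, which is what makes multi-parameter animation manageable: one scrub value moves the whole state.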
WK Augmented Tool
This tool generates the correct perspective for the observer in a given effect, and consequently creates an impression of extended reality. In MPS Project v1.0, both the perspective through the window and the moving reflections on metal were created from data generated by this tool. A Kinect was used to source the observer's position.
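The usual way to render correct perspective for a tracked observer is an off-axis (asymmetric) frustum: the projection is re-derived every frame from the viewer's position relative to the physical surface. As a hedged sketch of the geometry (our own simplification for a flat screen in the z = 0 plane, not the tool's actual code):

```python
def off_axis_frustum(eye, screen_min, screen_max, near):
    """Asymmetric frustum (left, right, bottom, top at the near
    plane) for an eye at (ex, ey, ez) looking at a screen rectangle
    lying in the z = 0 plane, spanning screen_min..screen_max in x/y.
    Feeding these into a glFrustum-style projection makes the
    rendered scene line up with the real surface for that viewer."""
    ex, ey, ez = eye
    scale = near / ez           # similar triangles: near plane vs. screen plane
    left   = (screen_min[0] - ex) * scale
    right  = (screen_max[0] - ex) * scale
    bottom = (screen_min[1] - ey) * scale
    top    = (screen_max[1] - ey) * scale
    return left, right, bottom, top
```

Swapping in a Kinect-tracked head position for `eye` each frame is what makes the "window" and the metal reflections track the viewer.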
WK Shading Tools
We developed a technical shader that allows us to evenly illuminate a surface regardless of its angle towards the projector's lens. We also created a whole range of interactive shaders that generate realistic surfaces or non-realistic effects. It is now possible to project onto metal, glass or water while taking the observer's position into consideration (WK Augmented Tool).
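Projected light falls off with the cosine of the angle between the surface normal and the direction to the lens, so even illumination amounts to applying the inverse of that falloff per fragment. A minimal sketch of the compensation term (our own illustration of the principle; the real tool is a GPU shader):

```python
import math

def illumination_compensation(normal, to_projector, min_cos=0.1):
    """Gain that counteracts the cosine falloff of projected light
    on a tilted surface: a patch facing the lens head-on needs gain
    1.0, while one at a grazing angle needs proportionally more.
    min_cos clamps the gain so near-parallel surfaces don't blow up."""
    def unit(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)
    n, d = unit(normal), unit(to_projector)
    cos_theta = max(sum(a * b for a, b in zip(n, d)), min_cos)
    return 1.0 / cos_theta
```

The clamp matters in practice: without it, surfaces nearly parallel to the beam would demand unbounded brightness the projector cannot deliver.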
WK Stepper controls the robotics and is responsible for synchronizing the projected image with the robotic movement. CHOP data is processed into the appropriate format and transmitted over any given protocol (for instance OSC) to the devices that control the robotics. In MPS v1.0 we used a single 27 Nm stepper motor, since we wanted a considerable power reserve.
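To illustrate the wire format involved, here is a minimal OSC message encoder in plain Python (the address `/motor/1/angle` is a made-up example, not White Kanga's namespace). An OSC message is just the padded address, a padded type-tag string, and big-endian arguments, which is why a CHOP channel maps onto it so directly:

```python
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC message carrying float arguments,
    ready to send over UDP to a robotics controller.  OSC pads
    every string to a multiple of 4 bytes (with at least one NUL)
    and stores floats as big-endian IEEE 754."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode("ascii"))                    # address pattern
    msg += pad(("," + "f" * len(floats)).encode("ascii")) # type tags, e.g. ",f"
    for f in floats:
        msg += struct.pack(">f", f)                       # big-endian float32
    return msg
```

A frame loop would send `osc_message("/motor/1/angle", angle)` over a UDP socket each tick, keeping the motor and the rendered image driven by the same channel data.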
WK Audio Tools
This tool allows us to generate realistic spatial audio with any number of speakers. In the future we plan to extend it to take into account the properties of the surfaces that reverberate the sound. Additionally, WK Audio Tools allows spectral sound analysis and the processing of selected frequencies into impulses for the purpose of animation.
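One simple way to spatialize a source over an arbitrary speaker layout is distance-based amplitude panning with constant-power normalization; we sketch that approach below as an illustration (White Kanga doesn't specify their panning law, so the rolloff model here is an assumption):

```python
import math

def speaker_gains(source, speakers, rolloff=1.0):
    """Amplitude panning for an arbitrary speaker layout: each
    speaker's gain falls off with its distance to the virtual
    source, then the set is normalized to constant total power
    so the source keeps the same loudness as it moves."""
    raw = [1.0 / (1.0 + rolloff * math.dist(source, sp))
           for sp in speakers]
    power = math.sqrt(sum(g * g for g in raw))
    return [g / power for g in raw]
```

Because the gains depend only on positions, the same function covers stereo, quad, or a dome of speakers; animating `source` moves the sound through the room in sync with the visuals.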
White Kanga: These are only some of the tools developed within the framework of this project. Work on the MPS project started back in November 2011, and it is difficult to predict when it will come to an end, since we keep having new ideas for its further development and application. We are simultaneously working on other installations based on techniques other than mapping.
Arek Rekita and Rafal Bielski at White Kanga studios.
Derivative: Very impressive! Seeing what you've already accomplished makes us want to know: what else do you see yourselves using TouchDesigner for?
White Kanga: We see TouchDesigner as a tool for control and synchronization, and as an interface for coordinating external systems. The possibilities are endless! In two hours we prototyped, from scratch, a database application that gathers, measures and delivers information about the energy produced by solar panels.
An interactive multimedia product is something that breaks conventions, delivers new ways of communication and enriches classical ones. We like to think that TouchDesigner encouraged us to mix contexts more openly, without the restraint of time-consuming preparation of each element and its composition. Now changes can be applied at every level of the production without headaches.
TouchDesigner plays a key role in our workshop, and we are at the stage of integrating it with other environments, such as Processing or openFrameworks.
Derivative: Where do you see the main benefits of using a visual development environment in comparison to building your projects in a programming language?
White Kanga: Immediate visual and audio results = real-time, much easier cooperation during production and greater transparency of the setup.
Derivative: What next?
White Kanga: We are not very fond of talking about future realizations; keeping them secret makes the result more desirable and enjoyable. However, we can say that we are working on cross-system solutions involving mocap, robotics and audiovisuals.
Derivative: Thank you so much for speaking with us, and we look forward to the surprises coming from your studio!
"White Kanga was established in 2011 and is headquartered in Warsaw, Poland.
We are a group of technical executives and supervisors with long-term experience in filmmaking, advertising and events.
Our mission is to generate strong emotions through interactive environments and installations, to research and develop in-house state-of-the-art multimedia solutions." White Kanga Team
White Kanga on Facebook: www.facebook.com/whitekanga