D: Well, Kinetic Lights is certainly a very eloquent expression of your ideas and feelings! Could you tell us how it came to be… the idea/concept, and then the process, development, testing, etc.?
CB: I had the initial idea for the Kinetic Lights system in 2001 while studying a semester abroad at the School of the Art Institute of Chicago. I was taking the virtual reality class of composer and VR artist Insook Choi and experimented with the displacement mapping principle in a real-time 3D environment. I used an array of virtual spheres attached to an invisible plane modulated by a displacement map. I wanted to transfer this principle into real space. My idea was to do this with a large array of evenly distributed suspended spheres.
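The displacement-mapping idea described here can be sketched in a few lines. This is a minimal illustration, not WHITEvoid's actual code: it assumes an 8x8 grid of spheres hanging from a flat plane, and uses a moving sine wave in place of a real displacement texture.

```python
import math

def displaced_heights(rows=8, cols=8, t=0.0, amplitude=1.0):
    """Sketch of the displacement-mapping principle: each sphere in a
    rows x cols grid is offset from a flat plane by sampling a
    displacement "map" at its normalized (u, v) grid position.
    Here the map is a travelling sine wave for illustration."""
    heights = []
    for r in range(rows):
        row = []
        for c in range(cols):
            u, v = c / (cols - 1), r / (rows - 1)  # normalized position
            # sample the displacement "map" at (u, v), animated by t
            row.append(amplitude * math.sin(2 * math.pi * (u + v) + t))
        heights.append(row)
    return heights
```

Advancing `t` each frame produces the rolling-wave motion across the array; in the physical installation, each height would drive one suspended sphere.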
When I returned home to Berlin, I started working on various options to realize this dream. The first installation used an 8x8 array of helium-filled tethered balloons reeled in and out by a computer-controlled winch system that I developed. I refined this system over the years with new generations of improved winches, ultimately resulting in the first version of the hanging "Kinetic Lights" winch system in 2007.
The modular Kinetic Lights system has since been used to create a variety of installations, both temporary and permanent. We used the system for many events and trade fair installations, mainly for large corporations or brands like Vodafone or Volkswagen. But we also created live performances with electronic music, like the GRID show, which we performed at lighting festivals together with musician and composer Robert Henke aka Monolake. We also do more and more permanent Kinetic Lights installations for restaurants and shops; the latest ones just opened in Dubai and Milan.
The new "Kinetic Lights" product lifts up to 3 kg at various speeds and features full-color RGB LED control at the end of the lifting cable. Each winch is individually addressable via DMX for dynamically controlled cable acceleration and speed. The latest winch generation is also equipped with an auto-setup function with cable end position self-detection and real-time feedback communication. The winch technology was patented internationally in 2014.
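To give a feel for what "individually addressable via DMX" means in practice, here is a small sketch of packing per-winch values into one 512-channel DMX universe. The 5-channel layout (position, speed, R, G, B) is purely an assumption for illustration; the real Kinetic Lights channel map is not published here.

```python
def build_dmx_universe(winches):
    """Pack winch values into a 512-byte DMX universe.
    The per-winch channel layout used here (position, speed, R, G, B,
    starting at each winch's DMX address) is a hypothetical example,
    not the actual Kinetic Lights protocol.
    Each winch dict: {"address": 1-based DMX start address,
                      "position": 0-255, "speed": 0-255,
                      "rgb": (r, g, b) with each value 0-255}."""
    universe = bytearray(512)  # one DMX universe, all channels start at 0
    for w in winches:
        base = w["address"] - 1  # DMX addresses are 1-based
        universe[base] = w["position"]
        universe[base + 1] = w["speed"]
        universe[base + 2:base + 5] = bytes(w["rgb"])
    return bytes(universe)
```

A controller would rebuild and transmit such a frame continuously, which is what allows the dynamic acceleration and color control described above.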
Markus Heckmann: The Kinetic Lights application is an ongoing development for WhiteVOID with the goal of creating a tool dynamic enough to adapt to the ever-changing deployments of the Kinetic Lights winch system at trade shows, art exhibits and cultural events. The system includes a Layout Creator, a Show Creator and a Show Player. With the layout set up and tested, the designer can proceed to program the show using an array of preconfigured but adjustable light and motion patterns. Custom settings can be saved as presets and arranged in playlists. While the collection of default patterns covers many needs, the system is designed to let any TouchDesigner operator create show-specific components that conform to certain requirements or particular layouts. Each pattern is created from the same base template and can be developed without full understanding of the complete surrounding system.
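The "same base template" idea can be sketched as a shared interface that every pattern implements, so the show player can drive any pattern without knowing its internals. All class and method names here are hypothetical stand-ins, not the actual TouchDesigner component names.

```python
import math

class BasePattern:
    """Hypothetical stand-in for the shared pattern template: every
    pattern exposes the same evaluate() interface, so the show player
    can swap patterns freely."""
    def __init__(self, num_winches):
        self.num_winches = num_winches

    def evaluate(self, t):
        """Return one (height, (r, g, b)) tuple per winch at time t."""
        raise NotImplementedError

class WavePattern(BasePattern):
    """Example show-specific pattern built on the base template:
    a sine wave of heights with brightness following the height."""
    def evaluate(self, t):
        out = []
        for i in range(self.num_winches):
            phase = i / self.num_winches
            h = 0.5 + 0.5 * math.sin(2 * math.pi * (phase + t))  # 0..1
            brightness = int(255 * h)
            out.append((h, (brightness, brightness, 255)))
        return out
```

Because every pattern honors the same contract, presets and playlists only need to store a pattern name plus parameter values, which matches the preset/playlist workflow described above.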
D: How did you come to know of TouchDesigner?
CB: I followed the development of TouchDesigner from a very early stage, since I was always interested in real-time 3D, especially for VJing live visuals alongside electronic music. Some years later, Markus Heckmann of Derivative visited us in Berlin to present the latest version of TouchDesigner, and I was thrilled to see how much it had evolved in terms of usability, performance and also the whole look and feel of the graphical user interface.
D: Where do you see the main benefits of using a visual development environment in comparison to building your projects in a programming language?
CB: TouchDesigner is initially easy to learn as a prototyping environment with an intuitive graphical user interface and stunning real-time 3D capabilities. But once you get into it, you realize the endless possibilities to not just prototype but also finalize applications within the framework. The only limits are PC processing power and the will to learn and understand the features in depth.
The completely real-time approach gives 100% control over every bit of the application at any point in the process of creation, from the actual user interface elements, with previews of every component executed in the framework, up to the final output. This is unique, as other software prototyping environments are either not real-time (you need to compile before seeing any results) or have a very minimalistic or even geeky GUI approach. The support for various 2D, 3D, video, graphics and generative approaches inside TouchDesigner lets you use it as a universal tool to create any imaginable kind of application.
D: You've produced a lot of interactive and location/environment-based work - what has that told you about human behavior and how we learn, interact, adapt?
CB: I learned that one common rule applies to all our installations, software and systems that are controlled by strangers: they have to work without any explanation! People generally like to try, explore, fail, learn, try again... but would rather not read instructions first. So we always try to keep our menu structures and hierarchies as flat as possible and design the graphical user interface to be minimal and simple.
This does not mean that we are not constantly thinking about design and style improvements, but never at the cost of the intuitive understandability of the final application. People are only open to new interaction principles if they appear simpler and easier to understand than the ones they already know. This drives us to always push the boundaries and try out new ways to solve a common problem. The existing solutions that everyone uses are not necessarily the best ones.
Speaking of which... I am typing this with 2.5 fingers on a ten-finger keyboard. I never took the time and effort to learn to use it properly, and used intuitively it just does not work as it is supposed to.
D: 'digiStage – digital theater stage lighting system' was quite ahead of its time! How did you go about designing, testing and implementing? What was the experience, and how was TouchDesigner used in this project?

CB: The digiStage system was initially commissioned for the theater play "Fanny and Alexander". The piece, inspired by Ingmar Bergman's movie, premiered at the Central Theater in Leipzig, Germany. We designed a projector-based digital lighting system with integrated depth camera tracking to pixel-map the whole stage area of about 120 m². This included a large-scale Neo Rauch painting as a backdrop. Since the play used a flexible scene order, we enabled the lighting designer to select or deselect actors on the fly via a touchscreen-based remote control with a simple graphical user interface. This made it possible to set actors active or inactive as triggers or actuators for digital light and shadow effects and generative visuals while the play was running. Multiple effects could be assigned to individual actors in almost infinite combinations. Additionally, virtual "light agents" autonomously interacted with the real actors on stage. The main challenge was the spontaneous nature of this particular play: it did not follow a written script, and the order of scenes and the dialogues changed and evolved every evening. TouchDesigner gave us the ability to react quickly to general changes during rehearsals and adapt individual effects on the fly depending on the director's input. Later on, we transferred this flexibility to the GUI for the lighting designer, so variables for the effects could still be changed and refined, within a defined range, with every new installment of the play.
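The on-the-fly actor/effect routing described here can be sketched as a small mapping layer. All names are hypothetical, invented to illustrate the idea of toggling actors as effect triggers, not taken from the actual digiStage software.

```python
class ActorEffectRouter:
    """Sketch of the digiStage routing idea: effects are assigned to
    actors, the lighting designer toggles actors active or inactive
    while the play runs, and only active actors deliver triggers."""
    def __init__(self):
        self.active = set()   # actors currently enabled as triggers
        self.effects = {}     # actor name -> list of assigned effects

    def assign(self, actor, effect):
        """Attach an effect to an actor (many effects per actor allowed)."""
        self.effects.setdefault(actor, []).append(effect)

    def set_active(self, actor, on=True):
        """Toggle an actor live, e.g. from a touchscreen remote."""
        (self.active.add if on else self.active.discard)(actor)

    def triggers(self):
        """Effects eligible to fire this frame, from active actors only."""
        return {a: fx for a, fx in self.effects.items() if a in self.active}
```

Keeping assignment and activation separate is what allows a flexible scene order: effect combinations are prepared in advance, and the designer only flips actors on and off during the performance.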
Polygon Playground (2008)

CB: With the Polygon Playground we tried to push the idea of transfer from digital to physical even further. We created a 3D polygon model and then built it as a large-scale solid physical object. Then we projected the virtual 3D model back onto its physical representation with 3 video projectors. We used an identical twin setup of object, camera and projector in virtual and real space, which let us match the surfaces almost automatically and realign both worlds with just a few clicks. An IR camera tracking system detects human interaction with both the physical and virtual object surface at the same time. This allows us to modify the projection mapping in real time with natural movements and gestures. We exhibited the Polygon Playground in various shows in different sizes and shapes. We also installed a permanent incarnation at the Daegu Science Center in Korea.
Video Objects (2005)

CB: VideoObjects was the result of further investigating options to liberate video from the flat rectangular projection it was trapped in. I wanted to bring video and 3D content into physical space and onto any kind of surface, like a real-life mapping. I was again inspired by the capabilities of 3D software and tried to transfer its digital properties to the real world. We programmed a simple application that allowed us to perspectively pre-distort and mask videos and images to map them onto any spatial arrangement or object. Other functions allowed us to use a photo of the environment, taken from the projector's perspective, to match the virtual to the real world with just a couple of clicks. This whole process is now known as projection mapping and is commonplace in today's media installations. Our software never made it out of the prototype state, but it helped us develop a good understanding for our subsequent reactive real-time 3D mapping projects and experiments like the Polygon Playground or the digiStage. TouchDesigner's KantanMapper or MadMapper are now beautiful examples of thoughtfully created, easy-to-use yet powerful 2.5D mapping applications.
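The "perspective pre-distortion" at the heart of this kind of 2.5D mapping is, in today's terms, a planar homography: corner points of the video are mapped through a 3x3 matrix in homogeneous coordinates. A minimal sketch of that core operation, independent of any particular mapping tool:

```python
def apply_homography(H, points):
    """Apply a 3x3 planar homography H (list of 3 rows of 3 numbers)
    to a list of (x, y) points. Each point is lifted to homogeneous
    coordinates (x, y, 1), multiplied by H, then re-projected by
    dividing out the third coordinate; this is the math behind
    perspective pre-distortion of a flat video onto a surface."""
    warped = []
    for x, y in points:
        xh = H[0][0] * x + H[0][1] * y + H[0][2]
        yh = H[1][0] * x + H[1][1] * y + H[1][2]
        wh = H[2][0] * x + H[2][1] * y + H[2][2]
        warped.append((xh / wh, yh / wh))  # perspective divide
    return warped

# The identity homography leaves points unchanged.
IDENTITY = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```

In a real tool, H would be solved from four user-placed corner correspondences (for example, by clicking on the photo taken from the projector's perspective); solving for H is omitted here for brevity.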
D: Where are we heading? What future do you envision and the role you'd like to play?
CB: I think we are actually not heading towards the future any more; it's already here! All the things I dreamt of doing as a student in terms of interactivity and the integration of the real and virtual worlds are possible and happening right now. The integration of software and our physical environment feels completely natural. Self-driving cars are starting to roam the streets. Display resolutions are surpassing the resolution of our eyes, finally providing compelling virtual reality experiences. No need to complain about processing power any more; it's more than enough to achieve whatever I can dream of.
I feel free and limitless to design and develop as much and as far as I can imagine. I am just waiting for the one machine to stretch time, so I can realize all those ideas before I have to check out in the end anyhow ;-)
D: Any thoughts on open or evolving systems - one example being that upgrading software embedded in any product or system can give that system 'new legs'... new capabilities/extended life-cycle.
CB: I know it has always been a human dream to create modular, adapting, evolving and ever-growing systems, similar to our own development in life and as a species. But so far, in my own experience, it is only possible to upgrade, modify and adjust a system for a certain amount of time until it is finally outdated. Every system has a core, mainframe, brain, leader... If this central unit grows old, deteriorates or stops evolving, the complete system becomes weak and unstable.
Technology in general, and software applications especially, are becoming more flexible and extendable. But later developments will always need more power, more throughput, more connections, more of everything, making them at some point incompatible with older hardware components.
Software at the core of a system is also not always a blessing. Take my car as an example: I can chip-tune it to go faster without changing or modifying the actual drive unit. But at the same time, most of the breakdowns and repair costs are generated by its failing sensors, electronics and data processing units, not the motor itself. Also, the built-in adapter for a cellphone that was state of the art when my car was produced is now a sad reminder of how fast technology is evolving.
I believe in constant change and evolution for the greater good of mankind. But so far this was achieved in steps and radical changes or additions from time to time. Nevertheless the connectivity of the internet and cloud based services already give a glimpse of what will be possible in terms of growing and evolving software based systems in the very near future.
D: As a designer, a person, a teacher-- do you have any advice for the youth of today embarking on the course of their education and lives?
CB: Go out there and realize your ideas. You can have a million ideas, but what counts are the few you make really work and last. For a job, find out what you truly like and do only that; this will be the most successful. I know, this is easily said and unbelievably hard to do. But it is worth trying, even if it takes years to get there. I am still trying and still not there, but getting closer every day ;-)
D: Again, a huge thank you to Christopher for sharing such meaningful insights; we hope our readers are as inspired as we are! We'd like to mention that Christopher's new show with Robert Henke, Deep Web, will premiere at CTM Festival in Berlin and run from the 2nd to the 7th of February at Kraftwerk. There will be winches, movement, orbs of light and lasers... and of course beautiful sound! Check it out!
Last note: WHITEvoid is hiring a TouchDesigner Developer and this is a superb opportunity!