Integration with Green Hippo

I am an LD at a big church in Indiana and we are about to roll out a TouchDesigner rig that is doing some formatting of lyrics. Everyone has been so impressed with TouchDesigner that we are now looking at generative content options. We have a Green Hippo media server that drives a large screen behind our stage. There is an E2 between the Hippo and the screen, but the Hippo is the only source used on that screen.

What is the recommended way to integrate TouchDesigner and does anyone have experience using it with a Green Hippo media server?

We are also considering Notch because it integrates directly with Green Hippo.

We would like to start using audio and video inputs and generating content from them, but I am concerned about latency. I am assuming our best option is to run TouchDesigner on the Hippo and use Spout to get the video from TD into the Hippo. Is that a reliable way to do things?

Using Spout is definitely one way to do this. It’ll keep the latency low. As long as the server has the GPU power to handle both processes running, it should be good. You’ll want to make sure you have drawing turned off on your TouchDesigner perform window (parameter in the Window COMP) so that it’s just rendering in the background and not onto an actual window, which may affect vsync.
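If you want to script that setup instead of dropping the operators in by hand, here’s a rough sketch in TouchDesigner Python. The names in it are assumptions based on a default project (a TOP called 'final_out', the default /project1 COMP, the /perform Window COMP, the 'TD_to_Hippo' sender name, and the sendername/drawwindow parameter names), so check them against your own network and build.

```python
# Rough sketch (TouchDesigner Python), run once from a Text DAT in the project.
# Assumptions: the composited output is a TOP named 'final_out' inside /project1,
# and the perform window is the default Window COMP at /perform.

project = op('/project1')

# Create a Syphon Spout Out TOP and feed it the final texture for the Hippo to pick up.
spout_out = project.create(syphonspoutoutTOP, 'spout_out')
spout_out.inputConnectors[0].connect(op('/project1/final_out'))
spout_out.par.sendername = 'TD_to_Hippo'   # the sender name Hippotizer will look for

# Turn off drawing on the perform window so TD renders in the background
# and doesn't fight the Hippo's outputs over vsync.
op('/perform').par.drawwindow = False
```

Treat it as a starting point; the exact parameter names can vary between TouchDesigner builds, so verify them in the parameter dialog before relying on the script.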

What are you using Green Hippo for anyway?

If you switch that output over to TouchDesigner as well, you will have lots of ways to integrate it with your other TouchDesigner systems.

I don’t have experience running Green Hippo, though I know that TouchDesigner can accomplish all of the same things with little effort.

In my experience, once you rely on Spout to pass media between concurrently running applications, you start to run into trouble and reliability issues.

My suggestion is to keep the number of different apps to a minimum. Pick one tool that can do what you want and reduce complexity overall. I suggest Touch.