syncing videos

Hello,
I’m looking for suggestions on setting up a multi-projection / multi-PC installation with synced video material, specifically about playing multiple movie files in sync: is the ‘Specify Index’ play mode, with an index message sent through OSC / a Sync CHOP, an option? I understand that a Sync CHOP can synchronize the rendering timeline of the clients, but would an index message synchronize a client’s moviefilein TOP rendering? In general, which is the correct play mode and procedure to get two videos synced?
Any advice will of course be appreciated.
I’ve seen people using (multi-)Quadro cards, but I would prefer a modular setup (rather than, say, 20 outputs from one PC), working with standard HD video files (not ?K-resolution video), ideally with 2–3 videos played by the same machine (with the synced index communication, or any other solution, also working locally on a single machine).

Thanks

I’d give this article a good read through:

derivative.ca/wiki099/index … _Computers

We often use a software sync approach. A controller streams time as OSC multicast to nodes, which then use that time to play back video. If you have mixed frame rates you’ll likely have less misery using seconds. If you know that all content is encoded at the same frame rate you can stream frames instead.
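To make the seconds-vs-frames point concrete, here’s a minimal sketch (not TD API, just plain Python; the function name is mine) of what each node does with a streamed seconds value — clips at different frame rates stay locked to the same clock because each one converts locally:

```python
# Hypothetical node-side sketch: the controller streams elapsed seconds,
# and each node derives a 0-based frame index at its own clip's frame rate.
def seconds_to_frame(seconds: float, clip_fps: float) -> int:
    """Convert streamed seconds to a 0-based frame index for this clip."""
    return int(seconds * clip_fps)

# A 25 fps clip and a 30 fps clip stay in step under the same clock:
t = 2.5  # seconds received from the controller
frame_25 = seconds_to_frame(t, 25.0)  # 62
frame_30 = seconds_to_frame(t, 30.0)  # 75
```

If every clip shared one frame rate, you could stream the frame number directly and skip the conversion — that’s the trade-off described above.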

It’s worth keeping in mind that you need to be very mindful of overhead on your control machine… anything that impacts your timecode messages will show up on render nodes, so you’ll want to keep your controls as optimized as possible.

Projection tends to be a little more forgiving, so a software sync using OSC with 1-3 frames of latency is usually pretty tolerable - in fact it’s how we do nearly all of our projection jobs. Large content walls are a little less forgiving, so opting for a pro license to use the sync in/out CHOPs along with framelock (hardware sync) is usually more advisable in those situations.

Regardless, it’ll be very important to make sure that you have a dedicated hard line network for your machines. Forget wifi, you’re only asking for trouble. Also forget using someone else’s network - unless you make time for considerable testing.

Thanks for the help (and for all the tutorials and reference material)!
I read the article, but I’m probably still missing some pieces (sorry if I’m misunderstanding some steps):

  • would you use a “master” moviefilein TOP to get the frame number, or (and if so, which) an external time-sender source?
  • If the first option is correct, would you get the current index (last/next/true?) through an Info CHOP and send it to control the clients?
  • is using the “by index” play mode for the clients’ moviefilein TOPs (using ‘index’, which is 0-based, vs. ‘frame’) the correct approach? (see attached trial test image)
  • as you said, the OSC solution introduces latency: I tried it locally and there seems to be a constant one-frame delay (although, trying an LFO from Snippets with the same OSC settings, I get the same correct value in the OSC In; see the bottom of the attached jpg)
  • would the “by index” approach be the same with a Sync CHOP / pro license? That is: disabling the clients’ realtime flag and, if the controller machine also renders, putting the time source (a master moviefilein or another time-controller source) in a separate Time COMP (or in a separate TD process) — would that ensure perfect video sync across the machines?

No. I would not do this.
I do often use a moviefilein TOP to extract the attributes of a video, then use that to calculate either the seconds or frames that I’m going to stream from controller to nodes. I stream time with either a speed CHOP or a timer CHOP. If possible, you don’t want to play your movie file on your control machine. If you want a measure of confidence, make sure you’ve transcoded your media to a smaller resolution that’s reasonable to play locally. Distributed systems often work best with large resolutions, and the last thing you want is a huge file potentially causing frame drops on your control machine.
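For a sense of what the controller side looks like, here’s a hedged plain-Python sketch of streaming elapsed seconds as a single-float OSC message over UDP multicast. The address pattern, multicast group, and port are all assumptions for illustration (pick ones that suit your network); the packing follows the OSC 1.0 layout of a null-padded address, a “,f” type tag, and a big-endian 32-bit float:

```python
import socket
import struct
import time

def osc_message(address: str, value: float) -> bytes:
    """Pack a single-float OSC message: padded address, ',f' tag, big-endian float."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated, then padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

def stream_time(group: str = "239.0.0.1", port: int = 9000, rate: float = 60.0):
    """Hypothetical controller loop: multicast elapsed seconds ~60x per second."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    start = time.monotonic()
    while True:
        t = time.monotonic() - start
        sock.sendto(osc_message("/time/seconds", t), (group, port))
        time.sleep(1.0 / rate)
```

In TouchDesigner itself you’d use an OSC Out CHOP/DAT fed by the speed or timer CHOP rather than hand-rolling packets; this is only to show what travels on the wire.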

No. I’d set my extend parameters to hold - so that the last frame persists at the end of a clip, then determine how I wanted to handle transitions.
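The “hold” extend behavior described above amounts to clamping: once the streamed index passes the clip’s last frame, the node keeps showing that last frame instead of wrapping or going black. A minimal sketch (function name is mine, not a TD parameter):

```python
# Sketch of "hold" extend: clamp the streamed index to the clip's valid range,
# so the last frame persists after the clip ends (and negative time shows frame 0).
def hold_extend(index: int, clip_length_frames: int) -> int:
    last = clip_length_frames - 1
    return max(0, min(index, last))

hold_extend(500, 300)  # clamps to 299, the last frame of a 300-frame clip
```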

“Correct” is difficult to assess here. It may work well in some circumstances but not in others. As mentioned above, I typically prefer frames or seconds: frames if all the media is encoded at the same frame rate, seconds if it’s mixed frame rates.

The latency you see locally will be different than what you see distributed across machines. The length of the cable increases the distance your signal has to travel, as does any hardware in between - routers, switches, other NICs. That’s likely still small, but could be as much as 3 frames (in my experience).
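If the delay turns out to be roughly constant on your network, one possible mitigation (my assumption, not something described above) is to measure it once on site and have nodes bias the received frame forward by that amount, so what actually lands on screen lines up with the controller’s clock:

```python
# Hypothetical compensation sketch: measured_latency_frames is a value you'd
# determine empirically on the installed network, not something computed here.
def compensated_frame(received_frame: int, measured_latency_frames: int) -> int:
    """Seek ahead by the measured network delay to realign with the controller."""
    return received_frame + measured_latency_frames

compensated_frame(1000, 3)  # node seeks to frame 1003
```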

No. Unless you’re using the sync CHOP and pro licenses I would not disable the realtime flag. In a distributed system there will never be “perfect” video sync across machines. If this is a requirement you should do this from a single machine.

Another approach to this would be to have all systems chasing LTC. It would allow you to avoid the master player hierarchy, if you wanted to for some reason. I like this approach when doing most of my primary control on another product.
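Whatever device distributes the LTC, each chasing node ultimately has to turn the incoming timecode into playable time. A small sketch of that last step, assuming a non-drop-frame timecode at a known frame rate (the 25 fps default is my assumption):

```python
# Hypothetical sketch: convert non-drop 'HH:MM:SS:FF' linear timecode
# into seconds a node can use to drive playback.
def ltc_to_seconds(tc: str, fps: float = 25.0) -> float:
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return hh * 3600 + mm * 60 + ss + ff / fps

ltc_to_seconds("00:01:00:12")  # 60.48
```

Drop-frame timecode (common at 29.97 fps) needs extra bookkeeping that this sketch deliberately ignores.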

In mixed-master setups (where the ‘master’ machine is also rendering frames) I usually wind up using the sync CHOP to send the correct frame based on what the master is doing.

OSC is certainly one way to do it. I suppose you could do it using other network protocols. I tend to avoid OSC because I find its encoding of values to be a performance liability. It is a protocol that’s happy to run over UDP/IP, which is definitely the way you want to go for networked sync. Especially on larger networks, you don’t want to be wasting time with re-transmission of lost packets telling your slaves what frame they were supposed to be playing a second ago.
