Hey ajk48n - sorry for the long response:
Working with the TouchDesigner network UI itself over a VNC connection is, in my opinion, not only frustrating but also cumbersome and inconsistent. VNC, or any screen-sharing application for that matter, carries its own overhead and consequently can't give you accurate results in terms of how well the target machine is actually performing. It also means you have a machine whose configuration is out of sync with your master set-up. Depending on how you're working, this can mean work that's lost in the saving process, or unique to a single machine and unrecoverable on a restart. I largely work with installations where we use a single TOE file and lots of TOX files for the configuration of our networks; a machine that's configured manually through a remote connection needs special attention to ensure its data / configuration isn't lost during a restart - which I've done any number of times.
I also generally dislike the practice of using the Touch UI during shows and installations unless necessary - opening the network editor starts a cascade of operations that cook all manner of operators and place unneeded elements into both CPU memory and VRAM - resources you can't recover unless you restart in perform mode. While it might be stoic idealism, I firmly believe that when possible one should build out projects so they can be configured, run, saved, and exited entirely from the UI that's been built or pieced together. There are a number of reasons for this particular ideology, and a large part of it is tied to the fact that if you rely on the Touch network editor to operate or calibrate your installation / show, you'll never be able to teach someone else to do it without you (even another Touch expert), because any number of gotcha details are only apparent to the programmer / developer. For better or worse, I've worked in situations where at some point I have to hand over controls to a client or to someone I've trained, and that usually means keeping them out of the network editor.
Thinking about remote calibration as a process is also important because it's unlikely you'll be able to see a mapped model from every angle from any single vantage point. You'll need the freedom to walk to various parts of the room / installation, and being tied to a workstation will cost you valuable on-site time… and the longer you spend calibrating, the less time you can spend putting an installation through its burn-in paces - testing the various media or generative pieces. Again, this comes from the experience of having narrow margins between load-in, client review, and execution, so any time lost to work-around solutions is often costly and frustrating.
This also comes from the fact that these days I frequently have to think about how to calibrate and configure machines where direct access is not a safe assumption. A recent installation runs on 36 servers (18 primary machines and a mirrored 18 as backups) driving 32 projectors, LED screens, and a controller interface. The venue has no single vantage point from which all surfaces are visible; from the control room only about half of the display surfaces can be seen. The venue floor, where you have to calibrate the installation, is three floors below the server room. Additionally, the playback servers run headless and are only configured to output content rendered for mapping - which means these machines have no user interface at all. In this kind of installation there is simply no answer other than to consider how to calibrate machines from a remote location… you have to build an iPad app, use a custom touchOSC interface, or use a laptop to drive a calibration interface you've built in Touch.
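To give a sense of what that remote-control channel actually is under the hood: touchOSC and a Touch calibration UI are just exchanging OSC messages over UDP, which TouchDesigner receives with an OSC In CHOP/DAT. Here's a minimal sketch of hand-encoding one such message in plain Python - the address pattern, IP, and port are hypothetical, and in a real project you'd almost certainly use an OSC library or TouchDesigner's built-in OSC operators rather than rolling your own encoder.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded out to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    # Minimal OSC message: padded address, padded type-tag string
    # (",f" per float argument), then big-endian 32-bit floats.
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for v in floats:
        msg += struct.pack(">f", v)
    return msg

# Sending one calibration value over UDP (address / host / port are
# made up for illustration):
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(osc_message("/calib/corner1", 0.25, 0.75), ("10.0.0.12", 7000))
```

The point isn't the byte-packing - it's that the whole transport is tiny, stateless UDP packets, which is why a phone on the venue floor can drive a headless server three floors up without any of the screen-sharing overhead described above.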
In some cases you can cleverly re-use pieces of existing modules - we do this at Obscura with Stoner all the time; in other cases it's harder to get right (Camschnappr, I'm looking at you… you're so beautiful, but so hard to think about controlling remotely). Regardless, I think it's best practice to think through what your calibration process is going to look like - beginning to end. I'm also in the camp that thinks it's unreasonable to rely on screen-sharing tools to solve this problem. I'm sure some hold the opposite opinion, but I've made huge messes with VNC: trusting that value-ladder increments will work as I expect (they don't), that my screen is always updating (it often isn't, especially in full-screen mode), that where I'm clicking on my screen is where I want to click on the remote machine's screen (it works most of the time, and when it doesn't you're really in trouble), or expecting key commands to work the same way (depending on your tool and configuration, that ctrl+s may not have actually saved your toe file).
Worse than troubleshooting your own software is losing time to fix a "bug" that is really just an artifact of poor planning, slow screen refreshes, or bad behavior you've never seen (because you've always relied on having the network editor available at all times). While these might seem like nit-picky details and grumbles, they come from countless hours of frustration and troubleshooting that could have been avoided. The miseries I've visited upon myself are ones I try to steer others away from… at least when it's possible to do so.