Multi-monitor, high-performance card

TouchDesigner seems to be much pickier than other applications when it comes to graphics cards in multi-monitor setups. I'm in the process of spec'ing a new machine for use with TD, and I thought I'd point out how cloudy the information is around the forum and wiki. There is very little information on how well ATI cards are supported; in fact, some parts of the wiki will tell you that they aren't supported at all. If the wiki isn't updated frequently with hardware compatibility, then a sticky thread would really help, with definitive test results, user-experience comments, etc.
What I've gathered in an hour or so (please correct/clarify):
- ATI is supported (with equivalent performance? Does it use OpenCL rather than CUDA? Please give us some benchmarks), but it is a bit buggy on my 6870.
- GeForce only works consistently without tearing on a single display (?!).
- SLI has no advantages.
- That pretty much leaves us with Quadro. And if you need more than three outputs, you need to buy multiple Quadros (wow), which may or may not have to be from the same series. I'm clearly confused, and frankly surprised. If the software is as picky as it sounds on the forum, then the compatibility chart really needs to be accurate and up to date to save us from wasting our money and pulling out our hair. Am I just looking in the wrong places?

Hi . . . yeah, I would tend to agree: there seems to be a frustrating lack of definitive information . . . mostly it is guidelines and suggestions for experiments. I think this is mainly because there just aren't that many people doing what we are trying to do, and the individual uses are so different that it is hard to define absolute, general-purpose solutions. I also believe that those who have solved these issues aren't ready to share, because the solutions are competitive advantages.

Regarding TouchDesigner being more picky, I don't think that is the case. Other video-server solutions, such as Watchout or Uniview, require a dedicated server for every one or two outputs. Most off-the-shelf VJ programs only support a few outputs reliably. Games are a different story, since each game engine can be optimized to run reliably and consistently within the bounds of the Nvidia cards. And look at the number of individual game optimizations that ship with updated Nvidia drivers . . . to me that says a lot of special-purpose code is being developed to get games to run fast and clean. That seems almost impossible given the wide range of OpenGL support needed for a general-purpose tool like TD. Relatedly, games tend to use a fairly small subset of the hardware rendering on the GPU.

Regarding tear-free performance on multiple displays and cards, I can tell you that it is possible to run at least three cards and six displays in one server with no tearing. The problem, though, is that rendering has glitches . . . it doesn't matter how light the scene is, or whether it is video or real-time 3D, you will still see frame drops. From what I've heard, this is because a GeForce card does not guarantee that every rendered frame will actually make it to the display. Quadros are supposed to guarantee that it happens.
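For what it's worth, one generic way to spot frame drops like these is to log present timestamps in your render loop and flag any interval that is much longer than the target frame period. A minimal sketch, nothing TD- or driver-specific; the function name, the sample data, and the 1.5x slack threshold are my own assumptions:

```python
# Sketch: detecting dropped frames from a log of frame-present timestamps.
# The timestamps here are synthetic; in practice they would come from
# your render loop (e.g. time recorded at each buffer swap).
def count_dropped_frames(timestamps, fps=60.0, slack=1.5):
    """Count intervals longer than `slack` times the target frame period."""
    period = 1.0 / fps
    drops = 0
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev > slack * period:
            drops += 1
    return drops

# A 60 Hz stream with one frame missing: a ~33 ms gap instead of ~16.7 ms.
ts = [i / 60.0 for i in range(10)]
del ts[5]
print(count_dropped_frames(ts))  # the one long interval -> 1
```

This won't tell you *why* a frame was lost, but logging like this at least makes the glitches measurable instead of anecdotal.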

So I think that leaves us with Quadro solutions. It seems that SLI Mosaic is the way to drive multiple cards. Nvidia has some documentation on using Mosaic; there seem to be a lot of limitations, but also some nice performance gains and possibilities.

FYI, I've only ever used Nvidia GeForce solutions, so all my comments are based on Nvidia. From what I know, ATI has not supported a full implementation of OpenGL (just what is needed for games), so it has not been an option until recently. A further note: I'll be buying several Quadro cards soon and will report back on my experience.

Jeff

Jeff,

Let me know what differences you find between the Quadro cards and the GeForce. I just ordered a new laptop with the GeForce 680M that I'm very excited about, so I'll post my results once it comes in.

Hi Elburz,

Question: how many external monitors can you drive simultaneously on your GTX 680M?

Cheers,

M