I'm wondering how people are editing on their machines with broadcast monitoring devices. These introduce lag, and since FCPX doesn't have a desktop delay setting, either the FCPX GUI or the broadcast monitor ends up delayed.
When working alone I can just use a TV as a second monitor, but that's only a preview:
1. it doesn't say anything about the quality and color of the output, even on a calibrated display (RGB vs. YUV)
2. the A/V output introduces a slight delay too, so I end up using the second display only for viewers
3. the GUI viewers don't deal properly with various frame rates (PAL land here, can't speak for NTSC), since the displays run at 60 Hz
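To make point 3 concrete, here is a quick sketch (my own illustration, not anything FCPX-specific) of why 25 fps PAL material can't be shown cleanly on a 60 Hz display: each frame would need to be held for 60/25 = 2.4 refresh cycles, so in practice frames are held for alternating 2 and 3 refreshes, which reads as judder.

```python
def refresh_counts(fps: float, refresh_hz: float, frames: int) -> list:
    """For each source frame, how many display refreshes it is held for,
    assuming the player snaps each frame to the nearest refresh."""
    counts = []
    for i in range(1, frames + 1):
        # ideal cumulative refresh position after frame i
        target = i * refresh_hz / fps
        prev = (i - 1) * refresh_hz / fps
        counts.append(round(target) - round(prev))
    return counts

print(refresh_counts(25, 60, 5))  # uneven 2/3 hold pattern -> judder
print(refresh_counts(30, 60, 5))  # even: every frame held exactly 2 refreshes
```

On a display (or output device) genlocked to 50 Hz, 25 fps divides evenly and the cadence problem disappears, which is part of why a proper broadcast output matters in PAL land.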
I'd prefer a robust workstation setup, as with other NLEs, where the GUI and the output monitoring are in sync, even though there's a slight delay when you push keys. Of course no lag anywhere would be ideal, but I'm not sure that's possible.
Looking for tips and advice on your setups or workflows for dealing with these issues.
All LCD monitors have at least two frames of delay relative to the source audio.* Consumer TVs often have four frames or more. That's why the fancy home-theater AV receivers usually have an audio delay adjustment buried in their setup pages.
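For a sense of scale, here's the frames-to-milliseconds arithmetic behind those numbers (the frame counts are the illustrative figures above, not measurements of any particular monitor):

```python
def frames_to_ms(frames: int, fps: float) -> float:
    """Convert a delay expressed in frames to milliseconds at a given frame rate."""
    return frames * 1000.0 / fps

print(frames_to_ms(2, 25))     # 2 frames at PAL 25 fps   -> 80.0 ms
print(frames_to_ms(4, 25))     # 4 frames at PAL 25 fps   -> 160.0 ms
print(frames_to_ms(2, 29.97))  # 2 frames at NTSC 29.97   -> ~66.7 ms
```

Lip-sync errors become noticeable somewhere in the tens of milliseconds, so even the two-frame case is worth correcting.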
I get around the issue by monitoring the audio from my broadcast monitor's audio out (a Flanders Scientific BM23(?)). The monitor is fed an HD-SDI signal from an AJA T-Tap. The monitor de-embeds the audio from the SDI stream and adds the correct delay, so the audio from the picture monitor feeds my audio monitors and is always in sync.
Audio coming from the computer output is always a couple of frames ahead of the display, but I don't use that audio except to check the mix on the less-than-ideal computer monitor speakers (like checking a mix on Auratones back in the day...). The audio from my computer monitor is typically in sync with its video, or close enough.
* The Flanders Scientific monitors (and likely other pro brands) have a "fast mode," which reduces the delay at the expense of a slight decrease in picture quality.