Monitor Sync’ing – Let’s Figure It Out

Having looked at monitor refresh rate, the next aspect to consider when selecting a monitor is which type of synchronization works best with the system’s graphics card.

There are three major offerings for sync’ing the video signal output of a given GPU to the monitor: V-Sync, G-SYNC and FreeSync (they will also be dropping a dope album in 2019).

V-Sync (vertical synchronization) has been around for a long time, going all the way back to CRT (cathode ray tube) monitors. V-Sync was devised as a technology that attempts to match (synchronize) the frames per second of the video card with the refresh rate of the monitor. The biggest hurdle to getting this to work is that a given GPU’s frame rate is in constant flux, which makes synchronization difficult: new frames become available before the monitor has finished drawing the previous one. Without synchronization, this mismatch leads to the phenomenon known as screen tearing, where the GPU swaps in a new frame mid-refresh and the monitor ends up drawing half of one frame and half of the next. V-Sync evolved to use a buffering system so that the monitor only grabs frames that are complete: if the video card is rendering faster than the monitor’s refresh rate, it places a finished frame in a buffer and starts working on the next frame while the monitor displays the buffered one. This works well when the GPU’s fps is outpacing the monitor’s refresh rate. It breaks down (as most of these technologies do) when the frames per second drop below the refresh rate. Because a frame can only be presented on a refresh boundary, the effective frame rate is also limited to whole-number divisors of the monitor’s refresh rate (60 fps, 30 fps, 20 fps, etc. on a 60 Hz monitor). (This post on HardForum really gets into the nitty gritty of V-Sync.)
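That divisor behavior can be sketched with a tiny model (the function name is mine, not from any driver API, and this deliberately ignores triple buffering and other driver-level details):

```python
import math

def effective_vsync_fps(refresh_hz, gpu_fps):
    """Simplified model of double-buffered V-Sync: a finished frame can
    only be shown on a vblank boundary, so each frame stays on screen
    for a whole number of refresh intervals."""
    # How many refresh intervals does one frame take to render?
    # (render time / refresh interval) == refresh_hz / gpu_fps
    intervals_per_frame = math.ceil(refresh_hz / gpu_fps)
    return refresh_hz / intervals_per_frame

# On a 60 Hz monitor:
print(effective_vsync_fps(60, 120))  # GPU outpaces the monitor -> capped at 60.0
print(effective_vsync_fps(60, 59))   # just misses each vblank -> drops to 30.0
print(effective_vsync_fps(60, 25))   # -> 20.0 (always a divisor of 60)
```

Note the cliff at 59 fps: missing a vblank by even one frame per second halves the displayed rate, which is exactly the failure mode G-SYNC and FreeSync set out to fix.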

NVIDIA’s G-SYNC and AMD’s FreeSync look to alleviate the problems plaguing V-Sync in two different ways.

NVIDIA’s G-SYNC solution is hardware based, meaning work is done both on the GPU side and the monitor side. A G-SYNC capable monitor contains an NVIDIA chip that communicates with the NVIDIA GPU and syncs the monitor’s refresh rate to the GPU’s frame rate. This obviously caps the possible fps at the monitor’s maximum refresh rate, but the ability to modify the refresh rate dynamically provides a noticeable image quality improvement. Screen tearing is eliminated, and input lag (the time between moving the mouse or hitting a key and seeing the result on screen) improves because frames are displayed as soon as they are ready. (This probably only matters to high-level professional gamers, but we all think we’re Pros, so…) It does increase the price of the monitor, given that the monitor manufacturer has to include the extra hardware. (As stated earlier, as frame rates drop, any syncing technology suffers.)

AMD’s FreeSync technology is GPU-only and therefore doesn’t require any extra hardware on the monitor end (but does require that the DisplayPort input be used). FreeSync takes advantage of the “Adaptive-Sync” capability that VESA has built into the DisplayPort standard. From the VESA website: “DisplayPort Adaptive-Sync enables the display to dynamically match a GPU’s rendering rate, on a frame-by-frame basis, to produce a smoother, low latency, gaming experience.” An AMD Radeon GPU is required to utilize FreeSync capabilities (just like NVIDIA hardware is required for G-SYNC), but FreeSync offers a wider range of monitors able to take advantage of adaptive synchronization.
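As a rough contrast with V-Sync’s fixed-interval behavior, the adaptive approach can be sketched as the display simply following the GPU within its supported refresh range (the 48–144 Hz range and the function name here are illustrative assumptions, not taken from any spec):

```python
def adaptive_sync_fps(gpu_fps, min_hz=48, max_hz=144):
    """Simplified adaptive-sync model: the display refreshes whenever a
    frame is ready, so the displayed rate tracks the GPU frame-by-frame,
    clamped to the panel's supported variable-refresh range. (Real
    implementations handle drops below min_hz by repeating frames.)"""
    return max(min_hz, min(max_hz, gpu_fps))

print(adaptive_sync_fps(90))   # tracks the GPU exactly -> 90
print(adaptive_sync_fps(59))   # no cliff down to 30 -> 59
print(adaptive_sync_fps(200))  # above the panel's range -> capped at 144
```

The key difference from the V-Sync model: at 59 fps the display shows 59 frames per second instead of collapsing to 30.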

When making a final decision on which monitor is the best option, cost and technology preference are the two deciding factors. By most accounts, NVIDIA’s G-SYNC offering edges out FreeSync in terms of performance, especially at the high end. That performance comes at a price, however, as the extra hardware added to G-SYNC monitors increases their price. Without first-hand experience, I am not in a position to recommend either. I started the process of researching components for a new PC with the intention of using an NVIDIA GPU, but now I am being swayed into AMD’s camp by the cheaper, more diverse FreeSync monitor options.

Either way, your current setup and your future plans for it will dictate your monitor choice.
