Monitor Sync’ing – Let’s Figure It Out

Having looked at monitor refresh rate, another aspect to consider when selecting a monitor is what type of synchronization works best with the system's graphics card.

There are three major offerings for sync'ing the video signal output of a given GPU to the monitor: V-Sync, G-SYNC and FreeSync (they will also be dropping a dope album in 2019).

V-Sync (vertical synchronization) has been around for a long time, going all the way back to CRT (cathode ray tube) monitors. V-Sync was devised as a technology that attempts to match (synchronize) the frames per second of the video card with the refresh rate of the monitor. The biggest hurdle to getting this to work is that a given GPU's frame rate is in constant flux, which makes synchronization difficult: new frames become available before the monitor has finished drawing the previous one. Without synchronization, this leads to the phenomenon known as screen tearing, where the monitor swaps to a new frame mid-refresh and ends up drawing part of one frame and part of the next. V-Sync evolved to use a buffering system so that the monitor only grabs frames that are complete: if the video card is rendering frames faster than they can be displayed, it puts a finished frame in a buffer and starts working on the next frame while the monitor displays the buffered one. This works well when the GPU fps is outpacing the monitor refresh rate. It breaks down (as most of these technologies do) when the frames per second drop below the refresh rate. It also quantizes the displayed frame rate to even divisors of the monitor's refresh rate (60 fps, 30 fps, 20 fps, etc., on a 60 Hz panel). (This post on HardForum really gets into the nitty gritty of V-Sync.)
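That quantization effect is easy to see with a little arithmetic. Here's a minimal sketch (the 60 Hz default and the helper name are my own illustration, not anything from a real driver API), assuming classic double-buffered V-Sync where a finished frame must wait for the next refresh tick:

```python
import math

def vsync_fps(render_ms: float, refresh_hz: float = 60.0) -> float:
    """Effective frame rate under double-buffered V-Sync.

    A finished frame waits for the next refresh tick, so the
    displayed frame time is rounded up to a whole number of
    refresh intervals -- which quantizes fps to refresh_hz / n.
    """
    interval_ms = 1000.0 / refresh_hz
    displayed_ms = math.ceil(render_ms / interval_ms) * interval_ms
    return 1000.0 / displayed_ms

# On a 60 Hz panel, a 10 ms frame still displays at 60 fps,
# but a 20 ms frame (just missing one refresh) drops to 30 fps,
# and a 40 ms frame drops to 20 fps -- nothing in between.
```

This is exactly the "breaks down when fps drops below the refresh rate" problem: miss one refresh by a millisecond and your displayed frame rate falls off a cliff.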

NVIDIA’s G-SYNC and AMD’s FreeSync look to alleviate the problems plaguing V-Sync in two different ways.

NVIDIA's G-SYNC solution is hardware based, meaning work is done both on the GPU side and the monitor side. A G-SYNC capable monitor has an NVIDIA chip that communicates with the NVIDIA GPU and syncs the refresh rate of the monitor to the fps of the GPU. The usable range is still capped by the monitor's specs, but the ability to modify the refresh rate dynamically provides a noticeable image quality improvement. Screen tearing and input lag (the time between moving the mouse or hitting a key and the result appearing on screen) both improve because the display refreshes as soon as each frame is ready. (The input lag difference probably only matters to high-level professional gamers, but we all think we're Pros, so…) It does increase the price of the monitor, given that the manufacturer has to include extra hardware. (And as stated earlier, as frame rates drop, any syncing technology suffers.)

AMD's FreeSync technology is GPU-only and therefore doesn't require any extra hardware on the monitor end (but does require that the DisplayPort input be used). FreeSync takes advantage of the "Adaptive-Sync" feature that VESA has built into the DisplayPort standard. From the VESA website: "DisplayPort Adaptive-Sync enables the display to dynamically match a GPU's rendering rate, on a frame-by-frame basis, to produce a smoother, low latency, gaming experience." An AMD Radeon GPU is required to utilize FreeSync capabilities (just as an NVIDIA GPU is required for G-SYNC), but FreeSync offers a wider range of monitors able to take advantage of the adaptive synchronization.
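The adaptive approach behind both G-SYNC and FreeSync flips the V-Sync relationship around: the display waits for the GPU instead of the GPU waiting for the display. A minimal sketch of the idea (the 40–144 Hz range here is an assumed example; real panels publish their own supported range, and behavior outside that range varies by implementation):

```python
def adaptive_refresh_hz(render_ms: float,
                        min_hz: float = 40.0,
                        max_hz: float = 144.0) -> float:
    """Refresh rate a variable-refresh display would use for one frame.

    The panel refreshes the moment the frame is ready, so the refresh
    rate tracks the GPU's frame time frame-by-frame -- clamped to the
    panel's supported range (outside it, fallback behavior kicks in).
    """
    implied_hz = 1000.0 / render_ms
    return max(min_hz, min(max_hz, implied_hz))

# A 10 ms frame is shown at 100 Hz, a 12 ms frame at ~83 Hz --
# no tearing and no snapping down to a fixed divisor of 60.
```

The payoff is in the middle of the range: frame times that would have caused V-Sync's 60-to-30 fps cliff instead produce a smooth, proportional refresh rate.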

When making a final decision on which monitor is the best option, cost and technology preference are the two deciding factors. By most accounts, NVIDIA's G-SYNC offering edges out FreeSync in terms of performance, especially at the high end. That performance comes at a price, however, as the extra hardware added to G-SYNC monitors increases their price. Without first-hand experience, I am not in a position to recommend either. I started researching components for a new PC intending to use an NVIDIA GPU, but now I am being swayed into AMD's camp by the cheaper, more diverse FreeSync monitor options.

Either way, your current setup and your future plans for it will dictate your monitor choice.

David vs. Goliath – Round #?

This news of AMD taking Intel to court is getting lots of air-time everywhere. I'm personally an AMD guy; I've used AMD in all my PC builds (Full Disclosure: I also own AMD stock), so I'm excited and nervous about this news. Excited, because I'd love to see AMD continue to grow. It'd be great to see a Dell without an "Intel Inside" sticker on it. I'd love to see the company I have a vested interest in succeed on a larger scale. But this makes me nervous because AMD may alienate the people it wants future business from.

Nevertheless, reading some of this case does make it look like something is amiss with Intel’s “practices”:

AMD’s complaint lists examples of what it characterizes as bribes, threats or intimidation by Intel involving 12 computer makers, nine distributors and 17 retailers.

One example noted that Gateway paid a hefty price for its limited dealings with AMD, with its executives conceding that Intel “had beaten them into guacamole.”


In discussions about buying from AMD, “Dell executives have frankly conceded that they must financially account for Intel retribution in negotiating pricing from AMD,” the suit alleges.

This will be interesting to watch because the ramifications are huge.

One thing that should be pointed out is that this AMD/Intel bickering has been going on for a while, both here in the US and overseas. Intel may already be in trouble in Japan, and possibly the EU, too. The case here in the States isn't something that just happened out of the blue.

I hope AMD has a solid, legitimate case. Crying "Intel isn't playing fair!" because they feel they are losing ground is something I hope they're not doing. Life, especially in a free market, isn't fair. But, and that's a big but, if Intel isn't "playing fair," it should be held responsible within the bounds of the law.