Monitor Sync’ing – Let’s Figure It Out

Having looked at monitor refresh rate, another aspect to consider when selecting a monitor is what type of synchronization works best with the system’s graphics card.

There are three major offerings for sync’ing the video signal output of a given GPU to the monitor: V-Sync, G-SYNC and FreeSync (they will also be dropping a dope album in 2019).

V-Sync (vertical synchronization) has been around for a long time, going all the way back to CRT (cathode ray tube) monitors. V-Sync was devised as a technology that attempts to match (synchronize) the frames per second coming out of the video card with the refresh rate of the monitor. The biggest hurdle to getting this to work is that a given GPU’s frame rate is in constant flux, making this synchronization difficult. Without synchronization, new frames become available before the monitor has finished drawing the previous one, and the swap to the new frame happens mid-refresh. That mismatch produces the phenomenon known as screen tearing: the monitor can’t keep up with the GPU output and ends up drawing part of one frame and part of the next in a single refresh. V-Sync evolved to use a buffering system so that the monitor only grabs frames that are complete (if the video card is faster than the refresh rate of the monitor, drawing frames faster than they can be displayed, it parks a finished frame in a buffer and starts working on the next frame while the monitor displays the buffered one). This works well when the GPU fps is outpacing the monitor refresh rate. It breaks down (as most of these technologies do) when the frames per second drop below the refresh rate. Because a finished frame has to wait for the next refresh, the effective frame rate is also limited to whole-number divisors of the monitor refresh rate (60 fps, 30 fps, 20 fps, etc. on a 60 Hz panel). (This post on HardForum really gets into the nitty gritty of V-Sync.)
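That “whole-number divisors” behavior is easier to see with a little arithmetic. This is my own illustrative sketch, not real driver code: with classic double-buffered V-Sync, a finished frame can only be swapped in on a refresh boundary, so the effective frame rate snaps down to the refresh rate divided by a whole number.

```python
import math

def vsync_effective_fps(render_ms: float, refresh_hz: float = 60.0) -> float:
    """Effective fps when every finished frame must wait for the next refresh tick."""
    refresh_interval_ms = 1000.0 / refresh_hz
    # Number of whole refresh intervals each frame occupies (at least one).
    intervals = max(1, math.ceil(render_ms / refresh_interval_ms))
    return refresh_hz / intervals

# A GPU rendering a frame in 10 ms (100 fps raw) is capped at the panel's 60 fps:
print(vsync_effective_fps(10))   # 60.0
# One that takes 17 ms (just missing the 16.7 ms budget) falls all the way to 30:
print(vsync_effective_fps(17))   # 30.0
# And a 40 ms frame lands on the next divisor down:
print(vsync_effective_fps(40))   # 20.0
```

That cliff from 60 straight down to 30 fps when a frame barely misses the refresh window is exactly the “breaks down when fps drops below the refresh rate” problem described above.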

NVIDIA’s G-SYNC and AMD’s FreeSync look to alleviate the problems plaguing V-Sync in two different ways.

NVIDIA’s G-SYNC solution is hardware based, meaning work is done on both the GPU side and the monitor side. A G-SYNC capable monitor has an NVIDIA chip that communicates with the NVIDIA GPU and syncs the refresh rate of the monitor to the fps of the GPU. This obviously caps the possible fps at the monitor’s specs, but the ability to modify the refresh rate dynamically provides a noticeable image quality improvement. Screen tearing goes away, and input lag (the delay between moving the mouse or hitting a key and seeing the result on screen) improves thanks to the dynamic refresh. (The input lag part probably only matters to high-level professional gamers, but we all think we’re Pros, so…) It does increase the price of the monitor, given that the monitor manufacturer has to include extra hardware. (And as stated earlier, as frame rates drop, every syncing technology suffers.)

AMD’s FreeSync technology is GPU-only and therefore doesn’t require any extra hardware on the monitor end (but it does require that the DisplayPort input be used). FreeSync takes advantage of the “Adaptive-Sync” feature that VESA has built into the DisplayPort standard. From the VESA website: “DisplayPort Adaptive-Sync enables the display to dynamically match a GPU’s rendering rate, on a frame-by-frame basis, to produce a smoother, low latency, gaming experience.” An AMD Radeon GPU is required to utilize FreeSync capabilities (just like NVIDIA and G-SYNC), but FreeSync offers a wider range of monitors able to take advantage of the adaptive synchronization.

When making a final decision on which monitor is the best option, cost and technology preference are the two deciding factors. By most accounts, NVIDIA’s G-SYNC offering edges out FreeSync in terms of performance, especially at the high end. That performance comes at a price, however, as the extra hardware added to G-SYNC monitors drives up their cost. Without first-hand experience, I am not in a position to recommend either. I started researching components for a new PC with the intention of using an NVIDIA GPU, but now I am being swayed into AMD’s camp by the cheaper, more diverse FreeSync monitor options.

Either way, the current setup and future plans for your setup will dictate your monitor choice.

Monitor Refresh Rate – Let’s Figure it Out

(I am in the process of doing the research before building a new PC. As I work through this research, I’m going to post some of the more useful/interesting tidbits I find along the way. This is the first of a series of quick posts.)

The plan is to build a moderately beefy computer. I’m not going the budget route, but I’m also not going the all-out, 4k-capable, 100fps ultra-settings PC, either. I think I’ll use the monitor as a starting point to build the system around. I’m selecting a 1440p monitor and will select components that will give me the flexibility to upgrade to the 4k realm if I so desire down the road. For now, I think 2560×1440 will be plenty of pixels.

It’s been a long time since I’ve bought a new monitor (I am still rocking this solid but definitely long-in-the-tooth Lenovo ThinkVision 24″ beauty pictured above) so I haven’t done any serious monitor research in over eight years. As you can see, the refresh rate on my ThinkVision is 60Hz. Back when I got it, no one fretted over refresh rates. It was all about resolution, and I specifically chose that model for its 1920×1200 resolution. Those extra 120 rows of pixels turned me into a resolution snob. Oh you play at 1080p? That’s cute. My monitor eats 1080p’s for breakfast.

So since I’m not going for those ultra-high 4k settings, I’m going to look for a solid 2560×1440 monitor with a good refresh rate.

In simple terms, the refresh rate is how many times per second the monitor can redraw the screen. It is not the same thing as frames per second (that’s how many graphical images [frames] your graphics card can pump out each second), but you need a monitor that can at least keep up with the power of your graphics card. If your graphics card is easily humming along at 100 fps, that old 60 Hz monitor isn’t going to keep up and you’re losing graphical “quality”. Keeping the monitor’s refresh rate as close as possible to the frames per second being pumped out by the GPU is essential to making the most of your system’s power. How fast you need to go is really a personal preference. This article I found at Digital Trends sums up what the majority of my research has shown:
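To put a number on “isn’t going to keep up”: a panel refreshing at a given rate can show at most that many distinct frames per second, so anything the GPU renders beyond it is simply never displayed. A quick sketch (my own illustrative numbers, not a benchmark):

```python
def wasted_frames_per_second(gpu_fps: float, refresh_hz: float) -> float:
    """Frames the GPU renders each second that the monitor can never show."""
    return max(0.0, gpu_fps - refresh_hz)

# A 100 fps GPU on my old 60 Hz ThinkVision throws away 40 frames every second:
print(wasted_frames_per_second(100, 60))   # 40.0
# The same GPU on a 144 Hz panel wastes nothing:
print(wasted_frames_per_second(100, 144))  # 0.0
```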

In short, if you’re a gamer, we’d argue that you would see a greater, more obvious benefit from switching to a high-refresh rate monitor than you would in upgrading to 4K — as doing both can get inordinately expensive and taxing on your hardware. 120Hz or 144Hz displays make for smoother, tear-free gaming with less input lag.

The way it looks is this: the jump from 60Hz (where I’m currently at) to anything over 100Hz is a marked improvement. The sweet spot seems to be around 144Hz. There are newer screens coming out that top out at 240Hz, but now we’re in the realm of diminishing returns. The difference going from 60 to 120Hz is drastic. The difference from 120 to 240 is only going to be perceptible to the trained eye (that means not me).
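The diminishing returns are easy to show in milliseconds. Each refresh-rate jump shaves less off the per-frame interval than the one before it (my own back-of-the-envelope math):

```python
def frame_interval_ms(hz: float) -> float:
    """Time between refreshes, in milliseconds."""
    return 1000.0 / hz

# How much each upgrade actually shortens the wait between frames:
for low, high in [(60, 120), (120, 144), (144, 240)]:
    saved = frame_interval_ms(low) - frame_interval_ms(high)
    print(f"{low}Hz -> {high}Hz saves {saved:.1f} ms per frame")
```

Going 60 to 120Hz cuts roughly 8.3 ms off every frame; going 144 to 240Hz only buys back about 2.8 ms, which is why the high end is squarely trained-eye territory.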

So we’re looking for a monitor with a 144 Hz refresh rate, but that’s not all. There are still things like response time, sync’ing, and whether curvature is all it’s cracked up to be.