AMD FreeSync Technology Review

by Tim Harmer, 19-03-15
Why is Frame Rate Syncing Important?

FreeSync and similar frame synchronisation technologies are born from the same underlying issue – a mismatch between the normally fixed refresh rate of a monitor and the variable frame rate at which a GPU renders. We’ve all experienced it – tearing across a scene, or something we’ve just perceived as odd bands running up and down the screen. Many have learned to ignore it, but it’s still there, waiting for the unsuspecting and breaking immersion when we least expect it.

V-SYNC is the typical means by which we overcome the frame-rate/refresh-rate mismatch. By artificially limiting the GPU’s frame rate to the refresh rate of the monitor – usually 60Hz/60fps for historical reasons – tearing is eliminated. However, there are still downsides:

Latency – A game doesn’t just pause whilst the monitor waits for the GPU to render a new complete frame; you’re still playing, and potentially important information isn’t being presented to you. For this reason many competitive gamers won’t play with V-SYNC, instead accepting tearing as the price of low latency.

Stuttering – What happens when the GPU can’t keep up with the 60fps needed for proper synchronisation? A frame is repeated, visually obvious as a stutter on the screen if it only happens for a frame or two. If it happens for longer, the viewable frame rate effectively defaults to 30fps, with odd moments where motion briefly appears smooth again. 30fps may be fine for consoles, but this is PC gaming.
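The arithmetic behind that drop can be sketched in a few lines. This is an illustrative model only, not driver code: with V-SYNC, a finished frame can only be shown at the next vertical blank, so any render time over one refresh interval costs a whole extra interval.

```python
# Minimal sketch (illustrative, not vendor code): how V-SYNC quantises
# frame delivery to whole refresh intervals.
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.67 ms per refresh at 60Hz

def effective_fps(render_ms: float) -> float:
    """Frame rate seen on screen when every frame takes render_ms to draw."""
    # A frame missing one vblank must wait for the next; round up to
    # the number of whole refresh intervals each frame occupies.
    intervals = max(1, math.ceil(render_ms / INTERVAL_MS))
    return REFRESH_HZ / intervals

print(effective_fps(10))  # 60.0 -> GPU keeps up, full 60fps
print(effective_fps(20))  # 30.0 -> just missed the vblank, halves to 30fps
print(effective_fps(40))  # 20.0 -> drops again, to 20fps
```

Note the cliff edge: a frame taking 17ms instead of 16ms is only 6% slower to render, yet the on-screen rate halves from 60fps to 30fps.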

So, what if you could tell the monitor to update whenever there is a new frame, effectively changing the refresh rate of the panel on the fly? Well, it turns out that the groundwork for this had already been laid by VESA in the early days of the LCD panel: the V-Blank portion of the monitor specification was defined as potentially variable rather than fixed.
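The idea can be captured in a toy model. This is a hedged sketch, not the VESA Adaptive-Sync specification: with a variable refresh, the interval between screen updates simply tracks each frame’s render time, clamped to whatever range the panel hardware supports (the 40–144Hz range below is a hypothetical example).

```python
# Minimal sketch (illustrative, not the Adaptive-Sync spec): the panel
# updates when a frame is ready, within its supported refresh range.
MIN_HZ, MAX_HZ = 40, 144             # hypothetical panel refresh range
MIN_MS = 1000 / MAX_HZ               # shortest allowed interval (~6.9 ms)
MAX_MS = 1000 / MIN_HZ               # longest allowed interval (25 ms)

def present_interval(render_ms: float) -> float:
    """Time between screen updates: the frame's render time, clamped
    to what the panel can physically sustain."""
    return min(max(render_ms, MIN_MS), MAX_MS)

# A 20 ms frame is shown after exactly 20 ms (an effective 50Hz),
# rather than being held for a second 60Hz interval -- no repeated
# frame, no stutter.
print(present_interval(20))  # 20.0
```

Contrast this with the V-SYNC case: the same 20ms frame that previously forced a drop to 30fps is now displayed the moment it is ready.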

Another example of tearing, this time in Bioshock Infinite

There is essentially no reason that an LCD panel should have a fixed refresh rate. The 60Hz design is an artefact of cathode ray tube monitors and televisions, which used a beam of electrons to illuminate phosphorescent pixels; this process was easiest if it was (ironically) synced to the frequency of the mains supply – 60Hz in North America, 50Hz in Europe. An LCD panel may need a minimum refresh rate so that no damage is caused to the display components, but beyond that fixed refresh rates are mainly down to convention and ease of implementation.

Nvidia announced G-SYNC in 2013, taking a rather extreme approach to synchronising frame and refresh rates. The current implementation replaces the monitor’s scaler with a proprietary ‘G-SYNC Module’, which communicates via DisplayPort with an Nvidia Kepler or Maxwell-class GPU only. The module continuously polls the GPU for a new frame, holding the display static until it gets the go-ahead that a new frame is fully rendered and available. This also gives Nvidia significantly more control over the way a frame is rolled out onto the screen, obviating the need for a top-to-bottom refresh pattern in favour of something more exotic. Nvidia also builds additional proprietary modes into the module which go beyond frame-rate syncing, including low-persistence modes (periodic pulsing of the backlight to reduce ghosting) and 3D Vision. G-SYNC is currently integrated into certain models from Acer, BenQ, Asus and Philips.

Unfortunately, Nvidia’s implementation immediately locks out non-Nvidia graphics hardware, whilst also increasing the cost of monitor hardware to the consumer. Estimates put the cost of a G-SYNC module in the ballpark of $100, which is $100 you’re not spending on another part of your system. Furthermore, you’re locked into Nvidia graphics if you ever want to take advantage of this feature, which is problematic for many. Alternatives were therefore bound to be proposed.
