AMD FreeSync demo

With AMD and Nvidia both offering monitors that support variable refresh rate (VRR) technology, the conversations around the internet about the two competing solutions – G-Sync from Nvidia and A-Sync/FreeSync from AMD – have turned into one big, lovey-dovey flame war, with fanbois in each camp claiming different things and calling each other shills. However, there is a legitimate reason to call G-Sync better than FreeSync, and it's backed up by actual science and a logical testing regimen. PC Perspective set about figuring out the differences between the two technologies this past week, and things get very interesting once you go beyond the brand and delve into the minute workings of what's going into your eyeballs.

Firstly, as a refresher: unlike a monitor with a fixed refresh rate of anywhere from 60 to 144Hz, a monitor supporting VRR alters its refresh rate to match the fps output of the graphics card, which means that it only updates and displays a new frame once that frame is good and ready. Both technologies do this by altering the blanking intervals, which tell the display how long to wait before the next refresh, but there are different approaches to how you get this working at all.
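If you want to picture how that blanking interval trick works, here's a rough back-of-the-napkin sketch in Python – with made-up panel timings, not any real monitor's numbers – showing how stretching the vertical blanking turns the GPU's frame time into an effective refresh rate:

```python
# Minimal sketch (not real driver or scaler code): stretching the vertical
# blanking interval so the panel's refresh tracks the GPU's frame rate.
# All panel timings below are hypothetical, illustrative numbers.

LINE_TIME_US = 11.1     # time to scan out one line of pixels (hypothetical)
ACTIVE_LINES = 1440     # visible lines per frame (hypothetical)
MIN_BLANK_LINES = 60    # minimum blanking the panel needs (hypothetical)

def refresh_interval_ms(extra_blank_lines):
    """Time for one full refresh: active scanout plus blanking."""
    total_lines = ACTIVE_LINES + MIN_BLANK_LINES + extra_blank_lines
    return total_lines * LINE_TIME_US / 1000.0

def extra_blank_lines_for(frame_time_ms):
    """Extra blanking lines needed to stretch the refresh to the GPU's frame time."""
    total_lines = frame_time_ms * 1000.0 / LINE_TIME_US
    return max(0, round(total_lines - ACTIVE_LINES - MIN_BLANK_LINES))

# A GPU rendering at ~60fps (16.7ms per frame) versus ~45fps (22.2ms per frame):
for fps in (60, 45):
    extra = extra_blank_lines_for(1000.0 / fps)
    hz = 1000.0 / refresh_interval_ms(extra)
    print(f"{fps}fps -> {extra} extra blanking lines -> {hz:.1f}Hz effective refresh")
```

Both camps manipulate the blanking interval in roughly this way; the difference, as described next, is which piece of silicon is in charge of it.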

In Nvidia’s case, custom-made silicon replaces the scaler inside your monitor and handles all of the frame timing behind the scenes to make VRR work. In AMD’s case, they worked with scaler and timing controller manufacturers to produce updated silicon that reports to the GPU that the monitor is VRR-capable and gives it a refresh rate window within which the tech functions.

In a nutshell, Nvidia’s solution is proprietary from the GPU all the way to the monitor, while AMD uses a standardised solution adopted by VESA as part of the DisplayPort 1.2a specification. But the question now is: is Nvidia’s solution better in all aspects, and not just when compared against the LG 34UM67 and its tiny 48-75Hz VRR window? As it turns out, the answer is “Yes”, but the better answer is “Yes it is, but it’s more complicated than that.”

PC Perspective used an old oscilloscope in the video linked above and explored what happens to the display and the resulting animation when the framerate drops below the VRR window. The thinking was that Nvidia was still using G-Sync to alter the display’s refresh rate, but it wouldn’t make sense to simply match the refresh rate at such low framerates, because you would end up with anomalies like image degradation and colours that look really weird.

This is in stark contrast to how Adaptive Sync/FreeSync currently functions. Below the VRR window, you can either have V-Sync on, which will drop the framerate even lower and introduce judder, or have V-Sync off and be stuck with tearing and hitching in the framerate, but in both scenarios the monitor will remain at its lowest supported refresh rate.
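To make that behaviour concrete, here's a tiny sketch of what that looks like in practice, assuming a hypothetical 40-144Hz VRR window rather than any specific panel's figures:

```python
# A tiny sketch of the Adaptive Sync/FreeSync behaviour described above,
# using a hypothetical 40-144Hz VRR window.

VRR_MIN_HZ, VRR_MAX_HZ = 40, 144

def freesync_behaviour(fps, vsync_on):
    if fps > VRR_MAX_HZ:
        return f"refresh capped at {VRR_MAX_HZ}Hz"
    if fps >= VRR_MIN_HZ:
        return f"refresh tracks the framerate at {fps}Hz"
    # Below the window the panel sits at the bottom of its range.
    if vsync_on:
        return f"panel stuck at {VRR_MIN_HZ}Hz, frames wait for a refresh -> judder"
    return f"panel stuck at {VRR_MIN_HZ}Hz, frames flip mid-refresh -> tearing"

for fps, vsync in ((90, True), (30, True), (30, False)):
    print(f"{fps}fps, V-Sync {'on' if vsync else 'off'}: {freesync_behaviour(fps, vsync)}")
```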

The video explains their testing methodology in more detail, but the graphs they drew up from their results are very interesting, particularly the one below.

FreeSync vs G-Sync refresh rate behaviour (graph: PC Perspective)

What you’re seeing here is a graph of how A-Sync and G-Sync behave with regards to refresh rate as the framerate drops to low levels. You can see that from 41Hz and up, both technologies work exactly as described: up to their maximum refresh rates, their VRR implementations alter the refresh rate to match your fps output.

However, below the VRR window with A-Sync/FreeSync, when you bottom out at around 40Hz the panels typically stay at that refresh rate and never go lower. This is a problem for games that frequently dip below your VRR window, because it introduces the graphical artifacts mentioned above. With G-Sync, things are a bit different.

As the framerate on the Acer XB270HU drops to 37fps, the G-Sync scaler kicks the refresh rate up to double that, around 74Hz. While the game may only be putting out a new frame at 37fps, each frame is displayed twice before being replaced by a new one, and the difference is so subtle that you don’t realise it’s happening. As the framerate drops even further to around 20fps, the refresh rate is still doubled to preserve animation smoothness. At 18fps, the scaler boosts the refresh rate by a factor of three, so the panel is now refreshing at around 54Hz, with the same frame being displayed three times before a new one arrives.

This trend continues, with the refresh rate being multiplied to compensate for the low framerate, until you hit about 9fps, where things would already be so bad that I don’t think anyone could enjoy that experience. What is interesting is that this is happening independently of the operating system – in other words, the G-Sync scaler is performing this function on its own, thanks to Nvidia tuning the scaler hardware for each monitor they put it into.
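To summarise the behaviour PC Perspective measured, here's a minimal sketch of that frame-multiplication logic, assuming a hypothetical 38-144Hz window – the real module's thresholds are tuned per panel and aren't public:

```python
# A minimal sketch of the frame-multiplication behaviour described above.
# The 38-144Hz window is a hypothetical stand-in; the real G-Sync module's
# thresholds are tuned for each panel and aren't published.

VRR_MIN_HZ, VRR_MAX_HZ = 38, 144

def gsync_refresh_for(fps):
    """Return (panel_refresh_hz, times_each_frame_is_shown)."""
    if fps >= VRR_MIN_HZ:
        return min(fps, VRR_MAX_HZ), 1
    # Below the window: show each frame 2x, 3x, 4x... until the resulting
    # refresh rate lands back inside the panel's supported range.
    repeats = 2
    while fps * repeats < VRR_MIN_HZ:
        repeats += 1
    return fps * repeats, repeats

for fps in (60, 37, 20, 18, 9):
    hz, n = gsync_refresh_for(fps)
    print(f"{fps:>2}fps -> panel refreshes at {hz}Hz, each frame shown {n}x")
```

Run against the framerates in the article, this reproduces the measured behaviour: 37fps becomes 74Hz with each frame shown twice, 20fps becomes 40Hz, and 18fps becomes 54Hz with each frame shown three times.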

Which one ultimately wins out?

For now, G-Sync is the superior solution if you’re playing on a G-Sync monitor with a graphics card that is somewhat weaker, allowing the framerate to dip down low when it can’t cope with what’s happening in the game. There are also more than a handful of G-Sync monitors available, and it works with three generations of Nvidia graphics cards – but it is limited to only those graphics cards. So it’s better, but still proprietary.

Adaptive Sync, on the other hand, can and probably will have this behaviour fixed in AMD’s drivers soon, with Intel surely already testing these things out to see how their graphics cards handle VRR. It is possible that AMD will end up implementing a fix in their drivers that works with the EDID settings reported by the monitor, telling the scaler to ignore the reported framerate and run the display at a specific refresh rate.

But that’s part of the problem – AMD will have to make sure that their drivers can properly talk to each and every monitor compatible with Adaptive Sync from now until the end of time. The pace of AMD’s driver development has been anything but frantic, and launching with such a particular piece of technology as variable refresh rates may mean that they’ll have to dedicate staff just to making sure that things work properly.

Of course, any monitor vendor can approach AMD to run their monitor through the FreeSync program, use the FreeSync logo and branding, and have it verified that the monitor works with AMD’s drivers – all free as in beer for these companies. Will it be enough to ensure a high level of quality and keep the market competitive? I don’t know, but my gut feeling says “probably not.” I expect this to get out of hand before Intel has to step in and save the day, but until then it will be a trying time for AMD as they make sure that their implementation of Adaptive Sync works just as well as G-Sync.

Source: PC Perspective
