It has been quite some time since AMD announced that they were working on Project FreeSync, a cheaper, standardised alternative to Nvidia’s G-Sync technology. Both are similar implementations of variable refresh rates, with FreeSync built on VESA’s Adaptive Sync (A-Sync) standard, which matches the refresh rate of your display to the frame rate of the game or application you’re currently running. This gives you a stutter-free image while removing the input lag induced by V-Sync, the technique we’ve relied on for decades to keep displays looking smooth and free of tearing artifacts. Let’s take an overall look at what AMD is promising today.
Firstly, FreeSync isn’t a technology developed from scratch by AMD. It is based on the DisplayPort 1.2a+ standard that AMD asked VESA to adopt last year, which uses the variable refresh capability of embedded DisplayPort (eDP) in laptops to alter the display’s Vblank timing, the blanking interval that determines when the display’s timing controller forces a flush and refresh. For now, DP 1.2a+ compatibility is optional for monitor manufacturers, and it won’t be a feature of all monitors moving forward: only the ones that have DP 1.2a-compliant scalers and DisplayPort 1.2a ports will support it. Monitors based on D-Sub (VGA), DVI, HDMI and Thunderbolt ports will miss out.
FreeSync is also interchangeable with Adaptive Sync for this release; FreeSync itself is just a marketing drive by AMD to incentivise monitor manufacturers to offer the capability on new monitors. The FreeSync program also includes the option to have AMD certify monitors as FreeSync capable, which will allow vendors to put the “FreeSync Ready” logo on their boxes and in their promotional material. So whether I’m writing about FreeSync, A-Sync, DP 1.2a+ or VRR (Variable Refresh Rate), it’s really the same underlying technology under a different name.
With today’s release, the latest drivers for AMD’s Catalyst software should have been made available, but things aren’t there just yet. Newer drivers might only be released later this week, or even later this month, so if you happen to grab one of the compatible monitors and already have a compatible GPU to go with it, you’ll have to be patient for just a little longer before you finally see what all the fuss is about.
Like G-Sync, A-Sync adjusts the refresh rate of the monitor on the fly by altering the Vblank interval, which tells the scaler and timing controller (Tcon) to force a refresh and flush out the display’s internal buffer. Unlike V-Sync, which fills the time between completed frames by redrawing the last frame, A-Sync tells the display to hold the current image and wait for the next frame to be ready before issuing a redraw. This means that unlike V-Sync, which often makes the next frame wait for the current one to finish, A-Sync holds the image and refreshes the moment new data is available, which solves the problem of mouse lag: as soon as a frame is done, it is immediately pushed through the display chain.
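To make that difference concrete, here’s a toy sketch (my own simplification, not AMD’s or Nvidia’s actual pipeline, and the frame render times are hypothetical) of when finished frames actually reach the screen on a fixed 60Hz V-Sync display versus a variable refresh display:

```python
# Toy model: compare when frames appear on screen under fixed-refresh
# V-Sync versus a variable-refresh (A-Sync-style) display.
# Render times below are hypothetical, in milliseconds.

REFRESH_MS = 1000 / 60  # a fixed 60Hz display refreshes every ~16.7ms

def vsync_display_times(render_ms):
    """With V-Sync, a finished frame waits for the next fixed refresh tick."""
    shown, t = [], 0.0
    for r in render_ms:
        t += r                          # the frame finishes rendering here
        ticks = -(-t // REFRESH_MS)     # ceiling-divide to the next tick
        shown.append(ticks * REFRESH_MS)
    return shown

def async_display_times(render_ms):
    """With variable refresh, the display refreshes when the frame is ready."""
    shown, t = [], 0.0
    for r in render_ms:
        t += r
        shown.append(t)                 # no waiting for a fixed tick
    return shown

frames = [12.0, 20.0, 15.0]             # hypothetical render times
print(vsync_display_times(frames))      # frames snap to ~16.7ms ticks
print(async_display_times(frames))      # frames appear as soon as they finish
```

The point of the sketch is the gap between the two lists: under V-Sync each frame is delayed to the next tick, which is exactly the lag A-Sync removes by moving the tick to the frame.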
Of course, there are drawbacks to doing this, and Nvidia designed their own scaler to account for some of them while gaining a greater degree of control over the display chain. If the game engine stutters and delivers frames with wildly varying completion times, you might experience stuttering if the display’s panel isn’t fast enough to keep up with the changing refresh rates. If the GPU drivers hiccup and drop frames from the display chain, that manifests as hitching, because the next available frame will be completely different. And if your game has sections where the framerate drops to zero (level loads, cut-scene changes, that sort of thing), the Tcon will simply display a black screen until a new frame is available.
Despite these and other minor issues, though, the fact that we have this technology available today is quite awesome. A-Sync relies on the display’s scaler to do a lot of the magic behind the scenes, so it could be supported by Intel’s integrated graphics and even Nvidia’s cards, provided they have the ability to alter Vblank intervals. Personally, I think that will only be possible with GPUs based on the Maxwell architecture, because Nvidia is already beginning to support A-Sync with mobile Maxwell GPUs in a select array of notebooks with the right scalers and panels in their displays.
That’s possibly why AMD claims that there is more consistency with A-Sync than with G-Sync. Looking at the graphs above, AMD claims that the performance hit from enabling A-Sync, measured against running with V-Sync turned off, is smaller and more consistent than with a similar G-Sync configuration, with negligible performance losses with A-Sync enabled. But that’s not the whole story: while there is a bigger hit on the G-Sync graph, AMD hasn’t misrepresented Nvidia’s performance here. Unlike the large variations with V-Sync on, G-Sync tries to smooth things out and optimise for varying workloads.
That’s partly a function of Nvidia’s greater control over the G-Sync scaler, and also their tighter integration with monitor vendors to tune displays properly for a better experience. I’ve said before that while A-Sync will be the better option overall across multiple GPU vendors, G-Sync will continue to be the superior solution because of Nvidia’s deep dedication to making their experience definitively better. Merely making variable refresh rates work with little overhead is an admirable goal, but it counts for little if it doesn’t also improve the experience overall.
Upcoming FreeSync-enabled monitors
| Manufacturer | Model | Size | Resolution | Min Refresh | Max Refresh | Panel Type |
|---|---|---|---|---|---|---|
| Acer | XG270HU | 27-inch | 2560 x 1440 | 40Hz | 144Hz | TN |
| BenQ | XL2730Z | 27-inch | 2560 x 1440 | 40Hz | 144Hz | TN |
| LG Electronics | 29UM67 | 29-inch | 2560 x 1080 | 48Hz | 75Hz | IPS |
| LG Electronics | 34UM67 | 34-inch | 2560 x 1080 | 48Hz | 75Hz | IPS |
| Nixeus | NX-VUE24 | 24-inch | 1920 x 1080 | 40Hz | 144Hz | TN |
| Samsung | UE590 | 23.6-inch | 3840 x 2160 | N/A | 60Hz | TN |
| Samsung | UE590 | 28-inch | 3840 x 2160 | N/A | 60Hz | TN |
| Samsung | UE850 | 23.6-inch | 3840 x 2160 | N/A | 60Hz | TN |
| Samsung | UE850 | 28-inch | 3840 x 2160 | N/A | 60Hz | TN |
| Samsung | UE850 | 31.5-inch | 3840 x 2160 | N/A | 60Hz | TN |
| Viewsonic | VX2701MH | 27-inch | 1920 x 1080 | 40Hz | 144Hz | TN |
Right off the bat, FreeSync launches with eleven compatible displays of varying sizes. It’s great to see that only two of them are 1080p displays, and those two also sport some of the highest refresh rates and the largest range for A-Sync to work in. The N/A minimum refresh for the UHD 4K Samsung displays just means that, for now, we have no idea what the minimums will be. I’d like to say 30Hz, as that was the minimum refresh rate for UHD 4K displays back when no scalers supported the resolution natively and monitor manufacturers had to use tiling to reach it, but I’m not 100% certain at this point.
The minimum refresh rate officially supported by A-Sync-compatible scalers is just 9Hz, with a ceiling of 240Hz, but there are limitations to that. More specifically, the panel in use needs to be able to display an image at such a low refresh rate without the picture quality degrading.
This is one of the reasons why the IPS panels in this list top out at 75Hz and have a minimum refresh rate of 48Hz: going any higher increases the chances of overshoot artifacts, while going lower increases the chances of ghosting and a loss in colour and brightness. If those displays were made to support a lower refresh of 30Hz, you’d see a dip in contrast ratio, colour accuracy and brightness, because the display would spend more time off than on between refreshes.
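As a quick back-of-the-envelope illustration (my own arithmetic, not an AMD figure), a refresh range translates directly into a frame-time window the GPU has to stay inside, which shows why a 48–75Hz panel is so much harder to satisfy than the scaler spec’s theoretical extremes:

```python
# Convert a monitor's variable refresh range into the frame-time window
# (in milliseconds) the GPU must stay inside for adaptive sync to engage.

def frame_time_window_ms(min_hz, max_hz):
    """Return (shortest, longest) allowed frame times in milliseconds."""
    return 1000 / max_hz, 1000 / min_hz

# The scaler spec's theoretical extremes: 9Hz to 240Hz.
print(frame_time_window_ms(9, 240))   # roughly 4.2ms up to 111.1ms per frame

# A 48-75Hz IPS panel leaves a far narrower window to hit.
print(frame_time_window_ms(48, 75))   # roughly 13.3ms up to 20.8ms per frame
```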
This behaviour, then, introduces a delicate dance for those of you wanting to buy a monitor supporting Adaptive Sync. You could choose one with a higher resolution than you would normally use, and then optimise each game so that the framerate never dips below the minimum supported rate. Alternatively, you could buy a monitor with a lower resolution (say, 1080p) and increase graphics settings and fidelity until your average framerate drops into the supported zone.
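That dance boils down to a simple range check. Here’s a hypothetical helper (the function name and example framerates are mine, for illustration) for testing whether a game’s observed framerate swing fits a monitor’s adaptive sync window:

```python
# Hypothetical helper: does a game's framerate range fit a panel's
# adaptive sync window? Outside the window the display falls back to
# fixed-refresh behaviour, so you lose the benefit of variable refresh.

def fits_vrr_window(min_fps, max_fps, panel_min_hz, panel_max_hz):
    """True if every observed framerate falls inside the panel's range."""
    return min_fps >= panel_min_hz and max_fps <= panel_max_hz

# A game swinging between 45 and 90fps on a 40-144Hz TN panel: fine.
print(fits_vrr_window(45, 90, 40, 144))   # True
# The same game on a 48-75Hz IPS panel misses the window at both ends.
print(fits_vrr_window(45, 90, 48, 75))    # False
```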
For now, I think a lot of people would be happy buying something like a Radeon R9 290 and one of the 2560 x 1440 displays. Don’t buy the IPS panels as things stand today, as the supported FPS range is really small and hard to hit reliably with a mid-range GPU. Wait for the second generation of these monitors, which should have slightly faster S-IPS or AHVA panels and a ceiling of 120Hz.
Using any FreeSync-compatible display, though, requires a compatible GPU, and your options there are quite limited. Only the GPUs in this list are compatible with Adaptive Sync, and an even smaller pool of options exists in the mobile world, which is basically limited to the Kaveri mobile APUs, some Kabini APUs and discrete mobile GPUs based on AMD’s Mars architecture. That sucks, but that’s how the ball has rolled so far.
However, if you already have one of the laptops that supports this technology, you’re in for a treat. Not only could you already have a compatible scaler that works with the Adaptive Sync standard, but you could also benefit from battery life improvements as a result of running the display at a lower refresh rate. Intel will probably exploit A-Sync for this exact reason pretty soon, while Nvidia has been playing around with the idea of running games locked to 30Hz on battery power for quite a while.
In a nutshell, this is what it all boils down to. FreeSync and G-Sync are similar in that they both require new hardware: a monitor and, quite possibly, a GPU upgrade. That raises the price of adoption, because there’s very little chance that you have both components on hand right now. G-Sync monitors have been around for just over a year, but they carry a hefty price premium as a result of the custom scaler inside, while the GPU needs to be a GeForce GTX 650 Ti Boost or newer to take advantage of the tech. I can hear the collective “meh” you’re all groaning now, but this is the teething stage of a new display technology that will be commonplace in the future.
Will it help? Will it make PC gaming better? That’s debatable, and there are valid reasons both for and against VRR technology. It will require driver optimisation to make the experience better, and better application support to do some of the really neat things that hardcore enthusiasts will appreciate, like an actual 23.976Hz mode for watching movie content, or a 30Hz mode to save power. There are so many applications for this kind of thing that could benefit us, but general adoption has to come first before support improves: a chicken-and-egg scenario, if you will.
We can’t just have one and expect the other to manifest itself; we need both.