
Let's Discuss: Variable Refresh Rates

Game tearing without V-Sync.

Lag. Tearing. Stutter. Frame drops. Spliced frames. Missing frames. V-Sync frame drops. If any of these things have annoyed you at one point or another, then you must be a gamer! All of these issues are common on the PC platform and, to a certain degree, also appear on hardware-locked consoles. They all share a single root cause – something that, as a gamer, you've come to both love and loathe – refresh rates. If we're to solve these issues, the mindset of monitor and GPU manufacturers, as well as game designers, needs to change when it comes to refresh rates. We're well into 2014 now and we're beginning to see this happen, but what still needs to be done, and why?

Issues that gamers face today

First off, because I've never discussed variable refresh rates on NAG Online in any great detail, let's go over the issues I'll be looking at in this discussion (which will more than likely turn out to be a serious bit of reading). Because I don't do FCAT or FRAPS testing, and don't have the hardware or data to draw my own analysis from, I'll be drawing some conclusions from the work of others, notably PC Perspective, AnandTech and The Tech Report. These sites have writers who have delved deeply into frame variance issues and could, at this point, be considered the authority on what happens and why.

So, to the issues at hand:

Frame drops/Dropped frames - This occurs when the GPU's onboard buffer receives a finished frame too late in the process to be included in the current monitor refresh cycle. The frame is instead dropped and swapped for another at the start of the next refresh cycle, the contents of which may be completely different. Frame drops happen in single- and multi-card setups and in single- and multi-monitor rigs. In the picture below of Bioshock Infinite, PC Perspective's testing revealed that every second frame was being dropped, resulting in perceived lag and animation stutter. They can infer this through the use of FCAT, which colour-codes frames as they're received from the game engine – in this snippet, lime, red, navy, aqua and silver are missing from the usual colour cycle that FCAT uses.


A series of frames from Bioshock Infinite, with missing colours indicating that frames are being dropped.
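To make the detection idea concrete, here's a minimal sketch of how dropped frames can be counted from an FCAT-style overlay. The colour order and function names are my own illustrative assumptions, not FCAT's actual internals:

```python
# A rough sketch of how dropped frames are inferred from an FCAT-style
# colour overlay. FCAT tags each frame the engine submits with the next
# colour in a fixed 16-colour cycle; if the colour captured on screen
# skips ahead in that cycle, the skipped frames never made it to the panel.
# This colour order is illustrative - the real FCAT sequence may differ.

FCAT_CYCLE = [
    "white", "lime", "blue", "red", "teal", "navy", "green", "aqua",
    "dark_red", "grey", "yellow", "fuchsia", "olive", "silver", "purple",
    "maroon",
]

def count_dropped(captured):
    """Count frames missing between consecutively captured overlay colours."""
    dropped = 0
    for prev, curr in zip(captured, captured[1:]):
        step = (FCAT_CYCLE.index(curr) - FCAT_CYCLE.index(prev)) % len(FCAT_CYCLE)
        dropped += step - 1  # a step of 1 is normal; anything larger means drops
    return dropped

# Every second frame dropped, as in the Bioshock Infinite capture above:
print(count_dropped(["white", "blue", "teal", "green"]))  # -> 3
```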

Runt frames - This occurs in games running on two or more graphics cards in SLI or Crossfire, and it isn't restricted to either setup – it happens to both Nvidia and AMD and is game-dependent, despite a fix being possible through driver optimisation. To produce a runt frame, the master GPU sends its finished frame out to your monitor, which may be in the middle or at the start of a panel refresh. The secondary GPU then finishes its frame and sends it through, but for whatever reason it wasn't ready in time. That frame is still injected into the stream, but is only a few pixels high – enough to be counted by FRAPS in frame rate averages and artificially boost the scores.

But that's only a symptom of the underlying cause, which is that only one GPU's work is being seen on the screen, meaning that Crossfire is effectively no faster than a single card running at the same settings. It's worth noting that this was an issue on AMD's cards right up to the HD 7000 series, while it was recently solved in Nvidia's Kepler family thanks to a combination of hardware and software frame-pacing methods.


A runt frame in Battlefield 3 is noted by FCAT to the left of the image as a small silver-coloured sliver.
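As a rough illustration of why runts inflate benchmark numbers, here's a short sketch that filters them out of a frame-height capture and rescales the average. The 21-scanline cutoff echoes the threshold PC Perspective has described, though that value and the helper names here are assumptions:

```python
# Filter runt frames out of an FCAT-style capture of per-frame scanline
# heights, then rescale a FRAPS-style average to count only visible frames.
# The 21-scanline threshold is an assumption based on PC Perspective's work.

RUNT_THRESHOLD = 21  # frames shorter than this contribute no visible detail

def observed_fps(frame_heights, reported_fps):
    """Average frame rate once runts are discarded."""
    visible = [h for h in frame_heights if h >= RUNT_THRESHOLD]
    return reported_fps * len(visible) / len(frame_heights)

# If half the frames in a Crossfire capture are slivers a few pixels high,
# a reported 90fps is really only 45fps of visible animation:
heights = [540, 4, 542, 3, 538, 5]
print(observed_fps(heights, 90))  # -> 45.0
```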

Stepped tearing - A phrase coined by PC Perspective, this is an anomaly that almost exclusively happens in AMD Crossfire setups and in Eyefinity. Stepped tearing occurs when two or more frames are blended into each other: in a Crossfire setup, the master GPU needs to receive a copy of the results from the second GPU's memory buffer in order to display the frame that GPU worked on. When the second GPU's buffer is slightly slower to update than usual, the master GPU begins to interleave one frame with another to produce a full image. When the buffer speeds up again, the master GPU flips over to using the new information instead, and this creates a see-sawing effect in how the frame is put together.

In the picture below, you'll notice that not only is the interleaving done at odd intervals that don't match up with the original frame, it is also a little skewed, indicating the points at which the GPU buffers either sped up or slowed down as each line was drawn. AMD hasn't fixed frame pacing in multi-monitor Crossfire setups, and this remains an issue today.


An example of stepped tearing in Bioshock Infinite.

Frame interleaving - This issue is related to runt frames and is caused by the master GPU in a multi-monitor Crossfire setup not receiving copies of the finished frame from the second GPU on time. The result is a see-saw effect in which two frames are shoved into the space of a single frame, with some parts mismatched because the timing of the frames is out of order. This was also an issue with the Radeon HD 4000, HD 5000 and HD 6000 cards in 2010, when AMD botched some optimisations for OpenGL games, resulting in widespread tearing and interleaving. AMD says that this is fixable in their drivers, but to date has not offered a deadline for a solution.


A clear example of frame interleaving in Bioshock Infinite.

The cause: Static refresh rates

Ever since computer monitors were invented, they've run at set refresh rates for consistency and performance. The earliest monitors based on Cathode Ray Tube (CRT) technology were set to match the frequency of the country's electrical power grid to avoid spikes and ripples in the set's output. This meant that TV sets in the US ran at 60Hz, while European TV sets predominantly ran at 50Hz. This was the cheapest and simplest way of getting things going, and over time the mindset of creating programming for these specific refresh rates stuck. For a frame of reference (pardon the pun), 60Hz and 50Hz refresh rates have been a standard since the late 1920s.

But this same issue isn't present in LCD monitors – or, at least, it doesn't have to be. You don't need to match refresh rates to the power grid to minimise noise, because an LCD doesn't use hardware that is susceptible to ripple and noise in the same way. You don't even need to refresh an image at all if its contents are still static compared to the last frame. The hardware industry has produced an incredible number of fast GPUs that can easily spit out frames faster than most monitors can display them, and yet we're still stuck in our old ways, churning out 60Hz monitors that flood the market and leave people thinking this is their only option.


In a similar vein, film has been shot at 24 frames per second because it was the best compromise between fluidity, film quality and reel length. Since the digital era began, we've been trying to achieve the same kind of mesmerising perfection, but most TVs and graphics cards simply don't match up well enough to properly support the 23.976Hz mode that best mimics proper film playback. AnandTech has been documenting this at length for a while now, and only in the last three years has anyone made strides to properly fix it.
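If you're wondering where that odd 23.976 figure comes from, it's a legacy of NTSC timing, which runs 0.1% slow; film's 24fps is scaled by 1000/1001 to stay in step. A quick sanity check:

```python
# NTSC colour broadcasts run at 60000/1001 fields per second (59.94Hz),
# so 24fps film is slowed by the same 1000/1001 factor to stay in sync.
print(24 * 1000 / 1001)  # -> 23.976023976... fps
print(60 * 1000 / 1001)  # -> 59.94005994... Hz
```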

Simply put, we're trying to put analogue and digital signals on the same playing field and give them the same set of rules, but the reality is that digital plays a whole different ballgame. What we've been doing for years is akin to forcing a left-handed child to write with their right hand – it can be done, but letting the kid do it their own way is a much better solution.

The solution: Variable Refresh Rates

Instead of trying to make the GPU's workload fit into something the monitor can work with, why not make it work the other way around? Let the GPU handle the drawing of frames and let it dictate how the monitor should perform to best deliver them in a smooth, artifact-free fashion. Let the monitor support any refresh rate from 23.976Hz all the way up to 144Hz, and everything in between. The result is a panel that can adapt to any situation and doesn't hamper the experience in any way.
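Conceptually, the scheduling logic is simple. Here's a minimal sketch, assuming a panel with a 23.976–144Hz range; the function and variable names are mine, not any vendor's API:

```python
# A toy model of variable-refresh scheduling: the panel scans out a frame
# the moment it's ready, as long as it isn't still inside the minimum
# refresh interval. All names and behaviour here are illustrative.

MIN_INTERVAL = 1 / 144     # fastest the panel can refresh (144Hz)
MAX_INTERVAL = 1 / 23.976  # slowest before the panel must self-refresh

def next_scanout(last_scanout, frame_ready):
    """When the panel displays the new frame."""
    earliest = last_scanout + MIN_INTERVAL
    # If the GPU takes longer than MAX_INTERVAL, a real panel would
    # re-display the previous frame in the meantime; we ignore that here.
    return max(frame_ready, earliest)

# A frame that takes 25ms to render is shown at exactly 25ms (40Hz-style),
# rather than waiting for the next fixed 16.7ms boundary at 33.3ms:
print(next_scanout(0.0, 0.025))  # -> 0.025
```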

This idea may take some time to really gain traction, though it's already in use in a small number of cases. The Pentile and RGB screens on Samsung's mobile phones and tablets technically already support variable refresh rates, but it's up to the hardware and the Android OS to properly support this. It's similar to how e-readers work: Amazon's Kindle, for example, only draws a new frame when there is new data to display, which is how it achieves longer run times and better battery life than LCD-equipped tablets (like the Kindle Fire).

Nvidia GeForce G-Sync

Nvidia's solution is G-Sync, a customised scaler for monitors with panels that can support variable refresh rates but don't currently do so. The G-Sync scaler can add anything from $100 to $199 to the final price of a monitor, and is also sold as a DIY hobbyist kit for brave souls, but for the moment it's limited to a single monitor, ASUS's VG248QE. G-Sync is financially backed by Nvidia and is a proprietary solution, which means that support will not come to AMD or Intel GPUs on the desktop.

For G-Sync to work, you must have V-Sync enabled, either globally in the drivers or in individual applications. Regular V-Sync works by having the monitor tell the graphics card when it's ready for a new frame, requiring the GPU to hold each finished frame in its buffer until the next refresh begins, which causes a slight amount of input lag. This isn't especially important in some games, but in twitch shooters it can mean the difference between dodging a crack sniper and seeing your mangled corpse of polygons lying defeated on the ground.
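To put a number on that lag, here's a small sketch of the V-Sync wait on a fixed 60Hz panel. The timing model is deliberately simplified and the names are mine:

```python
# Why V-Sync adds input lag on a fixed 60Hz panel: a finished frame sits
# in the buffer until the next vertical blank, so the input it was drawn
# from grows stale. A deliberately simplified model.

REFRESH = 1 / 60  # one refresh every ~16.7ms

def vsync_scanout(frame_done, last_vblank):
    """The time at which a finished frame actually reaches the screen."""
    vblanks = int((frame_done - last_vblank) / REFRESH) + 1
    return last_vblank + vblanks * REFRESH

# A frame finished 2ms after a vblank still waits another ~14.7ms:
lag = vsync_scanout(0.002, 0.0) - 0.002
print(round(lag * 1000, 1))  # -> 14.7 (milliseconds of added latency)
```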

What this means is that Nvidia's solution is completely under its control, but it's merely a band-aid over an issue that affects the rest of the industry. The hardware is good, make no mistake, but it's a solution that only benefits Nvidia and, at least for now, serves a niche market of people who can afford the super-expensive scaler chip.

Variable VBLANK and DisplayPort 1.3

Intel and AMD, on the other hand, already include support for changing the VBLANK interval, and have been technically capable of it for some time. The VBLANK interval is the pause between the end of one frame's scan-out and the start of the next; extending it lets the monitor wait until a new frame is ready for delivery. AMD calls its implementation FreeSync, owing to the fact that no customised hardware is required to support variable VBLANK intervals. Altering VBLANK intervals is something that monitors conforming to the new VESA DisplayPort 1.3 standard will be required to support and, happily, AMD says it has been able to do this since the HD 5000 series.
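For a sense of the mechanics: a display timing is a fixed number of active and blanking scanlines drawn at a constant pixel clock, so padding the blanking period stretches the refresh. A hedged sketch with illustrative timing numbers (not any real monitor's EDID):

```python
# How extending VBLANK stretches a refresh: total frame time is
# (active lines + blanking lines) * line time at a fixed pixel clock.
# The 1125-line total for 1080p60 is standard; the rest is illustrative.

ACTIVE, BLANK = 1080, 45                  # 1125 total lines at a 60Hz base timing
LINE_TIME = 1 / ((ACTIVE + BLANK) * 60)   # seconds per scanline

def refresh_rate(extra_vblank_lines):
    """Effective refresh rate after padding VBLANK by N extra lines."""
    return 1 / ((ACTIVE + BLANK + extra_vblank_lines) * LINE_TIME)

print(refresh_rate(0))     # -> 60.0 Hz (no padding)
print(refresh_rate(1125))  # -> 30.0 Hz (doubling total lines halves the rate)
```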

FreeSync has its own issues, however. You'll almost certainly need a new monitor for it to work (as is the case with G-Sync), and there is a slight amount of overhead in the process of telling the monitor when a new frame is ready – just as G-Sync sees slightly lower average frame rates as a result of its computing overhead. To AMD and Intel's benefit, however, this is much easier to achieve in the notebook and tablet space, and Nvidia does not yet support variable refresh rates in its Tegra hardware. That will change with time, but for now, on the desktop, it's a costly endeavour for GeForce fans who want it.

Adoption more rapid than you’d expect

DP 1.3-compliant monitors won't hit the market or trade shows until much later this year, and G-Sync itself is seeing slow adoption, although it could gain traction much more quickly because of the financial push from Nvidia. If your existing monitors only use DVI, VGA or HDMI, you're out of luck – you'll either be at the mercy of Nvidia, Intel and AMD's drivers, or you'll find yourself tweaking your games to find the right balance between smoothness and playability.

Variable refresh rates could also see increased adoption as the industry begins the slow haul over to UltraHD 4K displays. These displays require a lot of GPU muscle – even a GTX Titan typically runs games at maximum settings at sub-40fps – so variable refresh would be a perfect fit, keeping the experience consistent despite the move to a much higher resolution. Games wouldn't need as much optimisation to achieve playability, and players could also opt to drop detail settings to enjoy faster frame rates and more fluidity.


Need for Speed: Shift‘s implementation of motion blur is particularly annoying.

This will also mean the eventual death of motion blur, which I've always disabled in games that don't need it. It's been used in the past to trick you into thinking that gameplay is smooth, but it's also been used to cover up aliasing, poor-quality textures (EA Black Box and NFS: Carbon were guilty of this) or crushed colours due to poor design. Motion blur has its place in a few scenarios, but I find that it sometimes reduces my accuracy in games and hides details that I could see with it disabled.

Game developers who make terribly-performing titles will also have a bit of a lifeline thanks to variable refresh rates, and games will no longer need to be designed in the same manner as Skyrim, which requires V-Sync to be on at all times for its in-game physics to work. Conversely, if you're a developer and your game handles poorly despite G-Sync or FreeSync, there's little you can do to cover it up.

It's going to be a good thing for everyone involved, and I intend to purchase at least one FreeSync-capable monitor to replace my main one for work and gaming. 2014 is an exciting time to be a gamer and a hardware enthusiast, and I hope that both Nvidia and AMD stick to their plans of bringing their solutions to market. If it means I'll also be able to watch The Hobbit in 48p without frame drops, I'll be happier than a pig in mud.

Sources: PC Perspective, AnandTech, The Tech Report

Discuss this in the forums: Linky

  • Jordan

    Nice article. Pity it doesn’t really appeal to me and my craptop. Still, nicely done

    • Wesley Fick

      Yeah, but it could help other craptops one day!

  • Michael Bouwer

    Flippen great article!
