Ever since the reveal of Nvidia’s GameWorks suite, a collection of software libraries that makes Nvidia’s rendering tricks available to developers without much integration hassle, articles have been popping up on the net with AMD and AMD fans decrying the inclusion of GameWorks in some titles as “unethical” and “corrupt.” GameWorks gets blamed for everything from unoptimised drivers to generally poor game performance, and while there is some truth to parts of the argument against it, there’s nothing concrete at the moment that spells out GameWorks as the reason for AMD’s poorer showing in recent game releases. With the release of The Witcher 3: Wild Hunt stoking the fires of this argument once more, let’s look at what seems to be happening and what the internet claims is happening.

The Witcher 3: Wild Hunt

Nvidia has been coming under fire from gamers recently over claims that the HairWorks API, which is included in both The Witcher 3 and Far Cry 4, reduces performance on competing graphics cards from AMD’s stable and is generally a blight on the gaming industry. The argument gained some credence when CD Projekt Red’s Marcin Momot posted the following statement to Witcher 3 forum users who were investigating the issue with their own performance testing:

“Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.”

The specific point about AMD not being able to optimise GameWorks code has been made before by AMD and various industry sources, but proof for these claims still doesn’t exist. Do I believe them? Sure. Nvidia hasn’t been in the habit of sharing proprietary code with AMD before, so why would they start now? The statement understandably got the internet and Witcher 3 players on AMD systems up in arms, but consider for a moment what primarily drives HairWorks: tessellation. The hair on the beasts found in the world of The Witcher 3 is generated from a tessellated 3D model, something that both AMD and Nvidia have handled more or less decently since the introduction of their GCN and Kepler architectures respectively. It makes little sense for Nvidia to deliberately cripple tessellation performance in one game on AMD’s hardware but not their own, particularly because it would be very obvious if they did. AMD did something similar with Global Illumination in DiRT: Showdown, and that kicked up just as much fuss.

Adding fuel to the fire, this morning I read a headline article from PC Perspective in which editor Ryan Shrout asked Nvidia’s Brian Burke, one of the GameWorks developers, to clarify some of the lingering issues confusing people. Instead of clearing everything up, Burke’s comments, copied verbatim below, have only made the situation muddier:

“We are not asking game developers do anything unethical. GameWorks improves the visual quality of games running on GeForce for our customers. It does not impair performance on competing hardware.

Demanding source code access to all our cool technology is an attempt to deflect their performance issues. Giving away your IP, your source code, is uncommon for anyone in the industry, including middleware providers and game developers. Most of the time we optimize games based on binary builds, not source code. GameWorks licenses follow standard industry practice. GameWorks source code is provided to developers that request it under license, but they can’t redistribute our source code to anyone who does not have a license.

The bottom line is AMD’s tessellation performance is not very good and there is not a lot NVIDIA can/should do about it. Using DX11 tessellation has sound technical reasoning behind it, it helps to keep the GPU memory footprint small so multiple characters can use hair and fur at the same time.

I believe it is a resource issue. NVIDIA spent a lot of artist and engineering resources to help make Witcher 3 better. I would assume that AMD could have done the same thing because our agreements with developers don’t prevent them from working with other IHVs. (See also, Project Cars) I think gamers want better hair, better fur, better lighting, better shadows and better effects in their games. GameWorks gives them that.”

Later on, Burke says that developers could optimise the workload generated by the HairWorks API by lowering the tessellation density in-game when an AMD GPU is detected, but adds that doing so would be unnecessarily complex. It would also, no doubt, invite even more claims of favouritism.
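To make that concrete, here is a minimal, hypothetical sketch of what per-vendor tuning could look like in a Direct3D 11 renderer, using DXGI to read the adapter’s PCI vendor ID. The function name and the tessellation factor values are assumptions for illustration only; HairWorks’ actual integration is proprietary, and nothing like this is confirmed to ship in The Witcher 3.

```cpp
// Hypothetical sketch only: pick a lower tessellation density when an AMD
// GPU is detected. The vendor IDs are standard PCI IDs, but the factor
// values and function name are made up for illustration.
#include <dxgi.h>

#pragma comment(lib, "dxgi.lib")

float ChooseHairTessellationFactor()
{
    float factor = 64.0f;                        // assumed "full quality" default

    IDXGIFactory1* factory = nullptr;
    if (SUCCEEDED(CreateDXGIFactory1(__uuidof(IDXGIFactory1),
                                     reinterpret_cast<void**>(&factory))))
    {
        IDXGIAdapter1* adapter = nullptr;
        if (SUCCEEDED(factory->EnumAdapters1(0, &adapter)))
        {
            DXGI_ADAPTER_DESC1 desc = {};
            adapter->GetDesc1(&desc);
            if (desc.VendorId == 0x1002)         // AMD's PCI vendor ID
                factor = 16.0f;                  // hypothetical reduced density
            // 0x10DE is Nvidia; keep the default for everything else.
            adapter->Release();
        }
        factory->Release();
    }
    return factor;
}
```

It is only a handful of lines, but Burke’s point stands: every per-vendor branch like this is another code path to maintain and test, and an obvious target for accusations of favouritism from whichever side gets the lower setting.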

Some numbers to ponder

Is that the case with The Witcher 3, though? Does enabling HairWorks in-game hurt performance more severely on Radeon GPUs than on GeForce? We can look at some benchmark numbers from PCGames Hardware to see what’s going on, although keep in mind that this is only one data source; more testing by other review sites will be needed to confirm that this holds.


Source: PCGames Hardware

Because the source is in German, Google Translate makes a bit of a hash of the labels. There are two settings for HairWorks aside from the obvious “Off” switch: “On”, which only affects Geralt’s hair, and “Full”, which enables HairWorks for everything. In the graph above, “From” corresponds to HairWorks set to “Off” and “To” corresponds to “Full”. The results are not surprising: both Radeon and GeForce GPUs take a punishing hit, but Nvidia’s cards see a smaller performance drop than the competing Radeon parts.

The GTX Titan X drops 20% of its overall average performance with HairWorks turned on, while the Radeon R9 290 loses 24%. Those numbers are roughly comparable, but they don’t really gel with the GTX 970’s drop of just 12%. My thinking is that these losses come down mostly to HairWorks being computationally intensive and to AMD’s history of poor tessellation performance, but I suspect drivers play a part as well.
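For clarity, those percentages are straightforward relative drops: frame rate with HairWorks off minus frame rate with it on, divided by the frame rate with it off. The figures in the quick check below are hypothetical placeholders, not PCGames Hardware’s numbers:

```cpp
#include <cstdio>

// Relative performance drop when a feature is enabled:
// drop = (fps_off - fps_on) / fps_off
static float RelativeDrop(float fpsOff, float fpsOn)
{
    return (fpsOff - fpsOn) / fpsOff;
}

int main()
{
    // Hypothetical values: 50 fps with HairWorks off falling to 40 fps with
    // it on works out to a 20% drop, the same order as the Titan X result.
    std::printf("%.0f%% drop\n", RelativeDrop(50.0f, 40.0f) * 100.0f);
    return 0;
}
```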

A history of low tessellation performance

Cast your mind back to Crysis 2’s launch on the PC platform. The game shipped with only a DirectX 9.0c render path, because Crytek said their DX11 implementation wasn’t ready and that they were working with AMD and Nvidia to improve performance and get things working properly. When the DX11 patch finally arrived, it included some welcome improvements: increased geometry on the in-game models, improved textures that looked fantastic at 2560 x 1440, some optimisations that improved performance, and genuinely incredible water effects. But for some reason the patch tanked performance on Radeon GPUs when specific water features were enabled, while Nvidia GPUs were affected too, though never to the same degree.

Well, there was a good reason for that.


Part of the DX11 patch for Crysis 2 was a new tessellation model that changed the game’s look dramatically. However, it also did some very weird things. One of them was simulating an entire tessellated, flowing, ebbing body of water underneath the city wherever a level contained standing water; even an ankle-deep puddle was powered by this massive tessellated ocean. When Tech Report discovered this in 2011, they didn’t know what to blame it on: developer laziness, an engine bug that couldn’t be fixed, or shenanigans by Nvidia, whose Fermi architecture had far superior tessellation performance at the time.

In fact, the higher the polygon counts generated by tessellation, the better Fermi looked relative to the competition. Absolute performance still fell as the workload scaled up, but Fermi handled the extra geometry far better than the competing AMD GPUs available at the time.


There were also highly tessellated objects scattered throughout Crysis 2’s levels. The barriers were highly tessellated. Broken wooden planks and bricks were highly tessellated. Even the leaves of the trees were sometimes tessellated. The copious use of tessellation definitely benefited the game’s graphical fidelity, but it came at the expense of exposing AMD’s weakness in that area. Tech Report’s testing of the impact of Crytek’s tessellation found that AMD GPUs lost between 31% and 39% of their performance, while the GTX 580, at the same 64x tessellation level, lost only around 17% on average relative to its performance with tessellation disabled.

In fact, Catalyst Control Center’s application performance tab has an “AMD Optimised” check box that you can tick for games that use tessellation. Back in 2011, the default was to honour the full 64x tessellation level requested by the application. This was before the days of GeForce Experience and AMD Gaming Evolved, so it took AMD a while to realise that leaving 64x as the driver default was probably a bad idea; today the default is the AMD-optimised setting, which likely caps the level somewhere around 16x. Despite how awesome the Hawaii architecture is, the same tessellation weakness still applies to the latest GCN hardware, and it has only really been addressed in the Radeon R9 285.
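To illustrate what that override amounts to, the sketch below clamps an application’s requested tessellation factor to a driver-style cap before it would reach the hull shader. This is an assumption-laden illustration: the 16x figure is just the commonly cited value, and AMD’s actual in-driver heuristic isn’t public.

```cpp
#include <algorithm>

// Illustrative only: emulate a driver-style tessellation cap. Direct3D 11
// hull shaders accept factors in the range [1, 64]; the "AMD Optimised"
// override effectively lowers that ceiling for applications that ask for
// more than the hardware handles comfortably.
float ApplyTessellationCap(float requestedFactor, float driverCap = 16.0f)
{
    const float hardwareMax = 64.0f;
    return std::clamp(requestedFactor, 1.0f, std::min(driverCap, hardwareMax));
}
```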

Source: Tech Report

Source: Tech Report

Even on today’s cards, AMD’s performance at the 32x tessellation level is still pretty poor compared to what Nvidia is capable of.

In conclusion

So, to the argument that HairWorks is somehow screwing with AMD’s performance in The Witcher 3: there’s currently very little to base that on. Yes, HairWorks leans heavily on tessellation, but I’d posit that some testing needs to be done with the tessellation level override in the drivers to see whether this is Nvidia’s fault, AMD’s fault, or entirely CD Projekt Red’s. The saying goes, “Never attribute to malice that which can equally be attributed to stupidity,” and I think that may be the case here. Until AMD ships a driver patched for The Witcher 3, its performance problems may just be of its own making, and Nvidia would share none of the blame for them.

What do you think? The Witcher 3 unlocks on Steam in just a few hours. Do some testing of your own once you’ve had a chance to fiddle with the game’s settings, and report back in the comments. While I can certainly agree that GameWorks gives Nvidia a silly amount of leverage over AMD’s chances of optimising properly for it, I’m quite sure that this isn’t the real reason behind the performance issues here.
