This opinion column was sparked by my recent testing of an Acer Predator G9 gaming notebook, a beast of a machine that will be featured in a review hopefully before the end of next week. It’s a fantastic system for gaming, but it features NVIDIA’s Optimus switching technology, which allows the operating system to alternate between the integrated graphics on the CPU and the discrete GPU, picking whichever suits the application being run. This works fairly well for most things, and I never ran into a problem running the games themselves… but the benchmarks? Those were a headache, and I wish that weren’t the case.

Optimus: The beginning

[Image: NVIDIA Optimus overview diagram]

When NVIDIA first announced Optimus, it was part of the NVIDIA ION platform, designed for netbooks whose Intel Atom processors had poor integrated graphics performance in GPU-heavy applications. Prior to this, systems with switchable graphics were available in the mid-range and high-end notebook markets, but it was a pain in the butt to make the switch between discrete and integrated graphics. Some of these systems had a hardware switch that needed to be pressed, followed by a reboot; others let you set which GPU you’d use in the notebook’s BIOS before booting into the operating system.

As it happens, that whole “reboot to the discrete GPU” process is still with us today in several products, notably the MSI GS30 2M Shadow with the MSI Gaming Dock, and others like it (anyone remember the Fujitsu GraphicBooster?). Because the GS30 exposes the dock’s internal hardware through a PCI-Express switch instead of Thunderbolt, you have to shut the notebook down before docking or undocking it. The PCI-Express standard does allow for hot-plugging of add-in cards, but many desktop motherboards don’t support it well, and laptops generally aren’t designed for it (that’s what Thunderbolt is for). It’s a feature relegated to high-end servers and very, very expensive workstation motherboards that have had significant engineering effort put into them.

Optimus promised to fix this by being a middle-man of sorts. Instead of a hardware or software switch letting the discrete GPU talk directly to the monitor, the discrete GPU copies its rendered frames into the Intel integrated graphics’ framebuffer, and the Intel display engine feeds them to the display. That allows the discrete card to be shut off completely when it’s not in use, or when the system is in a low-power state. AMD has had a similar scheme in place for a while, and some of this technology appears in its desktop GPUs in the form of ZeroCore: if you have two Radeon GPUs in CrossFire, the second one powers down when not in use, because all the displays are hooked up to the first card, which handles the display output to give us lovely moving, coloured pixels.
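
You can actually see this arrangement from software. Here’s a minimal C++ sketch (assuming a Windows machine with the Windows SDK installed) that enumerates every graphics adapter through DXGI and counts the displays wired to each one; on a typical Optimus laptop, the GeForce adapter reports zero outputs, because the panel is physically connected to the Intel GPU’s display engine:

```cpp
// List every graphics adapter and count the displays wired to it.
// On a typical Optimus laptop the GeForce adapter reports zero
// outputs, because the panel hangs off the Intel GPU's display engine.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1),
                                  reinterpret_cast<void**>(&factory))))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);

        // Count the outputs (displays) attached to this adapter.
        UINT outputs = 0;
        IDXGIOutput* output = nullptr;
        while (adapter->EnumOutputs(outputs, &output) != DXGI_ERROR_NOT_FOUND) {
            output->Release();
            ++outputs;
        }

        wprintf(L"Adapter %u: %s, %u output(s)\n", i, desc.Description, outputs);
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```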

On the notebook side this works well for many applications, and it’s been one of the reasons why Intel and NVIDIA have dominated the gaming laptop market for the past five years, but it’s not without its quirks, as users have since discovered.

[Image: Acer Predator G9]

The Predator G9-791 features an Intel Skylake Core i7-6700HQ processor alongside NVIDIA’s GeForce GTX 980M with 4GB of GDDR5 memory. It runs Windows 10, and when I set everything up for my tests I updated the OS to the latest available build, 10586.104 (which Microsoft also calls version 1511). This was also a clean install with the latest drivers, GeForce 361.91 WHQL.

[Image: SteamVR Performance Test]

My first issue was with Civilization: Beyond Earth. Launching the benchmark should normally result in a dark grey display slowly shifting to bright white before dropping you into the test map. On the Predator G9 the benchmark didn’t launch at all, even though the game itself ran just fine. Disabling the GTX 980M didn’t help, and explicitly setting the GTX 980M as the preferred card in the drivers, even going so far as to uninstall the drivers for the Intel graphics, didn’t change a thing.

I’m pretty sure that in this case the game wasn’t “hooking” itself to a specific GPU, and thus got stuck in a never-ending loading process because it didn’t know which one to choose. The same behaviour appeared in Thief, where launching the benchmark with all settings at default ended with a black screen and a looping loading cursor. Once again, changing the preferred GPU in the GeForce drivers didn’t fix it, and both games ran just fine when not in benchmark mode. I didn’t test whether assigning a different driver profile to these games, or renaming their executables, would have worked, but it’s safe to assume that doing so would have introduced other anomalies into my benchmarks.
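
For what it’s worth, NVIDIA does document a way for developers to do that “hooking” explicitly: an executable can export a magic symbol that tells the driver to route its rendering to the high-performance GPU, bypassing the profile heuristics entirely (this comes from NVIDIA’s “Optimus Rendering Policies” application note; the AMD export shown alongside it is the Enduro/PowerXpress equivalent). A minimal sketch:

```cpp
// Exporting these symbols from a game's .exe asks the driver for the
// high-performance GPU up front, instead of relying on profiles or
// heuristics. The NVIDIA export is documented in the "Optimus
// Rendering Policies" note; the AMD one is its Enduro counterpart.
#include <windows.h>

extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}
```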

It got weirder with Dragon Age: Inquisition, which did something different that I discovered by accident after forgetting to turn the Intel HD 530 graphics back on. Disabling or uninstalling the drivers for the Intel graphics reduced performance to a slideshow, and the game wouldn’t detect the GTX 980M, but it wouldn’t kick up a fuss about it either. In a similar vein, Valve’s SteamVR performance benchmark passed with flying colours and gave me good scores with Optimus on, but at the end of the run it lowered its recommendation because all it could detect was the Intel HD 530 graphics, not the GTX 980M. “Upgrade your GPU for a better score” has never been funnier to read than when SteamVR’s benchmark thinks the work was all done by an Intel GPU.
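
My guess, and it is only a guess, is that SteamVR identifies the GPU by asking Windows which adapter drives the primary display, and on an Optimus system that answer is always the Intel chip, even while the GeForce hardware does all the rendering. A small sketch of what such a detection routine might look like:

```cpp
// Ask Windows which adapter drives the primary display. On an Optimus
// machine this names the integrated GPU, which would explain a benchmark
// rendering on the GTX 980M but reporting an Intel HD 530.
#include <windows.h>
#include <cstdio>

int main() {
    DISPLAY_DEVICEW dd = {};
    dd.cb = sizeof(dd);
    for (DWORD i = 0; EnumDisplayDevicesW(nullptr, i, &dd, 0); ++i) {
        if (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
            wprintf(L"Primary display adapter: %s\n", dd.DeviceString);
    }
    return 0;
}
```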

Interestingly, Middle-Earth: Shadow of Mordor allowed me to specify which GPU I’d like to use before launching a benchmark, and actually honoured my selection properly. It would be really nice to see more games do the same thing.
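
Under Direct3D 11, that kind of selection can boil down to creating the device on the adapter the user picked instead of the system default. This is a sketch of the idea, not Shadow of Mordor’s actual code; the index parameter is hypothetical and would come from a launcher drop-down:

```cpp
// Create the Direct3D 11 device on an explicitly chosen adapter rather
// than the system default. 'index' is hypothetical; a real game would
// fill it from the launcher's GPU drop-down.
#include <d3d11.h>
#include <dxgi.h>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "dxgi.lib")

ID3D11Device* CreateDeviceOnAdapter(UINT index) {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1),
                                  reinterpret_cast<void**>(&factory))))
        return nullptr;

    ID3D11Device* device = nullptr;
    IDXGIAdapter1* adapter = nullptr;
    if (SUCCEEDED(factory->EnumAdapters1(index, &adapter))) {
        // An explicit adapter requires D3D_DRIVER_TYPE_UNKNOWN.
        D3D11CreateDevice(adapter, D3D_DRIVER_TYPE_UNKNOWN, nullptr, 0,
                          nullptr, 0, D3D11_SDK_VERSION,
                          &device, nullptr, nullptr);
        adapter->Release();
    }
    factory->Release();
    return device; // nullptr if creation failed
}
```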

Acer’s partial workaround

Included in the software that Acer ships with the notebook, called Predator Sense, there’s a hotkey function that resolves the issues in some games, but not others. Assigning the function to a button gives you a GPU switch, which sets the mode for how GPU switching is handled. By default, Optimus and the GeForce driver use a combination of heuristics and data gathered through GeForce Experience to determine which applications need the discrete GPU, and which ones don’t.

[Image: Predator Sense discrete GPU switch]

Toggling the hotkey alternates between the default mode and another mode which appears similar to the now-deprecated “fixed-mode switchable graphics”, where the OS tells applications to use the discrete GPU explicitly. This solves the black screen in Thief’s benchmark mode, but it doesn’t change anything for SteamVR’s benchmark or Beyond Earth. I didn’t see anything similar in the Predator G9’s BIOS when I was fiddling around in there, so this must be some kind of software switch Acer has created that mostly mimics a hardware one.

What should be done

I think notebook manufacturers need to take a long look at what Optimus offers the consumer, and weigh it against the downsides of running only the discrete graphics. I can see the sense in deploying Optimus on systems where battery life is a selling point, and where the integrated graphics is only a little behind the discrete GPU in performance. When both GPUs are using DDR3, it makes even more sense to offer the NVIDIA graphics as an option for applications that benefit from CUDA acceleration, or that run better on the NVIDIA GPU purely because of better drivers.

However, once you have a GeForce GTX 950M or faster paired with GDDR5, I believe the benefits of Optimus are outweighed by the problems it introduces. In the case of my dad’s Lenovo Y50-70 (Intel Core i7-4700MQ with a GTX 960M), merely closing the lid causes stutter in Fallout 4, because the Intel GPU is underclocked when the display is closed. Acer seems to know this is an issue created by design, because it advertises G-Sync compatibility on the Predator G9 only for external displays connected via DisplayPort.

[Image: Microsoft Surface Book]

Some notebook manufacturers are beginning to offer workarounds for this. A few offer a button on the keyboard, or a switch under the service panel, that turns Optimus and the Intel HD graphics off completely, leaving only the discrete GPU in charge of operations. Some vendors hide the option in the BIOS, while others don’t include it at all. Throwing out the old methods isn’t always the best idea.

A prime example of how convoluted designing a product with Optimus can become is the Microsoft Surface Book (featured above in the forefront), particularly the version with discrete NVIDIA graphics in the base. You have to use the unlock button to release the muscle-wire latch before detaching the keyboard, an action which also tells the GeForce drivers to shut down the discrete GPU and hand everything back to the Intel GPU. The muscle wire wasn’t just a “hey, this is cool!” kind of thing – Microsoft needed a locking mechanism that would deter users from simply pulling the tablet off without ejecting the GPU. They had to design a dummy-proof latch to make sure users wouldn’t end up ripping the tablet off in the middle of a Blender render.

Microsoft actually cautioned reviewers about this feature, notifying them that any applications using the discrete graphics at the time should be exited before removing the base, lest they end up with a blue screen of death, hours of lost work, or worse, a broken Windows install.
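
At the API level, that warning is presumably about removed-device handling: when the GPU disappears from under a running application, Direct3D starts reporting DXGI_ERROR_DEVICE_REMOVED, and anything that doesn’t check for it will misbehave or crash. A minimal sketch of the per-frame check a well-behaved renderer performs:

```cpp
// The per-frame check a well-behaved renderer performs: if the GPU was
// ejected (or the driver reset), Present reports a removed device and
// the application must rebuild its GPU resources instead of crashing.
#include <d3d11.h>
#include <dxgi.h>

bool PresentFrame(IDXGISwapChain* swapChain, ID3D11Device* device) {
    HRESULT hr = swapChain->Present(1, 0); // present with v-sync

    if (hr == DXGI_ERROR_DEVICE_REMOVED || hr == DXGI_ERROR_DEVICE_RESET) {
        // The reason code distinguishes a hot-unplug from a hang or a
        // driver update; either way, everything must be recreated.
        HRESULT reason = device->GetDeviceRemovedReason();
        (void)reason;  // a real renderer would log this
        return false;  // tell the caller to tear down and rebuild
    }
    return SUCCEEDED(hr);
}
```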

So the question now is, who should make this call to fix the status quo? Who has the heft to ensure that this becomes a thing?

NVIDIA should be the one to make this change

Hear me out on this, NVIDIA, because I know you’re listening. Removing Optimus from high-end gaming laptops benefits NVIDIA and the consumer directly. NVIDIA already works with notebook vendors on systems that have G-Sync displays built in, which enhances the user experience tremendously. One characteristic all of these notebooks share, however, is that they lack Optimus support, because with Optimus in the middle the GPU cannot speak to the display directly, and without that direct connection there is no G-Sync.

[Image: NVIDIA’s mobile G-Sync implementation]

That’s how NVIDIA is able to get G-Sync working on a notebook in the first place: they recommend a panel capable of variable refresh rates, and they recommend that Optimus not be used, so that they can work their black magic. The end result is a G-Sync-capable laptop that blows others out of the water, with decent battery life and none of the issues caused by applications having to hitch a ride on the Optimus roller-coaster. AMD needs to do a similar thing – take away Enduro graphics switching and offer either an APU driving a 1080p FreeSync display, or an APU with its integrated graphics disabled alongside a discrete GPU and FreeSync.

NVIDIA and AMD are the only companies left in the game that care about the user experience on desktops and laptops, and both invest absurd amounts of time and money into figuring out how to make computing better for the average user as well as the power user or gamer. I can’t expect notebook designers to try to dictate the rules when they’re not the ones making and selling the silicon, so it’s up to NVIDIA and AMD to step up to the plate.

Linux users would also benefit from this change

Who could forget the headlines across the Linux-centred websites when Linus Torvalds, the project lead and maintainer of the Linux kernel, told NVIDIA to go f**k themselves over an approach to graphics and Optimus switching that wasn’t friendly at all to the open-source community, or to the people who maintain and develop the open-source Nouveau drivers? Linus made that brash statement in June 2012, and NVIDIA only got partial, buggy, incomplete Optimus support working the following year. Because of this, laptops that shipped with Optimus weren’t exactly compatible with most Linux distributions.

To this day I don’t know of any laptop that ships with Linux and discrete, switchable NVIDIA graphics. Even getting it to mostly work on an Arch Linux install requires jumping through more hoops than it takes to actually install Arch in the first place. There is a project, Bumblebee, that aims to reverse-engineer how the Optimus driver behaves on Windows, and it works fairly well for most things, but you still have to invoke the GPU switching through a terminal command (optirun or primusrun).

The way the closed-source GeForce driver currently works on Linux requires users to sign out of their session once they’ve selected a preferred GPU in the drivers, and then sign back in for the OS to switch over (or just restart X, which achieves the same thing). Just looking at the notebooks from System76, a company that sells systems configured only with Ubuntu Linux, shows that the only models with discrete GeForce GPUs conveniently don’t feature Optimus.

NVIDIA, Intel, and laptop vendors should make using Optimus optional

That’s honestly the only way this can work: by allowing users to turn Optimus off in the BIOS or through a hardware switch. Hell, be old-school and make us use jumpers on the motherboard. Just give us the option to run whatever we want, however we want, and make it so that switching operating systems doesn’t reduce functionality to the point where Windows is the only option that works properly. The current situation is great for your hardware partners and for Microsoft, but you’re deciding for users how they should use their computers, and I believe there should always be a choice in the matter.

Standing in the way of users and telling them how to use the hardware they’ve bought is unfair. Some laptop vendors even specify in their warranty terms that installing anything other than Windows voids your warranty. Why is this a thing? How is Linux going to break a new Skylake laptop that came with Windows? Apple doesn’t do this; it practically encourages you to install Windows or a Linux distro on its laptops, offering Windows drivers for its hardware and making it easy to switch back to Mac OS X.

So do the right thing, guys. Stop shoving Optimus onto notebooks that don’t need it. These systems already have hours of battery life, adequate and quiet cooling, and play games really well. NVIDIA, your GPUs are already frugal, sipping tiny amounts of power when idle on the Windows desktop. Intel, you don’t need to inflate your GPU numbers in Steam’s hardware survey; you’ve won that race already.

Do the right thing. Make better systems. Ditch Optimus on high-end gaming notebooks. Make your customers happy.