It's the end of the year and things are running full steam ahead into 2015. In the world of technology, time waits for no man – it's always moving forward, yielding new products and hardware for people to swoon over for ten minutes before the next big thing comes along. While 2014 was important for hardware vendors for a number of reasons, 2015 promises to be the end-game for several plays that vendors made this year. Let's dive into some of them as we see off 2014 for good.
3D NAND and the rising Korean threat
Samsung's reveal of the 850 Pro came alongside a complete surprise – a big jump in memory technology. Instead of making NAND flash memory dies separately and stacking them on top of one another, all delicately connected with tiny, tiny wires, Samsung stacked the memory cells vertically, punching holes through the chip at particular points and stacking more layers on top until they reached around 32 of them. Then they filled those holes with a rod to carry electrons to the flash memory, coated the rods with other substances, and added a sprinkling of magic.
As strange as this may sound to some of you, that change in flash memory is by far the biggest feather in Samsung's cap this year. Thanks to its 3D NAND flash, the 850 Pro dominated benchmarks with absurdly predictable read and write speeds, and it's much, much cheaper for Samsung to manufacture as well. They're producing it on a 40-nanometre process – eons old in computer terms, and therefore cheap and mature – and they still have every smaller process node to run through before 3D NAND needs to be replaced.
That's not all. Samsung's 850 Evo, recently released to overwhelmingly positive reviews, uses the same flash memory, and Samsung clearly knows far more about it now than before: the latency in the 850 Evo is so low it can run circles around the 850 Pro. While other memory vendors are working on similar technology, they're about ten months away from a commercial product. If Samsung puts this onto a PCI-E SSD to combat Intel's P3700, they'll claim the performance throne by a country mile.
Four cores is now the baseline
Another interesting and welcome trend for 2014 was in the CPU market – two AAA games from two big, respected studios and publishers imposed a hard-coded restriction on hardware support. Far Cry 4 and Dragon Age: Inquisition don't play nice on a system with a dual-core processor, and that's a big move to make for two franchises that have traditionally run on very cheap hardware.
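The kind of hard-coded restriction described above boils down to a simple gate at startup. Here's a hypothetical sketch of what such a check might look like – the threshold and messages are made up for illustration, and neither game's actual launcher code is public:

```python
import os

# Hypothetical minimum imposed by a launcher, as described above.
MIN_CORES = 4

def meets_cpu_requirement():
    """Return True if the machine reports enough logical cores."""
    cores = os.cpu_count() or 1  # os.cpu_count() can return None
    return cores >= MIN_CORES

if meets_cpu_requirement():
    print("CPU check passed")
else:
    print("This game requires a quad-core CPU or better.")
```

Note that a logical-core count lumps dual-core CPUs with Hyper-Threading in with true quad-cores, which is one reason such checks tend to be contentious.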
It's a hard play to make, and it's possible that EA and Ubisoft let this one out of the bag because they can stomach any temporary losses. Doing this on something bigger like Assassin's Creed or Battlefield would be a disaster, because more people would have to hold off on a purchase while they upgrade their rigs. Activision is going to have to make a similar play with the next Call of Duty, and that's going to be a painful one to run through – sales will most definitely be lower than Advanced Warfare's.
On the plus side, this is good for gamers and the industry as a whole. With quad-core processors fast becoming the minimum requirement, we can expect games to get bigger and more detailed – and if you've recently played Far Cry 4, you'll understand just how big a change we can expect in terms of fidelity.
With better hardware support, there's a chance that more gamers will buy into higher-resolution monitors, which in turn will benefit the graphics industry because gamers will need more muscle to run games at those resolutions. Crysis kicked off a hardware upgrade cycle because it was so graphically intensive and beautiful to look at. This time, it's up to Far Cry and Dragon Age to push things forward and get people ready for the coming change.
DDR4’s launch goes off without a hitch
It's weird – whenever there's a switch to a newer, better memory standard, something always breaks. Thanks to better planning from Intel and the memory vendors, the launch of Haswell-E and DDR4 with it brought few problems to the fore. Everyone who bought an X99-based system is having a fun time with it, and people overclocking DDR4 for records and competitions don't seem to have much trouble with it either.
Hell, while there were people calling the switch to DDR3 a disaster all those years ago, they aren’t saying much these days. It is completely weird. WHERE’S THE CATCH, PEOPLE? Where are the angry mobs saying that DDR4 is a failure? I know that Intel’s memory controller isn’t running at full speed but COME ON – you have to be angry about something!
GPU architectures are not running out of bandwidth yet
A big trend that will stretch through 2015 is that of AMD and Nvidia trying to do more with less. Specifically, both GPU vendors are skimping on memory bandwidth and memory subsystem design in order to open more room for profit and improvement elsewhere in their design. We can’t expect the graphics industry to suddenly move toward 512-bit buses and giant chips in order to fuel the 4K revolution and we certainly can’t rely on running multiple GPUs to make up for the performance deficit. While we need better power efficiency, we also crave more performance. Something’s gotta give.
That thing, as it turned out, was colour compression. In both the Radeon R9 285 and all Maxwell-based GPUs, colour compression is used to lower the amount of bandwidth needed to run games. The technique compares the output of two or more consecutive frames and looks for blocks of pixels which conform to a colour pattern. If the colour pattern is addressable, the GPU doesn't save per-pixel colour information; instead, it describes the pattern the pixels must conform to and lists the colours needed for specific parts of it. The savings amount to somewhere in the region of 50% in terms of bandwidth and/or actual VRAM usage.
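To make the idea concrete, here's a toy sketch of pattern-plus-palette compression. It's purely illustrative: the real hardware schemes from AMD and Nvidia are proprietary, and the "pattern" here is simply a block whose pixels use at most two colours, stored as a palette plus a per-pixel index instead of full colour data:

```python
def compress_block(block):
    """Try to describe a block of pixels as a pattern plus a palette.

    If every pixel in the block matches one of at most two colours,
    return (palette, indices): the colour list plus a per-pixel index
    into it. Return None when the block has too many colours to fit
    the pattern, in which case it would be stored uncompressed.
    """
    palette = []
    indices = []
    for pixel in block:
        if pixel not in palette:
            if len(palette) == 2:   # too many colours: pattern not addressable
                return None
            palette.append(pixel)
        indices.append(palette.index(pixel))
    return palette, indices

def compressed_size(block):
    """Bytes needed to store the block, with or without compression."""
    result = compress_block(block)
    if result is None:
        return len(block) * 4               # 4 bytes per RGBA pixel
    palette, indices = result
    # 4 bytes per palette entry, plus one index bit per pixel
    return len(palette) * 4 + (len(indices) + 7) // 8

# A flat 4x4 block of sky blue: 64 bytes uncompressed, 6 bytes compressed.
sky = [(135, 206, 235, 255)] * 16
print(compressed_size(sky))  # prints 6
```

Large flat regions (sky, fog, UI panels) compress enormously well under this kind of scheme, while noisy blocks fall back to full per-pixel storage – which is exactly why the savings vary so much from game to game.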
The effectiveness of such a trick depends on the game you play. Games with lots of patterns that can be recognised and compressed see the biggest performance improvements, and scenes where some elements are static while others vary help even more. In Bioshock Infinite, Booker's hands are always in view, so the pixels that make up his hands can have colour compression applied to them frame after frame.
In a game like Watch Dogs, however, there's less benefit, and it's mostly down to game design. Watch Dogs uses double-texturing to save on memory bandwidth: objects far away in the game world have lower-quality textures applied to them, which are swapped out for higher-resolution ones as you approach. That swap forces a redraw, so colour compression only saves you a little memory bandwidth.
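The texture-swapping behaviour described above can be sketched as a simple distance check. The names and the threshold below are invented for illustration – this isn't Watch Dogs' actual streaming code:

```python
# Hypothetical distance threshold at which the low-res texture is used.
LOD_SWAP_DISTANCE = 50.0  # metres

def select_texture(distance, hi_res, lo_res):
    """Pick which texture an object should be drawn with this frame."""
    return lo_res if distance > LOD_SWAP_DISTANCE else hi_res

def update_object(obj, camera_distance):
    """Swap textures when the object crosses the threshold.

    A swap invalidates the previously drawn (and compressed) pixels and
    forces a redraw, which is why colour compression saves less here.
    Returns True when a redraw was triggered.
    """
    chosen = select_texture(camera_distance, obj["hi_res"], obj["lo_res"])
    redraw = chosen != obj["current"]
    obj["current"] = chosen
    return redraw

crate = {"hi_res": "crate_2048.dds", "lo_res": "crate_256.dds",
         "current": "crate_256.dds"}
print(update_object(crate, 10.0))   # True: camera got close, texture swapped
print(update_object(crate, 12.0))   # False: no swap, no forced redraw
```

Every crossing of that threshold rewrites the object's pixels from scratch, so the compressed blocks from the previous frames never get reused.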
What does this mean for you lot, though? Mostly, you can expect both AMD and Nvidia to go back to using 384-bit memory buses on their flagship cards. If their next-generation cards can save as much memory bandwidth as the current ones do, there's no reason to build a GPU die large enough to fit a 512-bit bus. That means these flagships will also be cheaper to make and cheaper for you to purchase at the end of the day. Yay!
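The back-of-the-envelope maths shows why a narrower bus plus compression is attractive. The clock figures below are illustrative, roughly in line with 2014-era GDDR5 cards:

```python
def bandwidth_gb_s(bus_width_bits, effective_clock_ghz):
    """Peak memory bandwidth: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * effective_clock_ghz

wide = bandwidth_gb_s(512, 5.0)    # a 512-bit bus at 5 GT/s: 320 GB/s
narrow = bandwidth_gb_s(384, 7.0)  # a 384-bit bus at 7 GT/s: 336 GB/s
print(wide, narrow)

# If colour compression stretches the narrow bus by ~50%, its effective
# throughput leaves the big 512-bit design well behind.
print(narrow * 1.5)  # 504.0 GB/s effective
```

The narrow bus already wins on raw numbers at those clocks, and the compression multiplier widens the gap – without the die area, routing complexity, and cost of a 512-bit memory interface.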
However, I urge you all to pick up the latest copy of NAG and read Neo’s column on the same subject. Despite these colour compression tricks and others hiding up AMD and Nvidia’s sleeves, there’s still no real replacement for raw memory bandwidth. The balancing act Nvidia and AMD will have to do will set the tone for how quickly the market adopts UltraHD 4K monitors for gaming purposes.
2014 was a terrible year for games and a horrible year for important issues like net neutrality and million-Rand cattle kraals, but for the hardware industry it was a great one indeed. With all these changes rolling about, 2015 is set to become packed to the brim with new advancements and the race to your wallets never lets up. Roll on 2015!