Over the weekend I stumbled upon some interesting posts detailing what some sources claim are the planned specifications for both Microsoft and Sony’s next-generation consoles – codenamed Durango and Orbis, respectively. We’ve known for a while that AMD was going to be the hardware partner – after the huge loss Sony incurred from Cell development, and with Microsoft learning that switching away from IBM could yield more profit, both companies turned to AMD and its APU to get the ball rolling. Previous rumors on the internet were based on reported specifications of early development kits, both hinting at the use of A10-class APUs, more than 2GB of RAM and 250GB hard drives as standard.
But as we’ve seen before, dev kits very rarely ship with final hardware, and some new information that’s been going around hints at what both companies are planning. As with all rumors, though, take it with a pinch of salt, even if there’s a grain of truth to it.
When Microsoft and Sony sat down in their respective boardrooms back in 2003/2005 and worked out the final specifications they were targeting, both were going in very different directions with their hardware – hardware which, at the time, could be considered pretty high-end. The trip back in time that I’m taking you on is, I feel, necessary because history has a habit of repeating itself, and what we’ve seen in the past may play out for the future consoles as well, albeit to a different degree.
The PS3 uses IBM’s Cell chip – one PowerPC core plus eight SPE co-processors, one of them disabled to improve production yields – fitted with 256MB of XDR RAM. The Nvidia RSX (“Reality Synthesiser”) GPU was based on a Geforce 7800GTX, stunted at birth with a low-bandwidth connection to the CPU and given small shoes to fit its enormous feet, kitted out with just 256MB of GDDR3 RAM (note that these were exclusive memory allocations). A good few features were lopped off the original Geforce G70 design to make space for others that were more important, like extra pixel shaders. AnandTech guessed correctly in 2005 that the nature of the RSX GPU meant that it would be bandwidth-limited at 1080p, which is exactly what happened at launch. To this day, most PS3 titles render at 720p or lower.
The PS3 has seen two hardware revisions and three chassis changes since launch. Both Cell and the RSX GPU were originally fabricated on the 90nm process; Cell switched over to 65nm in November 2007 with the 40GB model, still housed in the original “phat” chassis. The PS2 emulation hardware was also cut out at the time, further improving power consumption. The two processors were shrunk again, to 45nm and 40nm respectively, in September 2009 with the new, matte-flavoured PS3 Slim. The top-loading version currently available is the first and likely the last revision Sony will make to its six-year-old behemoth.
Microsoft’s Xbox 360, meanwhile, also featured an IBM chip, but this was instead a triple-core unit, codenamed Xenon. Xenon and Cell are from the same chip family – both are PowerPC designs and, like ARM-based processors, both are RISC architectures. Xenon had IBM’s version of simultaneous multi-threading (similar to Intel’s Hyper-Threading) and took up much less space than Cell, though it was similarly power-hungry. In a way, Durango is Microsoft coming back full circle – the original Xbox was based on x86-compatible Intel Pentium III hardware. While similarly capable to Cell, Xenon often had caching issues because of its meagre 1MB of L2 cache shared between all six threads. IBM’s Cell was likewise haunted by Amdahl’s law, which states that the portion of a workload that cannot be parallelised limits the maximum speedup you can achieve, no matter how many cores you throw at the problem.
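Amdahl’s law is easy to put into numbers. Here’s a minimal sketch – the parallel fractions below are illustrative, not measured figures for any real console workload:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup for a workload where `parallel_fraction`
    of the work can be spread across `cores` cores; the rest is serial."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even with 90% of the work parallelisable, eight cores deliver
# well under an 8x speedup:
print(round(amdahl_speedup(0.9, 8), 2))          # ~4.71

# And no number of cores can push past 1/serial = 10x:
print(round(amdahl_speedup(0.9, 1_000_000), 2))  # ~10.0
```

That hard ceiling of one-over-the-serial-fraction is exactly why piling extra SPEs onto Cell didn’t automatically translate into proportionally faster games.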
The 360′s GPU was produced by ATi (this will be important later on) and was codenamed Xenos. Xenos was based on the ATi X1800 or the X1950, depending on who you speak to. While it was based on hardware that was current at the time, Xenos featured many technologies that were only later adopted in ATi’s desktop family. One such improvement was the use of unified shaders, which tackled space issues with flexible shader cores that could be assigned to either vertex or pixel work as needed – Nvidia adopted unified shaders in its Geforce G80 designs later on to save on die space, but ATi was the first to get them out into the wild.
In a nod to the distant future, many of the features commonly found in a Northbridge chipset were incorporated into the Xenos die – that’s right, the 360 was a very early variant of today’s APU chips. Also important to note is that Xenos wasn’t as specialised as the RSX GPU, because it had one very important requirement: full, or at least near-full, DirectX 9/Direct3D compatibility. Xenos is one of the reasons why ports from the PC to the 360 were easier – much of the programming environment and the hardware capabilities were the same.
The Xbox 360 went through three chassis revisions and currently sits on seven motherboard revisions. The original version was codenamed “Xenon” and featured both the Xenon and Xenos processors made on the 90nm process. Early issues with GPU overheating led Microsoft to redesign the heatsink, which resulted in the first chassis revision and the “Zephyr” motherboard, bundling HDMI for the first time in July 2007. September 2007 saw the “Falcon” revision hit the market, with the Xenon CPU made on the 65nm process and a smaller PSU (there was also “Opus”, with the same power savings but lacking HDMI, used for “Xenon” RMA replacements). “Jasper” moved both the CPU and the GPU to the 65nm process, bringing further power and heat savings along with some onboard memory to allow for Dashboard updates. “Trinity”, released in 2010, integrated the CPU and GPU into a single 45nm die, just like an APU, and introduced the new slim chassis. The currently-shipping “Corona” finally integrates the Southbridge into the same die, essentially creating a system-on-chip with AMD graphics, fitted into a 120W total system power draw.
The Xbox 360 is arguably the more important of the two consoles because it gave us AMD’s APU. AMD’s early interest in buying ATi probably stemmed from this console work, because AMD saw early on how the GPU could be used for more than just graphics. ATi was already incorporating motherboard hardware into the Xenos die to save on space and was laying the foundations of a system-on-chip design; by the time the AMD/ATi merger was announced in 2006, the Fusion APU was already in early development. If you’re paying attention to dates, the “Trinity” Xbox 360 revision launched in 2010, while the first generation of AMD APUs only debuted in 2011 – reading between the lines, the 360 served as a development mule and testing ground for the APU design and for how AMD could fit everything into a workable TDP.
Looked at that way, it’s a significant change in how you may view today’s new speculation, because AMD has used console development to perfect its desktop and mobile products in the past. Eurogamer published an article detailing Orbis, Sony’s codename for the PS4. According to sources leaking information to Eurogamer, both the PS4 and the Xbox Infinity/8/720 will rely on AMD for both the CPU and GPU. Furthermore, both will probably use a low-voltage AMD “Jaguar” octo-core APU clocked at 1.6GHz. Given the die space AMD’s desktop FX octo-cores occupy, I’d say we’re looking at both the CPU and GPU on the 28nm process, but with a significantly larger die, because AMD’s engineers will aim to make this a complete SoC from the start.
The fact that the rumor points to an eight-core CPU suggests that future games could handle multiple threads better, showing gains when more than four cores are available. We’re getting closer and closer to working around Amdahl’s law, with significant portions of game code today allowing for parallelism thanks to most gamers now sitting on a quad-core chip (much like the porn industry accelerated the adoption of VHS and of high-definition content delivered over the internet). IBM’s Cell chip was a step in the right direction, but it was ahead of its time and too power-hungry – today, that’s a different story.
The rumor that the HD7970M mobile chip is the closest thing to what’s in the consoles isn’t surprising either. Both RSX and Xenos were based on previous-generation high-end hardware built for the desktop. Considering how well the HD7970M performs in gaming laptops, it has more than enough muscle to do the job properly inside a console, and it already fits comfortably within the TDP requirements. It’s also made on the 28nm process, making console adoption that much quicker. We’ve seen rumors before that both Microsoft and Sony are targeting games running at 1080p with 3D enabled and framerates capped at 60FPS. You might think that’s impossible until you view this video:
That’s the Alienware M17x R4 running Crysis 2 on Intel’s quad-core Core i7-3610QM and the AMD Radeon HD7970M GPU. The Very High preset runs the game at 60FPS almost consistently, while Extreme occasionally dips into the 40FPS range during things like water effects or heavy firefights. Take that same hardware, shove it into a console with a closed software stack, and optimisation could yield performance gains of 50% or more. It irritates me to no end when PC fanboys complain that console games have bad graphics without taking into account the age of the hardware, or the fact that on release both consoles were as capable as many mainstream gaming computers. You certainly can’t grab a PC from 2005/2006 and run Battlefield 3 on it like you can with both consoles today.
There was another round of rabid speculation from console fans following a post on Reddit by a user named KR4T0S, who claims to be a developer with inside information under NDA. Some of his/her stories and theories match what Eurogamer was presenting, but KR4T0S goes a little further, suggesting that Sony kitted out its console with faster RAM for better gaming performance, while Microsoft opted for more, but slower, RAM to let the console fulfil other software functions. This may have something to do with the rumor that Kinect 2.0 is being integrated, as well as the other rumor about Microsoft adopting some embedded form of Windows 8, complete with a browser, social networking capabilities and improved Dashboard functionality.
Sony’s performance aspirations, as claimed by KR4T0S (and taken with a pinch of salt), may have more to do with the PS4 again serving as the centre of a home media network. The company is actively pursuing the Digital Cinema 4K standard in its television lineup and will probably be the first to actively market 4K movies and media, both on Blu-ray and through the Sony Entertainment Network, its reworked digital distribution platform. If you recall, the company also recently bought out the cloud gaming service Gaikai, whose services may end up on the PS4 if the company sees a way to make it work.
In the end, though, it’s the price that wins out, right? Both may launch at a $500 RRP, and you might be forgiven for thinking that at that kind of price, it’s unreasonable to expect both companies to shove in high-end hardware. However, even though the HD7970M only shows up in laptops costing north of twelve grand, those are low-volume products priced that high to ensure that profits are made and R&D is paid off. If AMD sells the custom APU to both companies at bulk discount prices, it could easily make enough money to accelerate pushing its desktop chips to the 28nm process ahead of Steamroller’s release. If all my conjecture (based on rumors, mind you) turns out to be somewhat close to the truth, it may even positively impact the rollout of Excavator and AMD’s plan for a unified socket.
It’s all or nothing at this point, and I expect AMD to succeed, regardless of the financial and product-related setbacks it has suffered over recent years. A cash injection like the one it could be getting here would be welcome; it might even keep the company afloat and put it back in the performance game.
As for Sony and Microsoft, they’re both looking to give their fans the best value possible, and if AMD is capable of delivering as promised, it’ll be us, the gamers, who benefit most of all. Vive la console!
Discuss this in the forums: Linky