If I’ve learnt anything from the last few years writing for this site, it’s that AMD has a pattern to their product improvements. When AMD bought out ATi in 2006, the latter company had just released the X1900 series and things were looking up nicely. Fast-forward past the HD3000 series and AMD was pretty much on par with whatever Nvidia was pushing out, and the HD3870 still carries the honour of pushing the company out in front in benchmarks, at least for half a year. While the HD4000 series was still kicking ass halfway through 2009, they introduced a “wild card”, the HD4770. It was the blueprint for the eventual HD5000 series: it let AMD refine its handling of GDDR5 RAM, and gave them a chance to sell a 40nm GPU to gauge public perception and interest. At that point 40nm wasn’t quite ready for mass production, but AMD got the ball rolling with that single product anyway.
The HD4770 was a cut-down version of ATi’s HD4850, but took a cue from the HD4730 by limiting the bus width to 128 bits while keeping most of the HD4850’s innards intact. It was more efficient than the HD4830 and, with some light overclocking, could be made to go much faster than its bigger brothers. When AMD came out guns blazing with the HD5000 series, they brought with them the experience gained from the HD4770, and we all know how well that tactic worked. While weird and experimental, the HD4770 could be considered every bit as important to enthusiasts as the HD3870 was back then.
“Bonaire”, the family the HD7790 belongs to, is likewise important for AMD, not only because it brings a few things in line with what Nvidia’s planning, but also because it shows how they’re going to optimise the GCN design for now while they work their way towards the eventual commercial release of their next GPU family. I reported some time back that the company announced that its HD7000 family would be “stable throughout 2013”, which some people, myself included, took to mean that there would be a year without any new products. Well, guess what? It’s nearly April, the same month in which the HD4770 was released, and we have something almost new from the red team.
But first! A nod to the future…
In another piece I did this year, I noted that when you compare the execution of the various hardware revisions of the Xbox 360 console and tie them up with AMD’s 2011 release of the socket FM1 Llano APUs, it lines up perfectly. AMD has been known to use experimental hardware in the market to get things ironed out, and the 2010 integration of the Xbox’s Xenon and Xenos chips into a single package, ahead of the first commercially available APU, was an important milestone for the company – it showed that both the APU and its eventual goal, introducing the market to Heterogeneous System Architecture (HSA), were possible. Without a development mule like the Xbox console, I’m sure the company would have soldiered on with their plans, but would have been at best two years behind on implementation. We would still be stuck with the Phenom and Athlon processor lines and no way to catch up with Intel in the near future.
Technically, Bonaire belongs to the Sea Islands family because it’s an improvement on the Southern Islands family that makes up the rest of the HD7000 series. Sea Islands didn’t bring the promised new architecture or improvements to HSA – for that to happen today would require a massive change to AMD’s lineup, which would be expensive to implement – but hey, now the company is conveniently promising the same APU as the one found in the PS4, just with a cut-down GPU. Bonaire XT’s stats, incidentally, almost eerily match up with the proposed stats of the PS4, which offers 1.84 teraflops of floating-point throughput. Pair that up with, say, a Jaguar-based eight-core processor with eight floating-point units and one could easily have something on-hand that’s very similar to the PS4’s final hardware.
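To show how close those numbers actually are, the peak single-precision figure for a GCN part is just shader count × 2 FLOPs per clock (a fused multiply-add) × clock speed. A quick sketch, using Bonaire XT’s known specs and the then-reported PS4 figures (1152 shaders at 800MHz – treat those as the rumoured numbers, not confirmed silicon):

```python
def sp_gflops(shaders, clock_ghz, flops_per_clock=2):
    """Peak single-precision throughput in GFLOPS:
    shader count x FLOPs per clock (2 for a fused multiply-add) x clock."""
    return shaders * flops_per_clock * clock_ghz

# Bonaire XT: 896 shaders at 1.0GHz
bonaire_xt = sp_gflops(896, 1.0)    # 1792 GFLOPS, i.e. ~1.79 teraflops
# PS4 GPU as reported at the time: 1152 shaders at 800MHz
ps4_gpu = sp_gflops(1152, 0.8)      # 1843.2 GFLOPS, i.e. ~1.84 teraflops

print(bonaire_xt, ps4_gpu)
```

Within about three percent of each other – close enough that the family resemblance is hard to dismiss.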
Seriously, I’m not kidding. Go compare the HD7790 to the HD7970M. They’re not that far apart, and considering that my theory, like the rest of the industry’s, was based on AMD using a modified HD7970M, I really think I’m on to something here. Can I say, “I told you so!” when I’m right? We won’t see an exact match of the HD7790 in the PS4, but it might be something similar.
In addition, AMD’s roadmap for HSA improvements for this year calls for three things to change on an architectural level. See that first entry? That’s the PS4 right there, because it has 8GB of shared GDDR5 RAM that can be used by both the CPU and GPU. In fact, all three of those design aims could be accomplished in the PS4 alone, accelerating AMD’s plans and putting it well on the way to 2014’s improvements, where they plan to have hardware decide whether a workload – the GUI, or whatever else – runs on the on-chip GPU or the discrete one. That is pretty damn significant. We’ve already had the power management improvements to the APU family thanks to Trinity, and now Richland improves that even further. This year is a very busy one for AMD.
So what’s changed on Bonaire?
The Bonaire XT chip, as it’s now being referenced all over the internet, ships by default with 896 shader cores, 16 ROPs and 56 texture units, a default core clock speed of 1GHz and GDDR5 VRAM shipping at a stock effective speed of 6GHz. Compared to the HD7850, it has about 87.5% of that card’s shader and texturing power, but it’s reined in by a 1GB frame buffer at launch and a 128-bit bus. It would be nearly as capable as the HD7850 were it not for the smaller bus width; as it stands, it lands at least 20% faster than a Radeon HD7770. It’s very similar to the GTX650 Ti, but really it’s the in-place upgrade for the HD6870.
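That 128-bit bus is the real limiter, and you can see why with some quick arithmetic: memory bandwidth is just bus width (in bytes) multiplied by the effective data rate. A sketch comparing the two cards, using the stock clocks above and the HD7850’s reference 4.8GHz effective memory speed:

```python
def bandwidth_gbs(bus_bits, effective_rate_gbps):
    """Peak memory bandwidth in GB/s:
    bus width in bytes x effective GDDR5 data rate per pin."""
    return (bus_bits / 8) * effective_rate_gbps

hd7790 = bandwidth_gbs(128, 6.0)   # 96.0 GB/s on the 128-bit bus
hd7850 = bandwidth_gbs(256, 4.8)   # 153.6 GB/s on the 256-bit bus

print(hd7790, hd7850)
```

So even with faster memory chips, the HD7790 only gets about 62% of the HD7850’s bandwidth – which is where most of the performance gap comes from.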
What I find most intriguing is that in most markets, the HD7770 and the GTX650 Ti are priced about equally. In South Africa we have a rather unique situation in that most of the AMD products are much cheaper than their Nvidia counterparts, and there’s a gulf of about R800 between Sapphire’s HD7770 and the cheapest HD7850, a 1GB version also made by Sapphire. The HD7790, when it arrives, should fill in the new gap between the HD7770 and the HD7850 2GB and settle in at around R1800 as a starting point. That’s smack bang next to the various GTX650 Ti versions available, and locally it’ll be a tighter match than overseas.
Finally… meet the HD7790, in the flesh
The HD7790’s die measures just 160mm² – larger than the HD7770 at 123mm², but smaller than the Pitcairn-based HD7850, which weighs in at 212mm². That extra space goes to the extra 256 shaders and 16 texture units over the HD7770, which accounts for the 500 million-odd transistors that have been squeezed in. Mind you, that comes with only a slight 5W increase in TDP. It is definitely closer to the HD4770 than AMD will admit.
The reference boards with the longer PCBs are actually just using the ones that were reserved for the HD7850. All that extra space to the right goes unused, for now. I expect that some companies like ASUS might make use of that space, fitting in some extra VRM circuitry for a custom chip design. There also won’t ever be more than four memory chips on any Bonaire GPU for the time being – currently there are four 2Gb Hynix memory modules, although we could see double-density 4Gb chips in the near future. Bonaire also won’t move beyond a 128-bit bus because it wouldn’t be financially feasible, and the dual-channel bus layout currently benefits AMD’s third-party partners because they can use existing stocks to build it up. For the first time in I don’t know how long, Nvidia and AMD now use the same memory chips in their cards (it still counts if it’s only one product line).
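The chip-count arithmetic explains the 1GB launch limit and where the 2GB cards come from: with only four chip placements on a 128-bit bus (32 bits per chip), frame buffer size is entirely dictated by chip density. A quick sketch:

```python
def frame_buffer_gb(chips, density_gbit):
    """Total VRAM in GB: chip count x per-chip density in gigabits,
    divided by 8 bits per byte."""
    return chips * density_gbit / 8

launch_cards = frame_buffer_gb(4, 2)   # four 2Gb Hynix chips -> 1.0 GB
later_cards  = frame_buffer_gb(4, 4)   # double-density 4Gb chips -> 2.0 GB

print(launch_cards, later_cards)
```

So the rumoured 2GB Bonaire boards don’t need a new PCB or a wider bus at all – just the denser chips.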
On the smaller PCBs that chop off the unneeded bit, the PCI-E 6-pin PEG connector is rotated, now facing towards the front of the chassis rather than to the side. There’s a lot of space saved here and I think we’ll see enthusiasts plop this in all manner of chassis. In most cases, it’ll be a better fit than any HD7770 available right now.
Hey man! Where’s the scores already?
We’re getting there, but there’s one more thing. I mentioned earlier that Bonaire starts bringing AMD in line with technologies Nvidia is currently using in the GTX Titan. You remember that Titan’s overclocks are tied to its voltage? It’s a similar story here, only where Titan has fifteen voltage presets to dynamically switch between, AMD has given us just eight.
When they introduced PowerTune in the HD7000 series it was meant to be a direct feature match to the GTX600 series, with the drawback that it didn’t work as well as it should have. Part of the stuttering problem with today’s Radeon cards in games that haven’t been properly patched is that while the GPU runs the game, it switches between pre-configured PowerTune states many times, but doesn’t keep clocks at a high level by default. You can get around it by fiddling with P-states in the drivers and using AMD Overdrive, but the reality is that most users won’t touch any of that, even when they know they’re missing out on extra performance.
The new version of PowerTune switches dynamically between eight pre-configured P-states based on the starting clocks set in the drivers. You can set limits on the highest DPM state, and you can manually override the driver settings that limit voltage levels. It analyses usage levels every 50ms and switches to the appropriate profile in about 10ms. That means that when you enter a new area in a game that requires more power, there’s up to 60ms of lag – as much as 50ms waiting for the next evaluation, plus roughly 10ms for the switch itself – after which the card keeps re-evaluating and stepping up or down until it’s satisfied that your needs are met. In contrast to the previous implementation, the PowerTune feature in the HD7790 will no longer add in more voltage than is necessary, for both underclocking and overclocking.
In addition, PowerTune will now automatically boost to the frequencies you allow the card to overclock to (which is now called the “Effective Overclock” setting in CCC). By default, the highest DPM state (in red) is whatever clock speed the card starts with, but you can set that and the voltage as high as you feel comfortable, and the card will boost to that level if temperatures are in check. Unlike Nvidia’s implementation, which changes clocks dynamically according to power draw and GPU usage, this one does it based on temperature readings. For comparison, Tahiti had just four DPM states, as shown in the second diagram above. Much more efficient.
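The behaviour described above can be sketched as a simple state machine: eight (clock, voltage) pairs, a periodic evaluation, and a temperature-first rule for stepping up or down. To be clear, the state table, thresholds and selection rule below are invented for illustration – AMD hasn’t published the firmware logic – but the shape of it is what the slides describe:

```python
# Hypothetical DPM state table: (core clock in MHz, voltage in V),
# lowest to highest. Eight states, as on Bonaire.
DPM_STATES = [(300, 0.85), (450, 0.90), (600, 0.95), (700, 1.00),
              (800, 1.05), (900, 1.10), (950, 1.15), (1000, 1.20)]

TEMP_LIMIT_C = 95  # assumed thermal ceiling, not an official figure

def next_state(current, gpu_busy_pct, temp_c):
    """Pick the next DPM state index at each ~50ms evaluation:
    thermals take priority, then load decides up/down/hold."""
    if temp_c >= TEMP_LIMIT_C:
        return max(current - 1, 0)                     # too hot: throttle down
    if gpu_busy_pct > 90:
        return min(current + 1, len(DPM_STATES) - 1)   # heavy load: ramp up
    if gpu_busy_pct < 30:
        return max(current - 1, 0)                     # idle: relax clocks
    return current                                     # moderate load: hold
```

Because the card only moves one state per evaluation, a demanding scene is met by a staircase of 50ms steps rather than one jump – which is exactly why the worst-case response figures above are multiples of the polling interval.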
Now, on to the games…
As usual, I’m using results from TechReport, as they’re one of the only websites out there, apart from PC Perspective, that gives an accurate picture of frame-time latencies. In Crysis 3 (medium settings, 1080p), in comparison to the 1GB HD7850 and the GTX650 Ti, you can see that the game hammers all three cards because it loves both high bandwidth and lots of texture units. If you haven’t played the game yet, you really should give it a look – it’s fantastic. There seem to be several parts in the test where the HD7790 drops from just below 35FPS to around 25 frames per second, but I guess that’s down to the 128-bit bus. You can see that the GTX650 Ti dives down to the same levels, just not to the same extremes in some instances. Driver optimisation is what’s needed here.
Tomb Raider only released a few weeks ago but already it’s been noted that it’s a strenuous title and can bring systems to their knees at Ultimate settings. TechReport tested at 1080p with High settings and as you can see, it was a pretty smooth experience. There’s very little stutter and for the most part the GTX650 Ti pulls ahead, but for a fresh product without driver improvements, the HD7790 holds itself up pretty well.
Borderlands 2 shows a strange finish for the HD7850 and as TechReport noted, this could have been due to the beta driver that was shipped with their review samples messing up the scores of older cards. The HD7850 stutters all over the place (the HD7770 was worse) but the HD7790 and the GTX650 Ti finish neck-and-neck for most of the test. Once again, the Ti is the better solution currently, but with a little driver optimisation the HD7790 could pull ahead. I’d also like to note that this was tested at 1080p with an Ultra-high draw distance and nearly everything else turned on. These cards with 128-bit buses are handling themselves pretty well, all things considered.
Sleeping Dogs finally shows off a win for the HD7790. Indisputably, the GTX650 Ti’s performance is rather lacklustre, and this is a game that’s been out since August last year. The older HD7850 pulls in well here, but you can see that AMD’s drivers haven’t been fixed to the point where they can figure out the memory manager. Here the HD7790 outdoes its larger sibling, and it’s rather surprising that it’s cheaper, to boot.
Here we get to the other end of the spectrum – games that are played by so many millions of people that it’s not possible for them to not get some optimisation love. Skyrim runs close enough to perfect with little to no frame stutters and both generations of Radeons perform really well here. Take note that there are places where the Geforce stumbles, while the HD7790 carries on as normal. Skyrim seems to favour Radeons now after over a year’s worth of patches and driver tweaks.
Say what you want about Battlefield 3, but DICE’s deep optimisation makes it accessible to a lot of people, provided you have Windows Vista 64-bit at the very least. Mind you, it’s not very indicative of the kind of performance you’d expect in a multi-player match, since all of the Battlefield 3 benchmarks that tech sites run are GPU-limited. That basically scales it down to which company does better optimisations for the game, not who has the better hardware.
Benchmarking Battlefield 3’s single-player component is pretty much like having a masturbating contest with your friends. You know what the outcome’s going to be: some cards finish slightly quicker or slower than others, but in most cases it’s the same experience and the same results no matter which cards you choose to throw into the ring. Unless you’re trying to run it on a Pentium 4 and a Geforce GT620, I’d say the chances are good that most people are able to fire up the single-player campaign.
I don’t know a single person who bought it for the storyline, anyway. DICE should just include an offline match mode in Battlefield 4 where you can play against and with 32 bots in a reasonably-sized map. That’s more of a real-world test than what we get given today.
Anandtech puts the compute capabilities of Bonaire XT into something more familiar and runs it through the benchmark for Civilisation V and the Folding@Home benchmark. Even without overclocking, the HD7790 is streets ahead of the GTX650 Ti and the card it’s essentially replacing, the HD6870. I noted before, though, that the HD7790 doesn’t outrun the HD6870 outright, and you may find that they’re mostly equal in many reviews around the net. But it’s definitely better than the HD7770 – just look at the HD7770’s scores for explicit single-precision math next to the HD7790’s. All those folders who just bought HD7770s for the purpose of running Folding@Home all day long may want to take them back. This is a much better card for the money.
Temperature and power consumption are also way down…
Power consumption is down in TechReport’s findings, at least at load, slotting in neatly between the HD7770 and the GTX650 Ti. At idle things are a little high, but what’s seven watts between friends, considering this is a product practically made to cash in on whatever expertise AMD gained in making the PS4? Still, room for future improvement. I think the GPU load temperatures are the most interesting. It’s a larger chip than the HD7770, consumes slightly more power, has more texture units and higher clock speeds, but it runs cooler? That’s sweet.
Most coolers will perform at the same level, although I’m guessing that designs like the DirectCU II and MSI’s Power Edition will do a much better job. Considering how it’s performed thermally, it’s an easy fit for those small ITX chassis that pack everything in neatly and have, like, one fan feeding everything. I expect it to be very popular with modders and even in rigs where people have the madness to put two of them in Crossfire – according to other reviews on the internet, especially the Crossfire one done by TechPowerUp, two of these babies pull right up next to a vanilla HD7970. I’d love to see some frame latency times for that.
So what does this all mean?
Well, two things. One, there’s no longer any reason to opt for the HD7770 once this thing lands, since the HD7790 is much better and much closer to its bigger brother. I suppose the same could be said of the GTX650 Ti, but there’s still merit in siding with the green team. For sure, those of you who were eyeing a HD7850 1GB should look at the overclocked HD7790s instead. While the 2GB Bonaire XT GPUs won’t show up until the end of May or mid-June, that’s not long to wait if you’re coming from a much older card, like a HD5700-series card. Certainly, those of you with a HD4700-series card would do well to grab one of these.
And two, I guess this is another turning point for AMD. See, what we have here is a shrunken HD6870. Its die is 40% smaller, it uses nearly half the power, it runs half as hot and it’s nearly $100 cheaper at launch. Therein lies the rub – if you take two of these and put them together, that’s twice the speed of a single HD6870, no? Two HD7790s match up to a single HD7970. What if, in the next generation, AMD just doubled up on a revision of Bonaire and released that for $300? Kepler is still complex enough that fitting it into a similar package with the same constraints would take some time, so Nvidia would take longer to respond.
At this early stage of what looks like the beginnings of the move to the HD8000 family, it’s anyone’s guess how things will turn out. AMD could be trolling all of us and distracting its rival into looking in another direction while it pulls a fast one. On the flip side, this may be a blueprint of their plan for the price/performance targets for the HD8000 family. Whatever the case may be, the arrival of the HD7790 is a welcome one…
…If only for the free copy of Bioshock Infinite. God, that game is going to rock so hard.
Discuss this in the forums: Linky