AMD's HD7970 launched to much fanfare on the 22nd of December 2011. It was a paper launch, with only a few units available owing to supply issues at TSMC. At the time, Nvidia was still wrestling with die yields on Kepler, while Intel was furiously fabbing away in a closed room at 22nm. The race for efficiency and superiority, together with AMD's aggressive rollout plan, gave the company four full months of a free market between its launch and Nvidia's competing product. In all that time, whispers and rumors abounded that AMD was once again making a dual-GPU product, but nothing came to fruition.

AMD Radeon HD7990 Malta

Even at the launch of Kepler, and later the GTX690, the Titan and AMD's own HD7790, no dual-GPU card from the chip designer was shown. We had seen previous examples of the HD7990, codenamed "Malta", in a couple of images where AMD was previewing other products, but nothing else was said about it. Taking matters into their own hands, third-party partners like PowerColor, ASUS and HIS did their own thing and went about creating their own dual-GPU monsters.

Cards like the ASUS Ares II, the PowerColor Devil 13 and HIS HD7970x2 were brilliant products in their own right, but they were still not based on a reference design. They were all done with AMD’s blessing and assistance, but they weren’t technically the real thing.

At the time, AMD cited the lack of bridging chips from PLX Technology, like the PEX 8747 used on the GTX690, as the main reason why it was not producing a dual-GPU reference design, in addition to the company's belief that the market wasn't right for that sort of product just yet. But things have changed since then and they've recently launched the one we've all been waiting for – the HD7990!

But can it run Crysis 3?

HD7990 Crysis 3

One thing that's been hyped up for a while now, though, is its gaming credentials. In the media slides sent out today AMD confirms that, yes, the HD7990 will run Crysis 3 maxed out at 4K resolution, which is roughly four times as taxing as 1080p. That's no light claim either, because the Digital Cinema 4K standard is what's effectively going to replace 1080p in about five years' time. A lot of things need to change before that resolution becomes easy work, and graphics horsepower is one component.
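To put that "four times" figure into rough numbers, here's a quick back-of-the-envelope check. It assumes the 3840 x 2160 UHD figure; the DCI cinema standard is slightly wider at 4096 x 2160.

```python
# Pixel counts per frame: 4K has roughly four times as many pixels to shade as 1080p.
full_hd = 1920 * 1080   # 2,073,600 pixels
uhd_4k = 3840 * 2160    # 8,294,400 pixels
dci_4k = 4096 * 2160    # 8,847,360 pixels (Digital Cinema 4K)

print(uhd_4k / full_hd)   # 4.0
print(dci_4k / full_hd)   # ~4.27
```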

Then there's the Battlefield 4 connection. We reported on the reveal of a 17-minute trailer that showed real-time gameplay, and it looked and ran smoother than anything I've ever seen before. We were later told that the demo was running on two HD7990 graphics cards and that the game was actually rendered at the full 4K resolution – what you were seeing on YouTube was a downscaled 1080p version.

There was a whole other three quarters of DICE's beautifully rendered world that you couldn't make out because it was too small to see.

Then there are the game bundles. AMD's Never Settle bundle has been making lots of waves for the company and the retailers participating in the promotion, so much so that even I recommend that people seriously consider the value they're getting from the free game licenses. Yesterday AMD confirmed that the HD7990 will receive the full Never Settle bundle currently available.

Possible new Never Settle titles

But what of the leaked Gaming Evolved roadmap that may or may not be true? If the rumor holds any water, I expect AMD to seriously consider at least giving GRID 2 away for free to owners of the HD7990 who have redeemed their codes. But in the war against a brand as strong as Geforce, they may need a much bigger advantage than just performance, and game bundles will play a significant part in that.

Onto the card…

AMD Radeon HD7990 Malta

It is a beauty. The dual-slot cooler houses three 90mm fans that cool the card down better than two physical HD7970s in Crossfire. It's a smattering of red and black, and this will be the same on all the versions that third parties release for now – later on we'll see water-cooling blocks, the Windforce 3X cooler and even MSI's TwinFrozr, perhaps with a pinch of Vapor MG spice added to it. It's quieter as well, well under the whine of a reference blower design. In its own way, it's every bit as attractive as the industrious and beautiful metal of the Geforce GTX690.

AMD Radeon HD7990 Malta bare board and backplate

The two Tahiti XT cores, the same found in the Radeon HD7970 and the HD7970 GHz Edition, lie on opposite sides of the board, joined by PLX Technology's PEX 8747 bridging chip. The bridging chip provides a PCI-Express link between the two GPUs, with forty-eight lanes of PCI-E 3.0 goodness and oodles of bandwidth unhindered by other connections. The PEX chip provides 16 lanes of connectivity to each GPU and a further 16 to the PCI-Express slot for interfacing with the rest of your system. Nvidia uses the exact same layout on the GTX690, putting them both on equal footing in this regard.
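To put "oodles of bandwidth" into rough numbers, here's a minimal sketch using the standard PCI-E 3.0 spec figures (8 GT/s per lane with 128b/130b encoding). These are general spec numbers, not anything AMD has published about this specific board.

```python
# Rough per-direction bandwidth of a 16-lane PCI-E 3.0 link, which is what the
# PEX switch gives each GPU (and the slot). Standard spec figures only.
transfer_rate_gt = 8.0    # PCI-E 3.0 signalling rate per lane, GT/s
encoding = 128 / 130      # 128b/130b line encoding overhead
lanes = 16

gb_per_s = transfer_rate_gt * encoding * lanes / 8   # bits -> bytes
print(round(gb_per_s, 1))    # ~15.8 GB/s each way, per 16-lane link
```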

AMD Radeon HD7990 Malta video ports

At the back, it's a normal dual-slot affair, with AMD neatly side-stepping the useless HDMI port that wouldn't be very comfortable here. They provide one dual-link DVI port and four Mini DisplayPorts. Right off the bat that's enough for triple-display Eyefinity or a 5x1 array of displays in portrait, which is what AMD reckons most people will be using this card for. It's worth pointing out that the Geforce GTX690 and the Nvidia Titan can only run four displays at once.

Specs-wise, default clocks are 950MHz for the core, and that gets boosted up to 1GHz by PowerTune. The 6GB of GDDR5 RAM sits pretty at 6GHz effective. Most reviewers managed to overclock their units by a decent degree, averaging a 100MHz improvement on the core clocks and 400MHz on the memory.

Absolute power corrupts…

This dual-GPU beast is fed by the 75W of power available from the PCI-Express slot on the motherboard and the two 8-pin PEG power connectors on the top of the board, for a total of 375W of available power. This board was never meant to be an efficient machine, but that's how it ended up. A single HD7970 can consume up to 250W of power on its own. This beast, through power tweaks and some engineering black magic, uses around 350W on average in games. That's impressive next to the GTX690, which has a peak power draw of 300W, but there's a downside.

PowerTune enhancements and PowerTune DPM boost states

To the left is the current implementation of AMD's PowerTune technology, used in cards like the HD7970, HD7950 and HD7870 GHz Editions as well as the HD7990. To the right are the boost states that the Radeon HD7790 from the Bonaire family cycles through in its implementation of PowerTune, giving the card more flexibility to increase its clock speed based on how far it is from its thermal and power limits.

To its detriment, the HD7990 uses Tahiti XT cores and is still stuck with the old boost behaviour. In most cases it'll sit in the boosted state with higher voltages only while the card isn't drawing too much power in a game, or it'll be stuck in the high state at default clocks when too much power is being drawn and there's no headroom left. Sometimes the PowerTune software gets confused and jumps up and down even in games where it could safely stay in the boost state for longer. Drivers will be able to help this a bit, but Bonaire is AMD's real vision of the future.
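As a very rough illustration of why that coarse, two-state behaviour bounces around, here's a simplified sketch. This is not AMD's actual PowerTune algorithm; the clock values come from the published specs, but the power threshold and decision logic are invented purely to show the see-saw effect.

```python
# Simplified two-state boost illustration; the real PowerTune firmware is far
# more granular than this. Threshold and sample power draws are made up.
HIGH_STATE_MHZ = 950     # default core clock
BOOST_STATE_MHZ = 1000   # boosted core clock
POWER_LIMIT_W = 375      # assumed board power budget

def pick_state(estimated_draw_w):
    """Return a clock state based on estimated board power draw (hypothetical)."""
    if estimated_draw_w < POWER_LIMIT_W * 0.9:   # headroom left: boost
        return BOOST_STATE_MHZ
    return HIGH_STATE_MHZ                        # near the limit: back off

# A game hovering around the threshold flips between states over and over,
# which is the "jumping up and down" described above.
for draw in (330, 340, 336, 342, 334, 345):
    print(f"{draw}W -> {pick_state(draw)}MHz")
```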

Frame latency is today’s key word

As I've said many times, forget FPS averages. They don't matter this year because the industry is beginning to move away from all that. People are beginning to realise that merely pushing out new frames as and when they are done isn't the solution. We're capable of producing incredibly smooth gameplay, but that is an objective that needs to be tackled by three different sectors – the drivers, the API and the game engine.

How games work

Modern frame capture tools use the T_display data to figure out where things have changed from the FRAPS data.

For example, you might play Crysis 3 on a card that you were told would guarantee you 60fps without VSYNC, but in the course of the game you realise, by keeping an eye on FRAPS in the corner of your screen, that performance sometimes dips to 35fps. That's fine if you've just arrived in "Welcome to the Jungle", the game's most impressive level. But you notice that while you're playing, even at 60fps, there's a little bit of stuttering on the screen and some frames seem choppy. That's a result of frames simply being delivered to the screen without any consistency. Some are delivered faster, others slower, hence the graphical issues that can ruin your experience.
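To see why an FPS average hides that kind of choppiness, here's a minimal sketch. The frame times below are invented for illustration, not taken from any real FRAPS capture.

```python
# Two hypothetical one-second runs, both averaging ~60fps, expressed as
# per-frame render times in milliseconds. The second one see-saws badly.
smooth = [16.7] * 60
choppy = [8.0, 25.4] * 30   # alternating fast/slow frame delivery

for name, times in (("smooth", smooth), ("choppy", choppy)):
    avg_fps = 1000 / (sum(times) / len(times))
    print(f"{name}: avg {avg_fps:.0f}fps, worst frame {max(times):.1f}ms")

# Both runs report ~60fps on average, but the choppy one regularly spends
# 25ms on a frame: exactly the inconsistency a simple FPS counter glosses over.
```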

FCAT Analysis

The colours on the side show how frame capture tools identify where frames begin and end as they’re being fed to the screen.

It is that latency – the time taken for frames to be delivered in a smooth, consistent manner – that people and reviewers the world over are now measuring. The graphs below are from PC Perspective and I picked these because most other reviews do not tell the whole story. The graphs show results for the HD7990 at two resolutions – 2560 x 1440 and 5760 x 1080. Although the latter graph won't show the biggest driver improvements, the former one will, and you should take note of how things change.

The reason is simple – AMD has a very early alpha driver that focuses on fixing up frame latency for Crossfire. It fixes a number of anomalies like runt frames, frame tearing and visual artifacts that in the past would have made playing the game mildly to hugely annoying. It's only due out in June, but golly, if they don't have it available in less than two weeks they might have an issue on their hands.
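To make the runt-frame idea more concrete, here's a small, hypothetical sketch of what an FCAT-style analysis does with those coloured overlay bars. The scanline counts and the 20-line cutoff are invented for illustration; they aren't figures from Nvidia's actual tool.

```python
# FCAT-style idea: every rendered frame gets a coloured bar drawn into the
# overlay, so counting how many scanlines of captured output each colour
# occupies tells you how much of the screen that frame actually contributed.
# These scanline counts (out of 1080 lines) are made up for illustration.
scanlines_per_frame = [540, 530, 12, 538, 8, 552]

RUNT_CUTOFF = 20   # frames occupying fewer lines than this count as "runts"

runts = [n for n in scanlines_per_frame if n < RUNT_CUTOFF]
visible = len(scanlines_per_frame) - len(runts)

print(f"{len(runts)} runt frames; only {visible} frames meaningfully displayed")
# A FRAPS counter would still have counted all six frames, inflating the FPS figure.
```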

Battlefield 3

Battlefield 3 FRAPS FPS and frame time plots at 2560 x 1440 and 5760 x 1080

At 1440p with the raw FRAPS data, it appears that Battlefield 3 runs pretty well on all the cards. It shows the HD7990 beating the GTX690 in terms of raw power, with a large advantage over the similarly priced Nvidia Titan. However, PC Perspective also checks how the frames arrive at the monitor after going through several legs of software optimisation. The orange cloud in the observed 1440p graph is the result of frames being rendered and displayed either too quickly or too slowly. Notice the black line, though – that's the HD7990 with the prototype driver installed and, as you can see, it's generally running quicker than the Titan or the GTX690. That's a massive change.

The multi-monitor results are here to give you an idea of the extreme scale of the latency plaguing Crossfire. Both FRAPS and PC Per's observed results, captured using Nvidia's FCAT frame recording technology, show the HD7990 doing much worse than the Titan, let alone the GTX690. If AMD can get the prototype driver up and running for Eyefinity resolutions, they will be right up against the blue line of the GTX690, possibly even ahead of it thanks to the better hardware. With frame metering improvements to the drivers, Battlefield 3 in Eyefinity without VSYNC will be butter smooth.

Crysis 3

Crysis 3 FRAPS FPS and frame time plots at 2560 x 1440 and 5760 x 1080

If I were an HD7990 owner right now, I'd be crying more tears than if I were cutting onions. Without VSYNC capping your frame rates, you'd experience mountains of latency issues. It is hugely different to what Nvidia offers; the Titan in particular composes itself very well, with the GTX690 stuttering majorly in two areas. However, with the alpha driver in place, the Malta-flavoured card pulls ahead in a lot of places and makes a massive difference. Just remember, the latency spikes to the bottom of the graph don't imply that the card is far more powerful than its competitors – those are just frames that were queued up behind a delayed frame and then pushed out almost instantly after it, creating a see-saw motion in how the card's frame times behave over time with the current drivers.

Remember, many people have complained over the years that without Vsync, Crossfire felt broken in terms of smoothness. All those years ago, no-one had a means of figuring out why this was the case. Nvidia's Geforce 400-series and earlier were prone to it as well, and when they caught it internally, they began tailoring their driver improvements around it.

The multi-monitor scores are just as bad. The Titan appears to be smooth, but hovering between a frame latency of 30 and 40ms roughly equals a frame rate of 33 to 25fps. That's not actually that great, and the GTX690 doesn't do any better. The problem is the bus size and the memory on the Kepler cards. Once AMD gets frame metering sorted for Eyefinity, I expect the HD7990 to have a commanding lead over both cards.
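If you want to do that conversion yourself, frame time and frame rate are simply reciprocals of one another. A one-liner sketch:

```python
# Convert a frame time in milliseconds into an equivalent frame rate.
def ms_to_fps(frame_time_ms):
    return 1000 / frame_time_ms

for ms in (16.7, 20, 30, 40, 45):
    print(f"{ms}ms -> {ms_to_fps(ms):.0f}fps")   # ~60, 50, 33, 25, 22
```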

DiRT 3

DiRT 3 stutter, FRAPS FPS and frame time plots at 2560 x 1440

PC Perspective couldn't get DiRT 3 running at multi-monitor resolution (they never mention why, strangely), so for the moment this is all they have. DiRT Showdown would have been more indicative of real performance in Codemasters' latest games, since it takes advantage of things only Radeon cards can do reliably, like global illumination. Nevertheless, the results are interesting. FRAPS records the HD7990 with both driver versions as significantly faster than the GTX690 and the Nvidia Titan at spitting out frames. However, that doesn't correlate with PC Per's captured frame data, which shows that something in the drivers is messing with performance – note in particular the two huge orange clouds in the benchmark.

If you're wondering by this point whether this is a once-off thing, it's not. PC Perspective's benchmark takes place over 60 seconds and, in the testing of their frame analysis tools, it was found that this pretty much correlates with the overall gameplay experience. So every sixty seconds, you'd be guaranteed two major fps dips. However, this needs to be kept in proportion. The dips never push frame times past 15ms, and the card is still averaging over 120fps. If you enable Vsync, those issues go away.

In any case, the results from the prototype driver are telling – everything is mostly cleaned up and performance, on average, is higher than the GTX690's. The way the frame times vary in the third graph looks really bad, but consider again that the fps average is higher than 120. Performance here ends up higher than most people seemed to anticipate.

Far Cry 3

Far Cry 3 FRAPS FPS and frame time plots at 2560 x 1440 and 5760 x 1080

I once remarked on how Far Cry 3 needed a beefcake PC to run at Ultra settings, and I'm still right. The game makes a mess of the Radeons in Crossfire and the HD7990 still has some issues with the prototype driver, although AMD says they're still working on it. The 1440p results are impressive, cutting down frame latency in a big way and bringing the HD7990 closer to the GTX690 in performance. For some context, a 30ms frame time equates to around 33fps, while 20ms is closer to 50fps. Even on the GTX690, there are frequent frame drops below the 30fps threshold.

The multi-monitor results are dreadful. Even the Nvidia Titan, a composed little green line, teeters at the edge of 20fps (around 45ms) in average latency. The dual-GPU solutions are unplayable, and keep in mind that the GTX690 already makes use of frame metering technology and there are no results for the prototype driver here.

The game really does kill your PC's performance to the same extent that Crysis did. You need a beefcake computer to run it at its highest settings above 1080p (which isn't very impressive, or hard to do, these days).

Sleeping Dogs

Sleeping Dogs FRAPS FPS, frame time and percentile plots at 2560 x 1440

The last gaming title I'll discuss (for now – updates may warrant more articles on the matter now that Crossfire is getting some attention), Sleeping Dogs has been a dual-GPU killer for AMD from day one… which is odd, considering they bundle it with their cards. Multi-monitor is also broken on Crossfire for the moment, so analysing those results is a moot point – you'd just see the same mangled mess as Battlefield 3.

But the results for the prototype driver are impressive. It cuts down frame times tremendously and it's an even better result than the Battlefield scores – the card is now consistently faster than the GTX690 and its frame variance isn't that high. The FRAPS data pretty much matches the outcome from using the prototype driver. But here's the thing – unless you're already running Crossfire, you probably have no idea how much of a difference frame rating makes to gameplay. PC Perspective demonstrates this below, although I'd really recommend you download their raw, un-Youtubed version to see the differences clearly.

[youtube]3_WCFPYAZ1g[/youtube]

Power consumption is competitive

HD7990 power consumption: idle, gaming and stress test

Power consumption is another major issue for most people, and the drawbacks of running Crossfire with the current HD7000 cards have mostly been about the power consumed. The HD7970 on its own can suck up nearly 240 watts of power. Nvidia's GTX Titan uses a similar amount of energy to the HD7970 GHz Edition, which it greatly outpaces. Nvidia is simply more efficient with Kepler at this point in time and the GTX690's power usage is just insanely good.

But the Malta pulls ahead where it needed to in Anandtech's measurements. Idle power consumption is on par with the GTX690, because one of the cores is deactivated in low-power scenarios where the GPU's muscle isn't needed. The extra circuitry and the PEX bridging chip are probably responsible for the increase over a pair of HD7970 cards in Crossfire. Playing Battlefield 3, the HD7990 is only 70 watts or so above the Geforce, which isn't too bad considering it's chopping 100W off a proper Crossfire pair.

The Furmark test is about the same kind of load Bitcoin mining would induce, and that's actually lower than the gaming results. It's now chopping well over 100W from what was previously expected of a Crossfire pair of GHz Edition cards or PowerColor's HD7990. That's extremely impressive.

Does Malta change anything, really?

AMD Radeon HD7990 Malta

Taking into account the current drivers, the game bundle and the low power consumption, it's a bargain if you're one to play on a 1440p monitor with Vsync enabled. Average framerate statistics from many reviews put it above 60fps, and once you enable Vsync many of the frame time issues do go away, although there will still be stutters and a few frame drops because the HD7990 ships with beta drivers.

But Vsync is only part of the answer and it’s likely that the only people picking something like Malta up will be AMD fans, people upgrading from something like the Radeon HD4870 X2 (yes, it still kicks ass) or those of you seeking the best value bundle. Selling all eight games for R200 each (Deus Ex at R100, because that’s what it costs at other places) means that you could potentially put R1500 back into your pocket. But game bundles aren’t this card’s real value.

Many online reviews understate the importance of drivers at a new launch, and what the prototype driver AMD is working on could do to change their recommendations. Assuming you buy the card and get AMD to give you a copy of the alpha code they're working on, you've already got the fastest dual-GPU solution on the planet in terms of hardware, and the software is just about catching up.

AMD's value has always been in long-term support. I have a friend still on the Radeon HD5870. He has had zero reason to upgrade yet, and he can't afford an HD7970 anyway. That's a card over three years old with 2GB of GDDR5 RAM and a 256-bit bus. The phenomenal driver support has just allowed it to keep on trucking, and this is true of Nvidia as well.

But the HD7990 is theoretically faster than any other card on the planet. DICE chose two of them for a demo running on alpha code, and you saw how good it was. AMD is finally catching up on frame latency optimisations, and with frame metering software just five months old already matching Nvidia's implementation – which had a two-year head start – they're probably going to pull ahead in the performance department again this year. The GTX690 reaches its performance peak much sooner, and it was already the world's best dual-GPU card available.

Even owners of HD7990s from other brands will benefit, particularly the custom versions that ASUS, PowerColor and HIS make. The prototype driver, once included in the Catalyst 13.5 release, will just make them that much faster, and that's good value for money – double the perceived performance for a few hundred megabytes of driver install and a reboot? That's value.

For those of you still on the fence, I say buy it. Play it at whatever resolution you like, lower the details if necessary for more playable performance, and enable Vsync and triple buffering. You'll be tided over until the driver hits, whereupon you can open the floodgates and see what your hard-earned cash really bought you.

Too many reviews that went up yesterday were quick to denounce this card, and that's a bit unfair, although understandable considering the prototype driver was released quite late. It's the best dual-GPU product AMD has ever made and, like fine wine or Half-Life 2, it's only going to get better and rosier with age. If you want something to last you through to the 4K revolution, while playing every game you throw at it on maximum settings, this card should be your first choice at $1000 (approximately R9200).

And it’ll be the one to beat.

Read the reviews: PC Perspective, Tom’s Hardware, TechpowerUp!, Hardware Canucks

Discuss this in the forums: Linky
