A little over six months ago, NVIDIA launched the GTX400 series in the hope of grabbing market share away from AMD, and in the same breath satisfying the millions of gamers around the world who had waited for a replacement for the GTS and GTX200 series.

Initial reviews of the top-end GTX480 suggested that NVIDIA could have done a lot better on many fronts, and its heat and power draw were enough to keep some fans on the fence, hoping for a revision that would produce better results. Some waited a little longer and bought the much cheaper GTX470, and others took a leap of faith (and probably regret it now) by buying into the GTX465’s promises – most of which turned out to be lies. It wasn’t worth the heat, the power draw, the performance, or the price you paid for it – you could have done a lot better with a cheaper AMD HD5830.

The mid-range GTX460 then hit the market, and it was immediately apparent that NVIDIA had hit the right spot with its fans. Lower power draw, excellent value for money and some insane overclocks helped the company back into the limelight, and foreshadowed better mid-range cards that could easily give AMD a run for its money. The GTS450 continued the trend, providing a worthy rival to the HD5750. And then, the horror: the HD6000 Barts series launched less than four weeks ago.

NVIDIA really had nothing to counter with at the time, and kept mum on any developments that could have scored it points against the HD5870 and HD5850 price drops, the superlative value for money of the HD6870, and the rather spiffy-looking AMD Vision branding on the new boxes. I know that last point might seem trivial, but consider this: NVIDIA doesn’t sell CPUs, and Intel doesn’t sell discrete graphics cards. AMD sells both, and its stuff is pretty cheap. And it has better heat sinks to boot (I know it’s a low blow, sue me).

You’ve got to hand it to NVIDIA, and to Jen-Hsun Huang as well; the spiky-haired CEO recently said that they wouldn’t have released the GF110 chip unless they were 100% certain it would perform well. Last week the brand-new GeForce GTX580 was released, and damn. It rocks.

On GF100, enabling all 512 CUDA cores would have produced enough heat to cook a damn steak, never mind an egg. In the GTX580 they’ve all been tamed and re-arranged to give the best performance-per-watt available. There’s an extra streaming multiprocessor’s worth of CUDA cores (32 more than the GTX480’s 480), as well as higher core and memory clocks all round. The specs are actually quite close to the rumours that were doing the rounds before the GTX480 was released, and overall things are looking quite healthy.

For R5,500-odd, there’s a lot of value in the package, and it can even best the previous single-card performance king, the Radeon HD5970. But while the HD5970 is essentially a dual-GPU HD5850, the GTX580 gets the job done with a single GPU. Although no dual-GPU card is thought to be on the horizon just yet, two GF110 chips would probably rip a hole of awesome in the universe.

This card could possibly be the best thing to happen to you yet, but let’s take a few things into account before you jump ship:

1) NVIDIA supports triple-display gaming, but only if you run two cards in SLI. Its reasoning is that gamers will always get the best, lag-free experience the technology can offer. ATI needs only a single card (provided one of the displays hangs off a DisplayPort output), and you can add a second card for the boost needed to eat through those monster multi-display resolutions.

2) You need a seriously strong power supply (at least 650W) to keep one of these babies running, and anything less is asking for trouble. The total power draw of a decent quad-core i7 rig with a GTX580 will approach 440W at the wall (see the rough sketch after this list). It’s important to get rid of the heat quickly too, so a decent tower with at least one intake fan in the front and one exhaust at the back is a must.

3) It’s expensive. Even if it’s R1,000 cheaper on average than most HD5970 cards, R5,500 is a lot of money, and chances are you’ll never tap most of the card’s potential. If all it will do is play games, reconsider cheaper options that still deliver the performance you need before blowing that much cash. A GTX580 driving anything smaller than a native 30-inch screen is going to be overkill.
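On that power-supply point: here’s a quick back-of-the-envelope sketch (in Python, just to show the working) of why 650W is a sane minimum. The 82% efficiency figure and the 60% load rule of thumb are my assumptions, not measurements.

    # Rough PSU-headroom maths for point 2. The efficiency figure and
    # the load rule of thumb are illustrative assumptions.

    WALL_DRAW_W = 440      # approximate full-load draw at the wall (quad-core i7 + GTX580)
    PSU_EFFICIENCY = 0.82  # assumed efficiency of a decent mid-range unit

    # The PSU only delivers part of the wall draw to the components.
    dc_load_w = WALL_DRAW_W * PSU_EFFICIENCY  # ~361 W actually reaching the parts

    # Rule of thumb: keep a PSU at or below ~60% of its rating for
    # headroom, longevity and quieter fans.
    recommended_rating_w = dc_load_w / 0.6    # ~601 W

    print(f"Estimated DC load: {dc_load_w:.0f} W")
    print(f"Recommended PSU rating: {recommended_rating_w:.0f} W or more")

Round that up to the nearest common rating and you land right on 650W.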

There you have it, boys and girls; NVIDIA’s GTX580 is the new performance king. If anyone disputes this, I beg them to do a price/performance check against an HD5970. Is the Radeon R1,000 more expensive? Yes. Is it R1,000 faster? No. But then again, two HD6870 cards beat the hell out of a lot of things (GTX580 included), so I’d rather everyone followed that route 😉
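If you want to run that check yourself, here’s a trivial sketch. The prices come from the street prices quoted above; the frame rates are placeholders, not benchmark results, so swap in numbers from reviews you trust.

    # Toy price/performance check. The avg_fps values are placeholders --
    # substitute benchmark figures from reviews you trust.

    cards = {
        "GTX580": {"price_rand": 5500, "avg_fps": 100.0},  # placeholder FPS
        "HD5970": {"price_rand": 6500, "avg_fps": 105.0},  # placeholder FPS
    }

    for name, card in cards.items():
        rand_per_fps = card["price_rand"] / card["avg_fps"]
        print(f"{name}: R{rand_per_fps:.0f} per average frame per second")

The card with the lower rand-per-frame figure is the better buy, whatever the absolute numbers turn out to be.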

As for NVIDIA, it deserves a giant pat on the back for doing so bloody well. We all knew GF110 would succeed; we just had no clue how good it actually was. Don’t rest on your laurels just yet, though – the Radeon HD6980 is right around the corner, and will kick ass.