Stop thinking about the end of the world or today’s competition for a moment. I’d like to transport you back to November 2007, when everyone was either running XP or Vista, Windows 7 was still deep in pre-beta and the word “Ultrabook” hadn’t even been invented yet. That was the month Crysis was released, and ever since, Crytek’s titles have been held up as a standard that will always push hardware to its very limit. Crysis was insanely taxing on then-modern systems and only last year did we get to the stage where playing it on Ultra (comfortably) was possible with most mid-range cards. Far Cry was hard on most systems too, but it didn’t have the same oomph-sucking ability Crysis did, even though its engine was the direct predecessor of Crysis’. Over the years we’ve had Far Cry 2 (Dunia engine, based on Crytek’s technology), Crysis Warhead (CryEngine 2) and Crysis 2 (CryEngine 3) pushing the boundaries of what’s possible on a mix of different hardware and platforms. This year, Ubisoft returns to the throne with Far Cry 3 – once again returning to an island paradise and, once again, bringing even systems costing R15,000 or more down to earth to eat humble pie. It’s running the new Dunia 2.0 engine and it looks spectacular.

Tom’s Hardware went through the tedious task of benchmarking the game on a variety of hardware, and the result is that while it’s surprisingly forgiving on low to medium settings at resolutions below 1920 x 1080, pump that up to 1080p at medium settings and it starts exerting a stranglehold on even the beefiest GPUs and modern processors. Ultra? Let’s just say it’s closer to the original rig-strangling Crysis than I’d care to admit. Some comparisons will also be drawn with Just Cause 2, because it also features a sprawling island paradise with a huge amount of detail and the freedom to roam wherever the map extends. Shall we look at the results?

Firstly, a couple of things to note. In that picture there are several million things going on all at once. There are realistic reflections on the water’s surface, as well as wave formations and an ebb and flow at the water’s edge. See those clouds? They’re constantly changing and moving in real-time. There’s a variety of shadows to calculate and some very convincing volumetric light rays at work here. It’s a beautiful game and easily comparable to the visuals of the first Crysis or Far Cry. Who remembers the first sunrise in Crysis? Yeah, I thought so – it’s an unforgettable moment. And let’s not forget what’s happening in areas outside your character’s field of vision – maybe there’s a shark swimming nearby, perhaps there’s a tiger for you to punch, or maybe a few scary men with guns behind you – that’s all still going on, again, in real-time. So you can imagine, just like GTA IV and its sprawling city of inhabitants that do their own thing, that this requires some beefy hardware to keep going at a smooth pace. And you’d be right.


At 720p and low settings, you’ll be fine with most low-end GPUs. AMD’s Radeon HD6670 is the lowest bet you can make and still play the game properly – it handles the action and the frame drops very well, staying far above 30fps across the entire benchmark. Note how its graph corresponds closely with that of the GT630 – our first hint that this game is GPU-limited. For truly fluid gameplay at these settings, start with the Geforce GTX650.

But what about those of you using an APU? Well, the A10-5800K pairs up with the HD6670 and generally delivers around 10% more performance thanks to the Hybrid Crossfire pairing. On its own, the A10’s Radeon HD7660D is roughly equivalent to the Geforce GT440, so it might hover at just over 30fps on average, peaking at around 34fps. Since I don’t have data to support those numbers, though, that’s just an educated guess for now. But if you’re using the cheaper A8-5600K, the A10-5700 or even the lowly A6-5400K, you might struggle a little to achieve playable frame rates.

What about Llano APUs? Your guess is as good as mine, honestly. If you equate the chips to their Trinity equivalents you might see similar performance, but the gains from moving from VLIW5 to VLIW4 depend on the game and the engine in use. The only card here displaying really strange behaviour is the HD7770 – that frame drop might be something for AMD’s driver team to look at, as it’s likely a side-effect of the latest Catalyst 12.11 beta drivers. It’s worth noting that this doesn’t affect the card’s performance at medium settings nearly as much.


Now here is where it gets interesting – and this is just at medium detail and quality settings. Since almost no-one games at 1680 x 1050, skip straight to 1080p and you’ll see what I mean. Yes, you’re reading that graph right – the HD7870 and the GTX660 deliver almost exactly the same performance. That can be read both ways: either the Geforce is just good enough to match the Radeon, or, even with all the collaboration with AMD for a game that ships free in the Never Settle bundle, it still runs well on unoptimised Nvidia drivers. This raises the question of which will be the better pick, at least in the short term. If you don’t have the game yet, the HD7870 is great because it comes with a free copy right off the bat. If you already own the game, the GTX660 might be the better value proposition, because it still has driver optimisations to come and there’ll surely be a raft of improvements from both companies, as well as multiple price drops. We’ll have to wait for The Tech Report’s frame latency data to accurately pick a winner, but judging by how much stutter they’re seeing with the current beta drivers in their piece, “Does the Radeon HD 7950 stumble in Windows 8?”, it won’t look very inviting in the short term. Perhaps dropping back to the 12.10 drivers would improve things a bit.

In reality, though, this game actually scales pretty well to price points. Go on, look them up – the GTX660 and the HD7870 are priced to compete against each other, and here they’re neck-and-neck. If you can’t afford either, the HD7850 fills the gap nicely between the GTX650 Ti and the GTX660. If you need a new GPU, don’t care about the settings and want the game as well, the HD7770 is your choice – just make sure to set things to Low detail. What’s really interesting is that there’s very little performance hit between 1680 x 1050 and 1080p. If you want that little extra boost, rather play at the lower resolution – I feel it’s a better compromise than 1080p at low settings.


And here I’m a little perplexed. There’s this little gap between Medium and Ultra settings called “High” and I’m just not seeing any results for it – perhaps the testing team didn’t have enough time to get them up. Let’s look at the Ultra results first, though. Right off the bat, you can see that the only single card offering playable performance at 1080p is the GTX670. The HD7950 might offer passably playable performance too – I consider anything above 30fps fluid enough to maintain some suspension of disbelief, so long as the frame latency isn’t too high and there’s no stuttering. In any case, the minor differences between each card’s performance are weird. It isn’t a CPU limitation, since every card is only marginally better than the competitor just below it, which might point to a VRAM limit instead.

You can see that both SLI and Crossfire improve the situation, but that’s using two mid-range cards limited, on most variants, to 2GB of GDDR5. Even with the 3GB buffer on the Radeon HD7970, and larger-memory versions of the Geforce GTX670, there’s still a significant performance hit on Ultra. Things don’t improve when using three 1080p monitors, as you can see – everything’s unplayable. Yes, Far Cry 3 has the same insane requirements as the original Crysis. Soon we’ll hit that tag line: “Can it run Far Cry 3?” (meh, doesn’t have the same ring, does it?). If you’re an Eyefinity or Nvidia Surround runner, you’ll have to play the game at medium settings – it’s just that heavy.


Take a look at the HD7770 results in the 1080p Low and Medium benchmarks. While it’s not an exact science, dropping the settings from Medium to Low translates into roughly an extra 25 frames per second for both the minimum and maximum frame rates. Look at the latency graphs again – they’re very similar, which suggests this behaviour should scale all the way up.
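As a quick aside on why those latency graphs matter: frame rate and frame latency are just inverses of one another, so an average or minimum fps figure can hide how large individual frame-time spikes are. A minimal sketch of the conversion (the sample figures below are my own, for illustration – not Tom’s data):

```python
# Frame rate vs. frame latency: the two are simple inversions of each other.
# A steady 30 fps means each frame takes ~33.3 ms; a momentary dip to 20 fps
# is a 50 ms spike that you feel as stutter even if the average fps looks fine.

def fps_to_frame_time_ms(fps: float) -> float:
    """Convert a frame rate into per-frame latency in milliseconds."""
    return 1000.0 / fps

# Illustrative sample rates only:
for fps in (60, 40, 30, 20):
    print(f"{fps:3d} fps -> {fps_to_frame_time_ms(fps):5.1f} ms per frame")
```

This is exactly why frame-time (latency) graphs expose stuttering that a plain fps average smooths over.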

That means those of you playing on Ultra should benefit handsomely from a drop to High settings. Netting around 25 more frames per second would suddenly make the HD7870’s performance acceptable. It would also make the HD7970 the closest contender for the least stuttering, because you could leave V-Sync off and things should be fine. The GTX670, assuming Nvidia’s Adaptive V-Sync actually works, would provide the same experience, but would sync up better with your monitor.


Yeah, I’m pretty sure. All of the modern CPUs Tom’s tested deliver enough performance to be playable at medium settings with a HD7970. The FX-4170 represents the worst-case scenario, with the dual-module, quad-core CPU tripping over itself pretty badly and dropping nearly twenty frames per second between its minimum and maximum frame rates. Strangely enough, the Core i3-2100 doesn’t do nearly as badly, suggesting that Intel’s Hyper-Threading works almost as well here as a true quad-core.

AMD’s FX-8350 doesn’t better any of the Intel chips, but there’s a flip-side to the results: it’s not that the chip is significantly weaker – it’s actually turning in reasonably comparable performance. The game doesn’t scale with more cores or higher frequencies, so it isn’t so much integer-reliant as fairly dependent on good single-threaded performance, which is why the minimum frame rate in the Intel camp is the same 61fps across the board. There might be a floating-point workload mixed in there somewhere as well, since the FX-4170 only has one FP unit sandwiched into each module and shared by two cores.

What’s disconcerting, though, is that there isn’t more “oomph”. At these lowered settings the highest frame rate should be reaching above 80fps, but it’s not. Are these CPUs just running out of juice, even at these lower settings? I think more testing should be done at 720p, because these results could still be GPU-limited.


Well, I will, for starters. I run a triple-core AMD Athlon X3 at 3.0GHz and a Radeon HD6870 1GB. Since the HD7770 produces comparable performance to the outgoing HD6850, I’ll be limited to running the game at 1080p on medium settings – or low settings at the same resolution – if I want to avoid looking at another slideshow like I did in 2008. Those of you running a dual-core chip or an older-model Athlon or Phenom will likely see a benefit in moving to a more modern platform.

If you’re already using an Intel quad-core like the Core i5-3330, you’ll be able to more or less match any of these results using the same GPUs. Pairing it with a HD7870 2GB would allow you to play the game at comfortable frame rates on high settings. If you’re on an APU, you can expect the same kind of performance as the FX-4170 demonstrated, only a little better. Sadly, not even a Phenom II X4 955 can turn in enough performance to make a difference, even though it has four proper cores. An X6 chip? It probably wouldn’t do any better than the FX-4170, if it even gets to that level.

The bottom line, I guess, is that if you want playable performance in this game, you have two options: get beefier hardware, or play it on console-like settings – or on a console itself. In that respect, Far Cry 3 is the Crysis of this decade, and I wouldn’t be surprised if it spurs along some upgrade plans in the coming months. It’s a lot more forgiving than its predecessors at lower settings, but at Ultra it simply crushes single-card configurations like a NASCAR fan crushes beer cans with his head.

Source: Tom’s Hardware: Far Cry 3 Performance, Benchmarked 

