It’s something that all gamers aspire to, a performance metric by which all rigs are measured and marked. Since the early days of film, 24 frames per second has been the standard speed at which movies and other video media are shot to keep movement fluid and create the illusion of reality. To this day, for games and interactive media, the frame rate considered the lowest playable minimum is still 35fps. But the rate that feels most comfortable and playable changes between game types, and it all depends on the player’s preferences, their hardware and the game settings they use.

Crysis running at 122fps with lots of things turned off. Ug-ly!

For example, for years PC gamers have aimed for a minimum framerate of 35fps because they know that when they get into a particularly intense firefight in Battlefield 3, or get knocked about by a trio of rivals in DiRT 3, their rig’s performance won’t let them down. In competitive environments such as LANs, many gamers turn down their visual quality settings to achieve better performance and gain an edge over their rivals. But what if you’re not the kind of player who goes to this sort of event? What if you’re looking for a rig that will last you ages and still give you playable performance on Ultra settings three years down the line? I ask you all to turn your attention to consoles for the moment.

Consider the PlayStation 3, now in its fifth year of service and still playing titles like Crysis 2 at 720p, or an adventure title like Uncharted 3, showing off visuals that easily match those seen on the PC. The common feature among these games and similar titles on the Xbox 360 is that they all run at 720p. Very few games cater for the full 1080p resolution, because they have to be so carefully optimised to take advantage of the 256MB of video RAM on the ageing, customised GeForce 7800-class GPU inside the PS3. The Xbox 360 suffers the same kind of problem with its customised ATI Xenos graphics core, roughly comparable to a 512MB Radeon HD 2900 Pro. Both consoles deliver similar performance in games, but you’ll also find that a lot of games actually run at a 60fps average, or 30fps for titles that are more graphically demanding. It’s pretty cool to know that the hardware inside still kicks ass years later, and it’s this kind of longevity that PC owners aspire to. So how can you do the same with your hardware?

Setting a Lower Resolution

In many titles there’s a pretty big jump in visuals when moving from 720p to 1080p: things look crisper and clearer and there’s more fine detail to notice. However, there’s a corresponding drop in frame rates. How big that drop is depends on both the game’s engine and how well your hardware scales up to higher resolutions. In practice, if you have a midrange card that plays games at 720p on High settings, you’ll have to scale down to Medium settings at 1080p to enjoy the same kind of performance. So if you’re gearing towards 1080p at medium-to-high settings, keep in mind that 720p would look as good and bring better performance to the table.
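To put a number to that drop, here’s a quick back-of-the-envelope sketch – plain pixel arithmetic, nothing game- or engine-specific – showing how much extra work 1080p asks of your card:

```python
# Rough pixel-count comparison between 720p and 1080p. GPU fill and
# shading work scales roughly with pixels drawn, so this ratio is a
# decent first-order estimate of the extra load.
res_720p = 1280 * 720      # 921,600 pixels
res_1080p = 1920 * 1080    # 2,073,600 pixels

ratio = res_1080p / res_720p
print(f"1080p pushes {ratio:.2f}x the pixels of 720p")  # ~2.25x
```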

It’s not just the maximum framerate that matters – the average and minimum are just as important to consider. At 1080p you may find that the dips during heated firefights, in-game or online, drop your framerate below 30fps, causing stutter in many games. Dropping to 720p lifts your average and minimum framerates to a higher level, so rendering hiccups won’t compound the lag of a struggling internet connection.
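If you want to check those numbers yourself, the sketch below shows how average and minimum FPS fall out of a frame-time log. The frame_times_ms values here are made up for illustration – in practice you’d capture real ones with a tool like FRAPS:

```python
# Sketch: deriving average and minimum FPS from a list of frame times
# in milliseconds. The values below are invented for illustration.
frame_times_ms = [16.7, 16.9, 17.1, 33.5, 40.2, 16.8, 17.0, 35.9]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
min_fps = 1000 / max(frame_times_ms)  # the slowest frame sets the minimum

print(f"average: {avg_fps:.1f}fps, minimum: {min_fps:.1f}fps")
```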

Turning Down the Eye Candy

One of the chief reasons framerates drop is the increased visual fidelity we’re treated to by developers looking to create a more believable world. Higher-resolution textures, individual leaves and blades of grass, and realistic water all help create a better experience, but each feature you turn on drives performance down. But where to start?

In most games you can turn anti-aliasing off, or drop it to 2x, for better performance. Newer titles may support FXAA, a post-process filter that delivers smoothing close to that of 4x MSAA at a fraction of the performance cost. That’s one thing you can turn down. Additionally, you may want to turn off things like Nvidia PhysX, water effects and some texture options that you won’t miss. Many games also have a body count setting: keeping the corpses of enemies you’ve just fried on the playing field takes up extra memory as more NPC units and enemies arrive on the scene. Call of Duty, for example, plays a lot better with a low body count than a high one.
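To see why multi-sample AA costs so much, here’s a rough estimate of framebuffer memory at 1080p as the sample count climbs. Treat these as upper bounds, since real drivers compress these buffers; FXAA sidesteps the cost entirely because it works on the finished frame:

```python
# Rough framebuffer memory cost of MSAA at 1080p. Each extra sample
# stores its own colour and depth value, so memory grows linearly
# with the sample count.
width, height = 1920, 1080
bytes_per_sample = 4 + 4  # 32-bit colour + 32-bit depth

for samples in (1, 2, 4, 8):
    mb = width * height * samples * bytes_per_sample / (1024 ** 2)
    print(f"{samples}x: ~{mb:.0f}MB of framebuffer")
```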

Other things to consider are the mods you might have applied to the game – many Skyrim players have discovered a huge world of mods to enhance their Elder Scrolls experience. Very often these aren’t as optimised as developer-approved content, costing you performance in the long run, with the chance of some memory leaks too. The DirectX mode you play in also has an effect. Many titles today let you choose between running in DX9, DX10 or DX11, with a few requiring DX10 at the very least. You’re stuck for choice if you play Battlefield 3, which requires at least DX10 on the desktop, but many others will let you play on the DX9 runtime. I’ve got Metro 2033 running in DX9 at 1080p with high settings and 4x AA – smooth as butter, with very few dips into unplayable territory. While I know some people paid a premium for their DX11 cards, you’re not missing out on much by moving down to DX9.

Planning for a Specific Resolution

Tom’s Hardware’s System Builder’s Guide famously puts graphical performance over CPU muscle, choosing to optimise for games rather than productivity, even though the latter is still a strong point of all their machines. I tend to focus on all-round ability in my guides, since I consider productivity to be as important as gaming performance. That said, planning for a specific screen resolution is just as important as keeping in mind the settings you play at.

For example, let’s say you go out and get a bargain 19″ screen with a 1366 x 768 resolution. For playable performance in all games, you’d start off with AMD’s HD6670 1GB or Nvidia’s GT630 1GB. Both deliver playable 720p performance at medium settings. If you wanted to move up to 720p at high settings, I’d max the system out with either a Radeon HD6870 1GB or Nvidia’s GTX560. Both cards perform very well at 1080p with mostly high settings, so 720p at High or Ultra settings would yield average framerates above 60fps and minimums close to 40fps – perfectly playable for most titles.

Moving up to 20″ monitors with a 1600 x 900 resolution, I’d consider AMD’s Radeon HD6770 1GB and Nvidia’s GeForce GTX550Ti 1GB the minimum for playable performance at the native resolution. If you’re looking to bump things up for playable performance at high settings three years down the line, look at AMD’s HD7870 2GB and Nvidia’s to-be-released GTX660Ti 2GB – that’s the most I’d spend on a graphics card driving a 20″ monitor. Any more power than that would be wasted on it.

For 1080p screens, ranging from 22″ all the way to massive 27″ desktop monitors, I’d start off with AMD’s Radeon HD6870 1GB or Nvidia’s GTX560. They’d give you playable performance at High settings with low levels of AA, around the 50fps mark, but you’d have to keep an eye on the settings you use and the effect they have on performance. I’d top a 1080p-optimised system out with a single Radeon HD7950 3GB or Nvidia’s GTX670 2GB. Both cards can easily drive 30″ monitors, so they’re a no-brainer at 1080p – you could enable multiple levels of AA and Ultra settings and still land far above the 60fps mark we’re looking for.

Going up to 30″ monitors and their resolutions, things get a bit mad, so I’m not going to make recommendations here – there are too many ways to get the performance you’d want out of a system driving such a behemoth. Suffice to say that the fastest single-GPU cards and dual-GPU monsters would do well here, as would CrossFire and SLI setups using mainstream cards that excel at 1080p. You may encounter stuttering and frame drops in multi-GPU setups, but both AMD and Nvidia are working with game programmers to fix those issues through updated drivers and game patches.

Multi-monitor configurations should also be considered here, and I wouldn’t hazard gaming on such a setup with a card that only has 1GB of RAM. 2GB is the minimum, and the card must also have the processing muscle to deal with the huge resolutions. If you’re looking at running an Eyefinity or Nvidia Surround setup, the default cards I’d recommend at the cheaper price points are Nvidia’s GTX670 2GB and AMD’s HD7870 2GB. You’d be driving three 19″, 20″ or 1080p screens with these cards, but remember that not all visual features should be enabled, because performance drops hit far harder than with just one monitor to take into account.
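To get a feel for that load, the sketch below tallies the pixels a triple-1080p Eyefinity or Surround setup pushes versus a single screen (bezel compensation ignored):

```python
# Pixels a triple-monitor setup has to drive compared to one screen.
# Bezel compensation would add a little more; it's ignored here.
single = 1920 * 1080       # one 1080p screen
triple = 3 * single        # 5760 x 1080 side by side

print(f"single 1080p: {single:,} pixels")
print(f"triple 1080p: {triple:,} pixels ({triple / single:.0f}x the work)")
```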

Pushing the limits of games while enjoying a high level of performance is something we’d all like. Some people have to make do with the hardware they have, while others get to design their configuration around the games they play the most. Either way, that 60fps mark remains the benchmark everyone aims for. As more 120Hz screens enter the market it’ll become less of a goal, because we’ll have a new standard to try to max out. Until then, 60fps is the target for any gamer and competitive player, and that’s why it’s today’s Oldie But Goodie!

Discuss this in the forums: Linky
