The Chicken?

Although incredibly beautiful at its highest settings, there is hardly a system on this planet that can cope with playing at this level when the resolution is cranked up. For a quick benchmark I'll throw some vague numbers at you. Running Crysis on a PC consisting of a dual-core CPU, 2GB RAM and an nVidia 8800 Ultra (a £400 graphics card, bear in mind), with the in-game settings on High (one below the maximum setting) across the board, we were struggling to get playable framerates at a resolution of 1,280 x 1,024, and this was with no anti-aliasing. Compare this to BioShock, where we could probably play at 1,920 x 1,200 with the game settings maxed out on the same system, and you realise how demanding Crysis is.

In actual fact, playing around with the in-game settings reveals plenty of scope for getting much more respectable framerates, and I could dedicate half a dozen pages to going through the various settings for you. Of course, if that's what you would like then please feel free to let me know in the forums and I'll see what I can rustle up. For the time being, though, I'll just take a moment out from the key subject at hand and give you a few pointers.

The first place you should look to boost performance in Crysis is the post-processing setting. This adds effects like motion blur and depth of field (i.e. blurring everything that's closer or further away than the bit you're focussing on), which add to the immersion of the game but incur a huge performance hit. However, they don't affect the actual quality of the environment you're seeing, so, standing still and not using a scoped weapon, you'll not notice a difference between having the setting on full or turned all the way down. So, if you're looking to boost performance, start by turning post-processing to Low and then tweak all the other settings to get the best compromise between fidelity and performance. If you've got room to spare after everything else is up high then start re-enabling this setting.

All the other settings are more subtle, and moving from the highest setting to the lowest gives a fairly linear tail-off in quality. The main thing to know is that Medium is the absolute minimum you should aim for on any one setting, as the Low setting, for the most part, removes things altogether, giving you a horribly flat environment - think no shadows, a flat lifeless sea, static trees and foliage, etc.

The single most important setting after post-processing, though, is the shader quality setting. Dumbing it right down, this setting is what knits all the others together so the environment feels like a whole rather than a mishmash of individually beautiful elements. It really is a lot more complicated than that, but for now just know that keeping this setting on High or above while dropping everything else down low will look better than compromising on this one setting.
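If you'd rather apply these tweaks outside the game, Crysis will also pick up console variables from a config file in its root folder. As a rough sketch only - the cvar names and value ranges below are my best recollection of CryEngine's sys_spec system (1 = Low up to 4 = Very High), so do check them against the in-game console before relying on them:

```
-- autoexec.cfg (Crysis root folder) - names/values assumed, verify in the console
sys_spec_PostProcessing = 1
sys_spec_Shading = 3
```

The first line drops post-processing to Low, the second keeps shader quality pinned at High, which matches the priority order I've just described.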

With that brief interlude out of the way, let's return to the crux of my point - one that I've actually not really made yet. The thing is, people were caught out by Crysis, and they're understandably a bit miffed that their very expensive gaming PCs are now struggling to play perhaps the one game that they've been waiting for. Fair enough, but isn't this what PC gaming has always been about? It's the old chicken and egg story. Which should come first: the super-fast hardware that has more power than we actually need, or the software that demands more of our systems? There's no right or wrong answer, just the knowledge that one will always follow the other.

Going back to Far Cry, it too was extremely demanding at launch, and I, for one, put off playing the game until several months after it was released so I could see it at its best and get the most enjoyment from it. So it is with Crysis. Just because our year-old, super-fast graphics cards (i.e. the nVidia 8800 GTX) have been able to deliver, arguably, more than we thought we'd ever need for so long, it doesn't mean we should expect that to last forever. Yes, Crysis is a very demanding game and to play it at its best we may have to upgrade, but with the next generation of graphics cards just around the corner, isn't that just what we've been waiting for?
