HDR

Thankfully for ATI, its hardware has supported 32-bit precision since the Radeon X1800 series launched back in October 2005, so this method of HDR rendering is available from both graphics manufacturers. However, there is a very important issue that needs to be mentioned – current nVidia hardware cannot run this type of HDR rendering concurrently with anti-aliasing. ATI stressed this point at the X1800 launch, though at the time it didn't really matter, as no games supported the combination – even with the special patch for Far Cry that enabled both HDR and FSAA, the frame rate was completely unplayable. And that is nVidia's argument: on current hardware, implementing proper 16-bit precision based HDR as well as FSAA results in an unplayable experience, and I have to say I'm inclined to agree.


But that doesn't change the fact that 3DMark06 gives you the option of running SM3.0 HDR with FSAA, and if you select these options on nVidia hardware, those tests fail and you end up with no 3DMark score at all. You can argue the pros and cons of Futuremark taking this approach, but ultimately, on nVidia hardware you have to choose between SM3.0 HDR and FSAA – you can't have both, and that's a fact.

What's also important to remember here is that even if your graphics card can happily render 5,000 gradients between total darkness and total brightness, a monitor with a contrast ratio of only 500:1 is only going to show 500 of those gradients, rounding the rest to the nearest step it can display.
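The rounding described above can be sketched numerically. This is a hypothetical illustration only – the 5,000 and 500 figures are the article's round numbers, not measured panel data, and real displays apply gamma curves rather than a straight linear mapping:

```python
def displayable_level(hdr_level: int, hdr_levels: int = 5000,
                      display_levels: int = 500) -> int:
    """Map one of `hdr_levels` rendered gradients onto the nearest
    of the `display_levels` steps the monitor can actually show."""
    return round(hdr_level * (display_levels - 1) / (hdr_levels - 1))

# Neighbouring rendered gradients collapse onto the same display step...
assert displayable_level(1000) == displayable_level(1004)

# ...and only 500 distinct output values survive out of 5,000 inputs.
assert len({displayable_level(i) for i in range(5000)}) == 500
```

In other words, roughly ten adjacent rendered gradients land on each step the monitor can reproduce, which is why the extra precision the card computes never reaches your eyes.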

I guess what I'm saying is that I admire Futuremark for implementing this type of gruelling test, but don't go thinking your 7800GTX is ready for the scrap heap just because it can't run it – that's far from the truth.
