ATI X800

Talking of demos, ATI showed off the X800 XT with a newly commissioned demo. Now, anyone who’s been keeping an eye on nVidia over the last year or so will have found it difficult to avoid Dawn – nVidia’s pixie-like beauty who’s used shamelessly to draw (predominantly male) crowds at trade shows and exhibitions. Not wanting to be outdone, ATI has come up with Ruby – a modern femme fatale whose action prowess is outdone only by her ability to wear minuscule, skin-tight outfits.

Ruby - the latest cyber marketing babe.

Male-targeted marketing aside, the Ruby demo did look pretty good considering it was running in real-time, and although we’re probably not going to see this kind of cinematic quality in a game quite yet, it’s encouraging to see how far real-time rendering has come over the past few years.

The real-time rendering in the Ruby demo is very impressive.

Physically the Radeon X800 looks pretty much identical to the Radeon 9800 XT. This is no bad thing, since it will obviously fit in almost any PC you care to throw it at. We probably will see a few custom designs, like we did from Asus with the 9800 XT, but most board vendors are likely to stick to the reference design.

As I’ve already mentioned, the latest-generation hardware from both ATI and nVidia looks pretty similar (clock speeds aside). But there is one major difference between the Radeon X800 and the GeForce 6800, and that’s Shader Model 3.0 support. This is where the marketing battle will be fought between ATI and nVidia. Obviously nVidia will be pushing Shader Model 3.0 support quite heavily since ATI doesn’t have it, but how big an issue is this really?

Well, there’s no denying that there are some interesting features in Shader Model 3.0 that could potentially make developers’ lives a little easier. For one, Pixel Shader 3.0 removes the 32-instruction limit, allowing for more complicated effects. Also, Vertex Shader 3.0 has the ability to create multiple instances of the same model with very little overhead, thus allowing more detailed environments to be created without too much performance degradation.
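For readers who like to see the idea in code, here’s a very loose sketch of what instancing buys you – plain Python standing in for the GPU, with every name here invented purely for illustration, not taken from any real API. Instead of the application submitting the same model over and over (one draw call per copy), it submits the mesh once along with a small list of per-instance transforms, and the hardware stamps out the copies:

```python
def transform(vertex, offset):
    # Apply a per-instance 2D offset – a stand-in for the world matrix
    # a real vertex shader would multiply each vertex by.
    return (vertex[0] + offset[0], vertex[1] + offset[1])

def draw_instanced(mesh, instance_data):
    # One "draw call" walks every instance. The per-instance data is
    # tiny compared with duplicating the whole mesh N times, which is
    # where the low overhead comes from.
    frame = []
    for offset in instance_data:
        frame.extend(transform(v, offset) for v in mesh)
    return frame

mesh = [(0, 0), (1, 0), (0, 1)]        # one triangle
offsets = [(0, 0), (5, 0), (10, 0)]    # three instances of it
print(len(draw_instanced(mesh, offsets)))  # prints 9
```

In a real engine the loop body runs on the graphics chip, so hundreds of trees or soldiers can share one mesh and one draw call – which is why the feature matters for building more detailed environments.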

As far as Pixel Shader 3.0 goes, ATI will argue that the same or very similar effects can be produced using Pixel Shader 2.0, and that’s probably true. Where it does become an issue is when a developer wants to use more instructions than PS 2.0 will allow. For me, though, the best argument is not whether the same effects can be produced using PS 2.0 and PS 3.0, but whether we’ll see many games using PS 3.0 to its full potential any time soon. It’s all very well having hardware that supports the latest technology, but if nothing makes use of that technology, is it worth having? Then again, the other side of the coin is that it’s better to have a feature and not need it than to need a feature and not have it.

The Shader Model 3.0 debate is going to be a long one, and only time will tell whether ATI or nVidia has got it right. Ultimately, it’s in the hands of the game developers, so we’ll just have to wait and see.
