Final Thoughts

It is clear from our tests that the current generation of DX10 hardware simply can't handle DX10 games. While you could argue the merits of one graphics card company against the other, the fact of the matter is that neither offers a compelling solution looking forward. However, it's still too early to sound the death knell for this current set of hardware, as the games we tested don't yet use DX10 to its full potential or, we hope, in the most efficient way. Once we've looked at the soon-to-be-released BioShock and World In Conflict and, further down the line, Crysis, we should have a better idea of what these cards are capable of. In the meantime, we can only hope there is some hidden performance locked away in them.

What is without doubt is that DX10 will eventually take over and bring with it some truly spectacular games; it may just take a while before we start to see them arrive.

Developers need to get to grips with the new tools at their disposal and learn the most efficient ways to use them. Meanwhile, hardware manufacturers will need to tweak their products so they can actually run these new features properly.


The soon-to-be-released World In Conflict is an amazing-looking RTS that uses DX10

Unfortunately, what will hold this development back is the need for games to be able to run in DX9 mode, at least for the foreseeable future, so that consumers who haven't upgraded their hardware can also play. After all, a developer wants as many people to buy its software as possible, so it isn't going to limit itself to just the DX10-capable audience. Indeed, with more and more games being released across multiple platforms, it's even less likely that DX10-only features will be implemented until there is a compelling reason to.

The trouble with this scenario is that hardware is generally faster when optimised for one task or another. Taking the scatter-gun approach of supporting both DX9 and DX10, not to mention the silicon used up by all the video decoding hardware on these new cards, means this latest generation, particularly in the mid-range, has failed to perform as hoped. But with no obvious alternatives, there is nothing more nVidia and ATI can do except throw more transistors at the equation and hope they find a cost-effective way to make their next-generation hardware a truly worthwhile prospect.
