Is it worth the upgrade?

So, that's why we find ourselves in this situation today. DX10 introduces a huge change in the way hardware communicates with the operating system, and consequently in the way games are developed, so uptake of the new platform has been slow and there are fewer than a handful of games available to show off this supposed massive leap forward. That doesn't mean DX10 won't be worth the wait, though: for all its problems, it looks set to revolutionise the way games look in the coming years.

By redesigning the API from the ground up, Microsoft have removed a lot of the overheads and limitations of the old way of doing things so, for instance, more unique objects can be rendered on screen at any one time. Whereas previously a huge army of Orcs would have shared many structurally identical features, cleverly tweaked to give the illusion of variety, with DX10 it will be much easier to create hundreds of truly individual Orcs, or whatever else the artist is trying to create.
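
To give a flavour of how this works in practice, below is a minimal Shader Model 4.0 vertex shader sketch of instancing, the technique behind those individual Orcs. It's illustrative rather than code from any shipping game: the INSTANCE_* fields and names are invented for the example, and the application would bind the shared Orc mesh plus a small per-instance buffer, then render the entire army with a single DrawInstanced call.

```hlsl
cbuffer PerFrame
{
    float4x4 viewProj;
};

struct VSIn
{
    // Shared Orc mesh, identical for every instance:
    float3 pos       : POSITION;
    float3 normal    : NORMAL;
    // Per-instance data, advanced once per Orc by the input layout:
    float3 instPos   : INSTANCE_POSITION;
    float3 instTint  : INSTANCE_TINT;
    float  instScale : INSTANCE_SCALE;
    // Free per-instance counter provided by the hardware
    // (not used here, but handy for things like animation offsets):
    uint   id        : SV_InstanceID;
};

struct VSOut
{
    float4 posH : SV_Position;
    float3 tint : COLOR0;
};

VSOut VS(VSIn v)
{
    VSOut o;
    // Place and size this particular Orc in the world.
    float3 world = v.pos * v.instScale + v.instPos;
    o.posH = mul(float4(world, 1.0f), viewProj);
    // Per-Orc colour variation, so no two look quite alike.
    o.tint = v.instTint;
    return o;
}
```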

Hardware consistency has also been tightened, so developers shouldn't need to write specific code paths for different vendors' hardware in order to optimise performance, as in writing one code path for nVidia and another for ATI. Instead, the emphasis is on hardware manufacturers and their driver development teams to make their hardware do what it needs to as fast as possible, rather than on games developers to make their game work with the hardware. This frees up developers to concentrate on the look, feel, and enjoyment of the game, which can only be a good thing.

The most exciting development, though, is the addition of geometry shaders. Have you ever noticed how the face of an in-game character can look incredibly lifelike, yet the outline of the head is all straight lines and pointy angles? This is because the actual wireframe model (the geometry) that makes up the solid shape of the head is very simple, while all the details of the face are created using pixel-level tricks that give the illusion of a more detailed model. These tricks are needed because the creation of the basic geometry of a scene has, up until DX10, been done on the CPU, which simply isn't fast enough to create more complex models. Geometry shaders look set to change all this by allowing manipulation of the underlying models to be done on the graphics card, theoretically consigning pointy heads to the annals of computing.
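
For the technically curious, the sketch below shows the flavour of a DirectX 10 geometry shader. It's a hypothetical example rather than a real smoothing scheme: every triangle that enters leaves as four, with the new midpoints nudged outwards along the averaged surface normal to take the edge off hard silhouettes. Production techniques are more sophisticated, but the principle, new geometry created on the graphics card instead of the CPU, is the same.

```hlsl
cbuffer PerFrame
{
    float4x4 viewProj;
};

struct VSOut { float3 posW : POSITION; float3 nrmW : NORMAL; };
struct GSOut { float4 posH : SV_Position; float3 nrmW : NORMAL; };

GSOut Emit(float3 p, float3 n)
{
    GSOut o;
    o.posH = mul(float4(p, 1.0f), viewProj);
    o.nrmW = n;
    return o;
}

// One level of on-GPU subdivision: each incoming triangle leaves as four.
[maxvertexcount(8)]
void SubdivideGS(triangle VSOut t[3], inout TriangleStream<GSOut> stream)
{
    // Midpoints of the three edges.
    float3 m01 = 0.5f * (t[0].posW + t[1].posW);
    float3 m12 = 0.5f * (t[1].posW + t[2].posW);
    float3 m20 = 0.5f * (t[2].posW + t[0].posW);

    // Averaged normals at those midpoints.
    float3 n01 = normalize(t[0].nrmW + t[1].nrmW);
    float3 n12 = normalize(t[1].nrmW + t[2].nrmW);
    float3 n20 = normalize(t[2].nrmW + t[0].nrmW);

    // Push the new vertices slightly outwards to round the surface.
    const float bulge = 0.01f;   // tuning value, purely illustrative
    m01 += n01 * bulge;
    m12 += n12 * bulge;
    m20 += n20 * bulge;

    // Two corner triangles plus the centre one, as a single strip...
    stream.Append(Emit(t[0].posW, t[0].nrmW));
    stream.Append(Emit(m01, n01));
    stream.Append(Emit(m20, n20));
    stream.Append(Emit(m12, n12));
    stream.Append(Emit(t[2].posW, t[2].nrmW));
    stream.RestartStrip();

    // ...and the remaining corner triangle as a second strip.
    stream.Append(Emit(m01, n01));
    stream.Append(Emit(t[1].posW, t[1].nrmW));
    stream.Append(Emit(m12, n12));
}
```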

Geometry shaders can also be used to create sophisticated particle effects that simply weren't possible under previous versions of DirectX. Fur, water, foliage and other effects that involve highly complex physical calculations will also be much easier to represent. All we need now is for developers to start using these new features.
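
To give a flavour of the particle effects mentioned above, a classic geometry shader technique is point expansion: the application submits each particle as a single point, and the shader grows it into a camera-facing quad ready for texturing. This sketch assumes the camera's right and up vectors and the view-projection matrix are supplied in a constant buffer; the structure names are, again, purely illustrative.

```hlsl
cbuffer PerFrame
{
    float4x4 viewProj;
    float3   camRight;   // camera basis vectors, supplied by the app
    float3   camUp;
};

struct ParticleIn  { float3 pos : POSITION; float size : PSIZE; };
struct ParticleOut { float4 posH : SV_Position; float2 uv : TEXCOORD0; };

// Each particle arrives as one point and leaves as a textured quad that
// always faces the camera -- the basis of smoke, sparks, rain and so on.
[maxvertexcount(4)]
void ParticleGS(point ParticleIn p[1],
                inout TriangleStream<ParticleOut> stream)
{
    float3 corner[4] =
    {
        p[0].pos + (-camRight + camUp) * p[0].size,  // top-left
        p[0].pos + ( camRight + camUp) * p[0].size,  // top-right
        p[0].pos + (-camRight - camUp) * p[0].size,  // bottom-left
        p[0].pos + ( camRight - camUp) * p[0].size,  // bottom-right
    };

    // Emit the four corners as a two-triangle strip.
    for (int i = 0; i < 4; ++i)
    {
        ParticleOut v;
        v.posH = mul(float4(corner[i], 1.0f), viewProj);
        v.uv   = float2(i & 1, i >> 1);
        stream.Append(v);
    }
}
```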

In the long run, then, once developers have got used to the new tools at their disposal and the hardware has matured, we can expect to see some truly awesome DX10 applications appearing. What we all really want to know right now, however, is how DX10 performs today: can we see the difference, and is it worth the upgrade?

I'm going to be looking at the only three games currently available with a DX10 code path: Company of Heroes, Lost Planet: Extreme Condition, and Call of Juarez. For Company of Heroes and Lost Planet I'll start by comparing the differences between the DX9 and DX10 versions of each game, and then I'll look at the difference in performance. As we only have access to the demo version of Call of Juarez, and it doesn't have an option for running in DX9 mode, I'll instead look at the features explicitly highlighted as particular to DX10 and see how the game performs in that mode. Company of Heroes and Call of Juarez will be tested using their in-game benchmarks, while for Lost Planet we're relying on manual benchmarking with FRAPS.

The test platform is the same one we've been using for a while now. We used the 32-bit version of Windows Vista, as this is still our platform of choice for gaming, and rather than use beta drivers we've stuck with the latest publicly available drivers from both AMD and nVidia.
