
Introduction

Looking around the TrustedReviews labs we noticed that we had built up a small collection of graphics cards based on the GeForce 6800 GPU. Naturally they share a great many features, so reviewing them individually would mean repeating large chunks of the same specification. Far better, we decided, to cover them in a round-up and focus on the differences between one card and the next.

You’ll recall that the GeForce 6800 (codenamed NV40 during development) was launched back in April 2004. The chip is fabricated on a 0.13 micron process, packs in 222 million transistors and, crucially, has the 256-bit memory controller that was so obviously lacking in the ill-fated GeForce FX 5800. Initially there were two variants of the chip, the 6800 and the 6800 Ultra; in May Nvidia added the 6800 GT, and for a time there were also rumours of a 6800 Ultra Extreme.

The GeForce 6800 Ultra has all 16 of its pixel pipelines operating, as well as six vertex pipelines, while the basic GeForce 6800 has only 12 functioning pixel pipelines. It is clear that Nvidia is finding it hard to get decent yields of 6800 Ultra parts, as you may have noticed if you’ve been in the market for a £400 graphics card and found that every e-tailer has them on backorder. That said, ATI is having an even tougher time getting yields up for the X800 XT Platinum Edition. On the interface front, AGP 8x is theoretically inferior to PCI Express x16, but in the real world AGP will remain the dominant standard for at least six months, and we wouldn’t be surprised if it stays significant for another two years or so.
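For readers who like to see the numbers behind the pipeline counts, here is a quick back-of-the-envelope sketch of theoretical pixel fill rate, which is simply pipelines multiplied by core clock. The clock figures below are illustrative assumptions based on typical stock speeds of the era, not measurements from this round-up.

```python
# Back-of-the-envelope theoretical pixel fill rate: pipelines x core clock.
# The clock speeds below are illustrative assumptions, not figures from this test.
def fill_rate_mpixels(pipelines: int, core_mhz: int) -> int:
    """Theoretical fill rate in megapixels per second."""
    return pipelines * core_mhz

print(fill_rate_mpixels(12, 325))  # GeForce 6800:       3,900 Mpixels/s
print(fill_rate_mpixels(16, 400))  # GeForce 6800 Ultra: 6,400 Mpixels/s
```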

The GeForce 6800 is a DirectX 9 design, but its pixel and vertex shaders support Shader Model 3.0, whereas ATI’s chips use a version of Shader Model 2.0. However, it’s only with the recent release of DirectX 9.0c that we have been able to get a proper insight into the capabilities of this GPU. If you’ve patched your copy of Far Cry to v1.2 you’ll have seen that Shader Model 3.0 gives an immediate performance boost, and the sooner Crytek releases the final 1.25 patch for Far Cry the better, in the opinion of this reviewer at least.

The five GeForce 6800 graphics cards in this round-up all have a core speed in the range of 325-350MHz, and the Galaxy, Leadtek, MSI and XFX models have 128MB of DDR memory running at an effective 700-750MHz. The notable exception is the Asus, with 256MB of DDR3 running at an effective 1,000MHz, or 1GHz if you prefer.
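As a rough illustration of what those memory speeds mean in practice, peak memory bandwidth is just the effective data rate multiplied by the width of the chip’s 256-bit memory bus. This is a minimal sketch using the quoted effective speeds; it gives theoretical peaks, not measured figures.

```python
# Peak memory bandwidth = effective data rate (MHz) x bus width (bits) / 8,
# giving MB/s; divide by 1,000 for GB/s. Effective speeds are those quoted above.
def bandwidth_gb_s(effective_mhz: int, bus_width_bits: int = 256) -> float:
    return effective_mhz * bus_width_bits / 8 / 1000

print(f"{bandwidth_gb_s(700):.1f} GB/s")   # 128MB cards at 700MHz: ~22.4 GB/s
print(f"{bandwidth_gb_s(1000):.1f} GB/s")  # Asus at 1,000MHz:      ~32.0 GB/s
```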

Lars ran a stack of synthetic and gaming benchmarks on a Pentium 4 test rig built around a 3.4GHz Extreme Edition CPU on an Abit AS8 i865PE Socket T motherboard, backed by 1GB of PC3200 DDR memory and a Seagate 7200.7 200GB SATA hard drive. He used Nvidia’s Forceware 61.77 driver, the latest WHQL driver that Nvidia lists. The results gave the Asus a small but significant advantage, which was no great surprise as its memory runs so much faster than the opposition’s. Then again, it costs £70-£80 more than the other cards, so you’d expect to get something for your money.

Of course, most gamers wouldn’t dream of running their new £200 graphics card at standard settings, so we also looked at the overclocking abilities of the five cards on another test system. This used an Asus SK8V motherboard (VIA K8T800 chipset), an Opteron 148 running at 2.25GHz, 1GB of PC3200 Corsair TwinX memory and a Hitachi 120GXP hard drive.

The idea was to get a feel for the overclocking headroom of each card, and we could have spent literally weeks testing various permutations of games, drivers, settings, core speeds and memory speeds. Instead we did a quick and dirty run of the Doom 3 timedemo (1,024 x 768, High Quality) and the new 3DMark05 benchmark (standard settings) using the 61.77 drivers, to give figures comparable with the Pentium 4 test system. After that we loaded up each manufacturer’s overclocking utility and ran the two tests again. Once we’d done that we installed Coolbits to see what results we could get from the Nvidia drivers themselves, and finally we loaded up the 65.73 beta Forceware drivers. We’re not too keen on using beta drivers, particularly for testing, but Shader Model 3.0 is so new that we felt we had to see what difference the new drivers made, and we’re glad we did, as the results make for some interesting reading.
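For anyone who fancies repeating the Coolbits step at home, the tweak is nothing more than a registry value that exposes clock-frequency controls in the Forceware control panel. The sketch below is a minimal, hedged example: the registry path and the DWORD value are the ones commonly documented for drivers of this era rather than anything confirmed in this round-up, so treat both as assumptions and back up your registry before trying it.

```python
# Minimal sketch of the Coolbits registry tweak (Windows only).
# The key path and DWORD value are assumptions based on commonly documented
# settings for Forceware drivers of this era, not figures from this article.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"  # assumed location
COOLBITS = 3  # assumed value said to unlock the clock-frequency sliders

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, COOLBITS)

print("CoolBits set; reopen the driver control panel to see the new options.")
```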
