Be aware, though, that you can get X800 GTs from some manufacturers with a 128-bit memory interface. If you’re buying a system with an X800 GT in it, do everything you can to ensure that it features a 256-bit memory interface, as without it performance will be seriously crippled.
As a mid-range GPU the card is sensibly sized, lightweight and sports a rather unattractive but medium-sized heatsink and fan arrangement. There are DVI and VGA ports on the rear, and a TV-Out with cables and dongles for S-Video, Composite or Component output. There’s a DVI-to-VGA converter for those with dual CRTs, and a paper manual booklet. The software bundle comprises a driver CD with an overclocking utility, a two-channel version (boo) of CyberLink's PowerDVD v.6 and Splinter Cell: Pandora Tomorrow – old hat now that Chaos Theory is here.
But on to the proof of the pudding: testing. Luckily we had a 6600GT card on hand, so we put both through the wringer on our graphics card test bed platform – an Athlon 64 FX-55, an MSI Diamond SLI motherboard and 2 x 512MB of Crucial Ballistix RAM. The CPU is clearly not what would be likely to be paired with a mid-range graphics card, but at least it ensures that any CPU bottleneck is kept to a minimum. As with our last 6600GT group test, we limited our testing to 1,280 x 1,024, as with these cards you won’t really want to go above this and keep smooth frame rates. Budget wise, a 17 or 19in TFT is likely to be the natural pairing with one of these cards, and these run natively at 1,280 x 1,024 resolution.
Out of the starting gate the X800 GT pulled clear of the GeForce in 3DMark 03, with 800 more points at 1,024 x 768, and it maintained the lead lower down the graph. Indeed, in all our tests the Sapphire was comfortably ahead of the 6600GT, with the notable exception of Doom 3, where the situation was reversed. It’s remarkable that even with a 128-bit memory interface the GeForce card is still faster here, which clearly demonstrates the extent to which Carmack tailored his code for nVidia. That said, there hasn’t yet been a flood of games on the market using the Doom 3 engine, and as you can still get decent performance out of the Radeon it’s no deal-breaker.