2004 was quite a year for graphics technology, with a number of significant launches from the major discrete graphics vendors – by which we mean, of course, ATI and nVidia. However, these launches tended to be blighted by turning out to be ‘paper’ only, meaning that while announcements were made and samples sent to the press, months passed before the products themselves appeared on the shelves. Possibly the worst example of this was ATI’s X800 XT Platinum Edition – dubbed the ‘Press Edition’. While we first reviewed this in May 2004, by the time samples were freely available ATI had moved on to the X850 XT Platinum Edition. And ATI’s X700 XT has proved to be a total vapourware launch that will never appear at all.
This is a situation that makes life difficult for the press and frustrating for the consumer. nVidia wasn’t immune to the problem either, and there were chronic shortages of its top-end GeForce 6800 Ultra cards, especially in PCI Express form.
Leadtek, however, appears to have a great relationship with nVidia and has been able to obtain good supplies of GPUs for its boards. We looked at the AGP version of its 6800 Ultra back in September, and we’ve seen Leadtek cards in PCs from the likes of Mesh and Evesham.
We’ve been sent a sample of the PCI Express version of its 6800 Ultra – the WinFast PX6800 Ultra TDH – and we decided to stick it in a testbed based on the latest Intel technology and see what we could get from it.
It’s still the highest-end GPU from nVidia, with no fewer than 16 pixel pipelines and six vertex shader engines. The GPU is clocked at 400MHz and the memory at 550MHz. Interestingly, Leadtek’s AGP version of the card has the GPU clocked at 425MHz, but Leadtek has gone for standard settings for its PCI Express version. There’s 256MB of GDDR3 RAM, which communicates over a 256-bit memory bus, giving plenty of bandwidth for memory-hogging features such as FSAA and AF.
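If you're curious where that bandwidth claim comes from, the quoted clocks make it easy to check. The sketch below derives the peak figure from the numbers in this review (550MHz GDDR3, which transfers data on both clock edges, over a 256-bit bus) – it's a back-of-the-envelope calculation, not a figure from nVidia's spec sheet:

```python
# Peak memory bandwidth from the clocks quoted in the review.
clock_mhz = 550                   # base memory clock
effective_mhz = clock_mhz * 2     # GDDR3 is double data rate
bus_width_bits = 256
bytes_per_transfer = bus_width_bits // 8   # 32 bytes moved per transfer

# transfers per second x bytes per transfer, expressed in GB/sec
bandwidth_gb_s = effective_mhz * 1_000_000 * bytes_per_transfer / 1e9
print(f"Peak memory bandwidth: {bandwidth_gb_s:.1f}GB/sec")  # 35.2GB/sec
```

That headroom is exactly what anti-aliasing and anisotropic filtering eat into, which is why a wide bus matters for those features.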
Let’s briefly remind ourselves of a couple of the architectural highlights of the 6800 Ultra. The CineFX engine means that the card supports Shader Model 3.0, part of the DirectX 9.0c spec, though very few games on the market use this feature – only Far Cry with the latest patch and Lord of the Rings: The Battle for Middle-earth spring to mind. The latter game also uses a CineFX 3.0 feature called the vertex stream frequency divider, or instancing, which enables multiple copies of similar objects to be drawn on screen at the same time – say an army of Orcs, or a battalion of battle droids.
Then there’s the GeForce 6800’s UltraShadow II technology. This is geared towards efficient shadow and lighting calculations, and seems to give the card a real boost in Doom 3, which is a big title for nVidia.
Returning to the hardware, the card features two DVI ports, and Leadtek has thoughtfully supplied two DVI-to-VGA converters so you can use two CRT monitors if you don’t have LCD panels.
Leadtek has also heavily customised the heatsink and fan arrangement, and claims that it is quieter than the reference design. This it may be, but when it’s running at full speed it’s still pretty loud. There were two Leadtek GTs in an Evesham SLI system we reviewed, and it was really quite noisy.
A killer feature, though, is the connector that appears on top of the board. This enables users to run two cards together in SLI to greatly increase performance. SLI has given nVidia a huge boost in the high-end gaming enthusiast market over the last few months, and details of ATI’s response are only now starting to emerge.