nVidia Reveals the ‘Doom 3’ GPU

nVidia claims its native PCI Express GeForce 6600 part will be the best affordable graphics GPU for Doom 3.

nVidia is definitely flavour of the month at the moment, thanks to its frame rate superiority in the uber-game that is Doom 3 being plastered all over the web. In something of a coup, nVidia has managed to get Doom 3 creator and all-round 3D graphics guru, John Carmack, to state that, “NVIDIA’s latest generation of chipsets gives exceptional performance and feature gains across the entire line, from the consumer cards to the speciality cards. I use a GeForce 6800 class card in my primary workstation, which is the best endorsement I can give.”

It’s no surprise therefore that nVidia is taking full advantage of this support by announcing that its new budget chip, the GeForce 6600, is the ‘Doom 3 GPU’.

nVidia claims that the GeForce 6600 will offer up to three times the performance of ATI’s mid-range PCI Express part, the X600, in Doom 3. The company’s documentation claims that you’ll be able to run the game at 1,024 x 768 with High Quality settings, 4x FSAA and 8x AF at around 40fps, or without any Image Quality (IQ) settings enabled at up to 1,600 x 1,200. nVidia didn’t state which CPU these results were obtained with.

Aside from performance, the GeForce 6600 will bring most of the strengths of the plaudit-winning 6800 series architecture, such as Shader Model 3, to the mainstream. nVidia can afford to do this as the 6600 series will only feature eight pixel pipelines, as opposed to the 12 or 16 offered by the 6800 series. Another major cost-saving measure is that the 6600 will have a 128-bit, rather than a 256-bit, memory interface. The 6600 series will also feature the strong video capabilities of the 6800, such as advanced de-interlacing, inverse 3:2 pull-down, motion estimation and WMV9 acceleration.

Previously codenamed NV43, the GeForce 6600 will come in two flavours. The standard version will feature 128MB of DDR1 memory and sport a small fan that, in the reference images, resembles the one used ‘back-in-the-day’ on the GeForce 3. The GT, however, will sport 128MB of DDR3 memory and have a larger fan. DDR3 consumes less power than DDR, so the GT could be a good candidate for overclocking. The GPU on the GT will run at 500MHz, while the memory will be clocked at 500MHz (1GHz effective). These high figures are achievable as the part is built on an efficient 0.11 micron process, enabling higher clocks to be reached with lower power consumption. The GPU speed for the vanilla 6600 hasn’t been confirmed, but it is inevitably going to be slightly lower than the GT figures. The good news is that both will be single-slot solutions requiring no extra power connectors at all, unlike the GeForce 6800 series, which requires one for the standard and GT parts and two for the Ultra variant.

However, the GeForce 6600 will be PCI Express only, which means that it won’t be an option for those looking for an inexpensive upgrade from older technology. To get the part, you’ll need a motherboard featuring a 16x PCI Express slot, which at the moment means Intel’s 915 and 925 chipsets. AMD fans therefore need not apply. However, it is likely to be a popular option with system builders.

A big incentive to move to a PCI Express system, though, is that the GT version of the card will be SLI capable, offering a route to great performance for a relatively low outlay.

We expect to have boards in the TrustedReviews labs by the end of the month, where we’ll be putting the GeForce 6600 through its paces to see if the part is as good as nVidia claims. Final boards are expected in September, with a host of board manufacturers and system builders already confirming that they will be releasing products based on the GPU.

The GeForce 6600 GT will have a suggested retail price of €229 – equivalent to £153. Exact pricing for the 6600 standard has yet to be confirmed. nVidia can be found on the web at www.nvidia.co.uk.
