- Review Price: £316.00
ATI has done it, and 3dfx did it as well, but until now there has never been a dual-chip 3D card based on nVidia technology.
Anyone who knows something of the history of graphics cards won’t be too surprised by the 3D1. Impressed maybe, but not surprised. As soon as nVidia announced out of the blue late last year that it was bringing back SLi technology, it seemed as though the clock had rolled back for graphics fans. 3dfx, the company that brought 3D graphics on the PC into the mainstream, introduced the original SLi technology with its Voodoo 2 cards, and it ruled the roost until nVidia matched its performance with the single-card Twin Texel Engine TNT line. nVidia then began a rapid climb to the top of the 3D pile, and by the time it introduced the GeForce, ATI had to pull something out of the hat to keep up. The result was the Rage Fury MAXX, sporting dual Rage 128 Pro chips and an amazing 64MB of memory! By this time next year dual cores will have become a mainstream part of the PC industry, but back then putting two chips on one card was a radical concept.
While the Fury MAXX initially garnered praise for its technical ingenuity, it was in fact something of a desperate move by ATI, which required two chips to do what at the time nVidia could do with one. Due to various issues, such as a lack of Windows 2000 driver support, the Rage Fury MAXX had a very brief spell at the top before nVidia’s GeForce 2 blew it away. The other dual-chip graphics cards of note were the dual VSA-100 powered Voodoo 5 5000 and 5500, which were 3dfx’s last-gasp products before it collapsed under the strain of trying to keep pace with nVidia.
So with such a chequered history, it could be viewed as a brave move by Gigabyte to reintroduce the concept of a dual chip graphics card. In fact, the technology behind the 3D1 is far sounder than its predecessors. In our tests it proved to be more reliable than regular two card SLi and it benchmarked without any hiccups, unlike all the other SLi set-ups we’ve tested.
The 3D1 consists of two GeForce 6600GT chips on one board. Billed as a 256-bit card with 256MB of GDDR3 memory, it is in fact a dual 128-bit card, with each chip still limited to a 128-bit memory interface and only able to address 128MB of memory. As we’ll see, this limits performance at high resolutions.
By way of small compensation, Gigabyte has sourced faster memory than standard 6600GTs use, hitting 600MHz (1,200MHz effective) rather than 500MHz. The core speed is the same at 500MHz for each chip. And while there are two chips on the board, this doesn’t turn the 3D1 into a 16-pixel-pipeline card – each chip is still an internal eight-pipe solution outputting four pixels per clock.
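To put those memory clocks in perspective, here’s a quick back-of-the-envelope bandwidth calculation – an illustrative sketch based on the figures quoted above, not anything from Gigabyte’s spec sheet:

```python
# Rough per-chip memory bandwidth from the numbers in the review.
# Each 6600GT chip on the 3D1: 128-bit bus, GDDR3 at 600MHz (1,200MHz effective).
bus_width_bits = 128
effective_clock_mhz = 1200          # 600MHz DDR, so 1,200MHz effective

bytes_per_transfer = bus_width_bits / 8                      # 16 bytes per transfer
bandwidth_gbs = bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9
print(f"3D1, per chip: {bandwidth_gbs:.1f} GB/s")            # 19.2 GB/s

# A stock 6600GT at 500MHz (1,000MHz effective) on the same 128-bit bus:
stock_gbs = bytes_per_transfer * 1000 * 1e6 / 1e9
print(f"Stock 6600GT:  {stock_gbs:.1f} GB/s")                # 16.0 GB/s
```

So each chip gets a useful bump over a stock 6600GT, but neither comes close to the bandwidth of a true 256-bit card, which is why the high-resolution results suffer later on.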
Not surprisingly, there are two fans integrated into the large heatsink, upon which the Gigabyte logo is branded. It has to be said that the card is very impressive to look at, and it feels pleasingly solid in the hand before you slot it into the motherboard. If you want a piece of hardware that’s going to impress your mates, this is certainly it.
At this point we should mention the biggest limitation of the 3D1. While it uses a standard PCI Express connector, at the moment it will only work in Gigabyte’s own K8NXP-SLI motherboard, which is why the board and card are sold together as a bundle. It’s possible that Gigabyte will update the BIOSes of its other boards to work with this card, but it’s unlikely to work with non-Gigabyte products. If you buy this bundle, then, you’re effectively buying into a proprietary Gigabyte solution.
The back plate of the card features a DVI and a D-Sub connector, as well as S-Video and component output via a break-out box. Unlike standard 6600GTs, 6800GTs and Ultras, there’s no SLi connector at the top of the card, as all of the PCI Express bandwidth is used up running this card – so if you were dreaming of a quad 6600GT set-up, give up now!
While we’re not reviewing the motherboard here, rest assured that the K8NXP-SLI gives you everything you would expect to find at this level, as well as extras such as a wireless networking card and Gigabyte’s own Dual Power System module for clean power. Also included in the box are Thief: Deadly Shadows, pleasingly supplied on DVD-ROM, and Joint Operations: Typhoon Rising.
So how did the Gigabyte 3D1 perform compared to ‘conventional’ SLi? For reference we compared it to the scores we obtained from the recently reviewed MSi K8N Diamond SLi motherboard, which was tested with both two 6600GTs in SLi and a single 6600GT. The Gigabyte board fared well. It was a tad slower in Half-Life 2 and a shade faster than the MSi set-up in Far Cry, while in Doom 3 it was both a touch quicker and a mite slower depending on the resolution. In SYSmark 2004 it matched the MSi, but lagged behind in PCMark 2004.
Overall it’s pretty much even-stevens, with the differences attributable to normal benchmark variation. In 3DMark03 it was the same story, while in 3DMark05 the 3D1 was able to complete more tests than the MSi set-up. This was because a newer version of 3DMark05 was available when we conducted the tests on the Gigabyte, though the results are still comparable.
Taken as a whole, though, I was impressed with the 3D1. It does what it says on the tin and delivers the same performance as a two-card SLi 6600GT set-up on a single card. The cheapest price I could find for the Gigabyte bundle was £361 from Tekheads, which compares favourably with the price of the MSi motherboard and two 6600GTs, which comes in at around £398 – £37 more.
However, even more significant than the cost saving is the reduced heat and noise that the single-card system delivers. It’s by no means a silent solution, but it’s certainly quieter than having two graphics cards in your machine. It also produces less heat, which should make for greater system stability.
However, if it’s real power you want, you’ll probably find that the 3D1 – or indeed any 6600GT-based SLi system – just won’t give you enough grunt to truly impress. To my mind at least, SLi implies being able to run at mouth-wateringly high resolutions with all the image quality (IQ) enhancements turned on. As the scores prove, this just isn’t possible with truly playable frame rates, either with two 6600GTs or with the 3D1.
Unfortunately, at 1,600 x 1,200 the memory bandwidth limitations of the card became clear: once we enabled the image quality settings, it was unable to complete the tests at this resolution. In all SLi-compatible tests at 1,600 x 1,200 with 4x FSAA and 4x AF, the 3D1 just doesn’t have enough grunt, and things aren’t much more impressive at 1,280 x 1,024 with full IQ on.
It’s clear from this that, when really pushed at the highest settings, the 128-bit interface and the 128MB addressable memory limit of the 6600GT core become limiting factors, and having two chips doesn’t help in this situation.
As you can see here, a single 6800GT is able to produce better numbers at these demanding settings. What’s more, these 6800GT scores were generated on a platform running a 3.46GHz Pentium 4 Extreme Edition – had we been running it on the same Athlon 64 FX-55 system as the 3D1, the difference would have been even more marked, as the Athlon is a significantly better performer in games than the Intel chip.
This leaves me wondering why you’d want the 3D1. As a piece of technology it’s impressive – definitely the most successful dual-chip graphics card yet produced – but if you want decent performance without the noise, I’d go for a single 6800GT instead. Admittedly, an SLi board and a single 6800GT will cost you around £420 – about £60 more than the 3D1 bundle – but you’ll get better numbers at high resolutions with IQ on, and you’ll have the option of adding a second 6800GT when the budget allows, to really push the numbers through the roof.
The 3D1 doesn’t quite match a single 6800GT when the settings are maxed out, and it doesn’t offer any further upgrade potential. It will also tie you to Gigabyte motherboards, which could prove too restrictive considering the investment. But if you’ve already decided on buying a 6600GT-based SLi system, the Gigabyte 3D1 bundle makes a lot of sense: it will save you money over a standard two-card 6600GT set-up, it works well, and it offers reduced noise and improved stability compared to a two-card solution.
Score in detail