Summary

Our Score: 8/10


ATI has done it, and 3dfx did it too, but until now there has never been a dual-chip 3D card based on nVidia technology.

Anyone who knows a little of the history of graphics cards won’t be too surprised by the 3D1. Impressed maybe, but not surprised. As soon as nVidia announced out of the blue late last year that it was bringing back SLi technology, it seemed as though the clock had rolled back for graphics fans.

3dfx, the company that brought 3D graphics on the PC into the mainstream, introduced the original version of SLi with its Voodoo 2 cards, and it ruled the roost until nVidia matched that performance with its single-card Twin Texel Engine TNT line. nVidia then climbed rapidly to the top of the 3D pile, and by the time it introduced the GeForce, ATI had to pull something out of the hat to keep up. The result was the Rage Fury MAXX, sporting dual Rage 128 Pro chips and an amazing 64MB of memory! By this time next year dual cores will have become a mainstream part of the PC industry, but back then two chips on one card was a radical concept.

While the Fury MAXX initially garnered praise for its technical ingenuity, it was in fact something of a desperate move by ATI, which needed two chips to do what at the time nVidia could do with one. Thanks to various issues, such as the lack of Windows 2000 driver support, the Rage Fury MAXX had a very brief spell at the top before nVidia’s GeForce 2 blew it away. The other dual-chip graphics cards of note were the dual VSA-100 powered Voodoo 5 5000 and 5500, which were 3dfx’s last-gasp products before it collapsed under the strain of trying to keep pace with nVidia.

So with such a chequered history, reintroducing the concept of a dual-chip graphics card could be viewed as a brave move by Gigabyte. In fact, the technology behind the 3D1 is far sounder than that of its predecessors. In our tests it proved more reliable than a regular two-card SLi setup, and it benchmarked without any hiccups, unlike all the other SLi set-ups we’ve tested.

The 3D1 consists of two GeForce 6600GT chips on one board. Billed as a 256-bit card with 256MB of GDDR3 memory, it is in fact a dual 128-bit card: each chip is still limited to a 128-bit memory interface and can address only 128MB of memory. As we’ll see, this limits performance at high resolutions.

By way of small compensation, Gigabyte has sourced faster memory than that of standard 6600GTs, hitting 600MHz (1,200MHz effective) rather than 500MHz. The core speed is the same at 500MHz for each chip. And while there are two chips on the board, that doesn’t turn the 3D1 into a 16-pipeline card - each chip is still an internal eight-pipe design that can output four pixels per clock.
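To put that memory speed into perspective, here’s a rough back-of-the-envelope figure (our own arithmetic from the quoted specifications, not a number from Gigabyte). Each chip’s 128-bit bus moves 16 bytes per transfer, so at 1,200MHz effective its peak memory bandwidth works out as:

\[ 16\ \text{bytes} \times 1{,}200\ \text{MHz} = 19.2\ \text{GB/sec per chip} \]

That compares with 16GB/sec for a standard 6600GT at 1,000MHz effective. But because each chip can only see its own 128MB pool, the two buses work on largely duplicated data rather than behaving like a single 256-bit, 256MB interface (which, at these clocks, would offer 38.4GB/sec from one pool), and that goes some way to explaining the high-resolution limitation mentioned above.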

Not surprisingly, there are two fans integrated into the large heatsink, which bears the Gigabyte logo. It has to be said that the card is very impressive to look at, and it feels pleasingly solid in the hand before you slot it into the motherboard. If you want a piece of hardware that’s going to impress your mates, this is certainly it.

At this point we should mention the biggest limitation of the 3D1. While it uses a standard PCI Express connector, at the moment it will only work in Gigabyte’s own K8NXP-SLI motherboard, which is why the board and card are sold together as a bundle. It’s possible that Gigabyte will update the BIOSes of its other boards to work with this card, but it’s unlikely to work with non-Gigabyte products. Therefore if you buy this bundle you’re effectively buying into a proprietary Gigabyte solution.
