- Page 1 nVidia GeForce 7950 GX2 Review
- Page 2 nVidia GeForce 7950 GX2 Review
- Page 3 nVidia GeForce 7950 GX2 Review
- Page 4 Battlefield 2 Review
- Page 5 Call of Duty 2 Review
- Page 6 Counter-Strike: Source Review
- Page 7 Quake 4 Review
- Page 8 3DMark06 Review
- Page 9 Overclocking Results Review
- Page 10 Call of Duty 2: SLI Optimisations Effects Review
The chip in the above picture is what enables this card to work. It is a PCI Express to PCI Express bridge and is fully compliant with the PCI Express standard, which means it should work in any motherboard, SLI-ready or not. Some boards may require a BIOS update, as they aren't used to seeing such a device, but we didn't have an issue with our Asus A8N32-SLI. The chip has 48 lanes – 16 connecting to the motherboard slot, and 16 going to each of the two GPU-bearing PCBs.
Inevitably, this means the card is bottlenecked by the 16 lanes of the PCI-E slot, but because each PCB has its own 16 lanes, the first PCB gets full use of the bandwidth when running a game that doesn't support SLI. It also means communication between the cores should have little or no latency.
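As a rough illustration of why the bridge topology matters, here's a back-of-the-envelope calculation. It's only a sketch: it assumes first-generation PCI Express at 250MB/s per lane, per direction, and ignores protocol overhead.

```python
# Back-of-the-envelope bandwidth for the 7950 GX2's PCI-E bridge.
# Assumption: first-generation PCI Express, 250 MB/s per lane per direction.
MB_PER_LANE = 250

slot_lanes = 16       # lanes from the motherboard slot to the bridge
lanes_per_pcb = 16    # lanes from the bridge to each GPU-bearing PCB

slot_bandwidth = slot_lanes * MB_PER_LANE        # total into the card
per_pcb_bandwidth = lanes_per_pcb * MB_PER_LANE  # each PCB's own link

print(f"Slot to bridge: {slot_bandwidth} MB/s per direction")
print(f"Bridge to PCB:  {per_pcb_bandwidth} MB/s per direction")
# The two PCBs together could demand 8000 MB/s, but only 4000 MB/s
# crosses the slot -- hence the bottleneck, while a single PCB (e.g. in
# a non-SLI game) still sees a full x16 link to the bridge.
```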
However, the fact that this card uses only one slot, doesn’t require an SLI motherboard and is detected as one GPU by the operating system doesn’t mean it’s a single GPU. SLI technology is still at work and requires an SLI profile to be in place in order to gain full performance. nVidia has made quite a point of advertising this as a single GPU, replacing SLI on/off with “Dual Display” and “Dual GPU” modes. We couldn’t find anything to do with SLI profiles in the drivers used (91.29), which meant we couldn’t specify any manually.
For this reason, I find it a little misleading to advertise this card as a 48-pipeline card with a 1GB frame buffer. I must emphasise that this is a dual 24-pipeline part, with each core having its own 512MB frame buffer. As with the 7900 GT, each core has 16 pixel output engines and eight vertex shaders.
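To put those per-core pipeline counts in perspective, the theoretical fill rates work out as below. This is only a sketch: the 500MHz core clock is our own assumed figure for illustration, not something stated above, so substitute your card's actual clock.

```python
# Theoretical per-core fill rates, assuming a 500 MHz core clock
# (assumed figure for illustration -- check the card's actual clocks).
core_clock_mhz = 500
pixel_pipelines = 24   # pixel pipelines per core (from the review)
rops = 16              # pixel output engines per core (from the review)

texel_rate = pixel_pipelines * core_clock_mhz / 1000  # gigatexels/s
pixel_rate = rops * core_clock_mhz / 1000             # gigapixels/s

print(f"Per core: {texel_rate} GT/s texel fill, {pixel_rate} GP/s pixel fill")
# Two cores double the aggregate figures -- but only when SLI is
# actually engaged, which is exactly why the "48 pipelines" claim
# is misleading for games without an SLI profile.
```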
Unlike any other reference card to date, this card is fully HDCP compliant, which is great news. There are two dual-link DVI connectors. However, just as with SLI, you can't run dual displays while "Dual GPU" mode is enabled. To use two displays, you must switch to "Dual Display" mode, just as you would turn SLI off on a two-card setup. This is a complete annoyance. I don't understand how nVidia can create something like TurboCache, which only borrows system memory when a 3D application is loaded, yet can't apply a similar principle to SLI.
As you can see above, it's a dual-slot card – not surprising really. But unlike a two-card SLI setup, you have only two DVI connections. This means you can only drive two displays, rather than the four available when running two cards in a non-SLI display mode. This isn't a huge drawback really, and there are ways around it should you need more displays.
Anyone with a keen eye will have noticed the SLI connector on top of the card. This card will be used in OEM-supported Quad SLI machines. However, Quad SLI is not currently supported for anyone who wants to build their own system, and drivers for it won't be available for some time. It is a clear upgrade path, though, for anyone who knows what they are doing. Previous Quad SLI implementations have been a little on the poor side, so I would like to see a decent setup before I suggest this as an upgrade path – but the option is there.