
Asus EN9800GX2 Review

For a number of reasons we’ve had mixed opinions about graphics card manufacturers making cards that use two graphics chips on one board. For a start, they tend to be expensive, which is understandable considering the extra hardware but still lamentable. Then there’s driver/game support, which, if not up to scratch, can mean some games simply don’t work or see no benefit from the second chip. Finally, there’s the fact that even when everything’s working fine, the performance increase from the second chip can be limited. It’s for these reasons that ATI’s last attempt at gaining our top graphics card recommendation, in the shape of the HD 3870X2, fell just short of the mark, receiving a good, but not quite there, 8/10.


So when nVidia launched its competing dual-chip card, the 9800 GX2, we had that same sinking feeling in our bellies. That said, the 9800 GX2 does combine two very fast cards into one package that, unlike conventional SLI configurations, doesn’t require an SLI motherboard and, at least theoretically, has the potential to be the fastest ‘single’ graphics card on the planet.


Just as with its last attempt at a dual-chip card, the GeForce 7950 GX2, for the 9800 GX2 nVidia has taken the arguably crude route of simply bolting two whole cards together in one package – as opposed to the HD 3870X2, for which ATI managed to fit two chips on a single PCB. Not that this resulted in the HD 3870X2 being any smaller than the GeForce 9800 GX2, but it did allow the former a better cooling solution (more about that later).


Where nVidia has improved on its last attempt is by refining the package so that both cards are now completely contained inside a metal shroud, which keeps all the delicate electronics protected from our grubby mitts. The two cards then communicate over an internal ribbon cable that’s linked to a dedicated PCI-Express bridge chip on one of the PCBs. This enables the SLI communication that’s needed to power all this dual-chip madness to be performed completely ‘on card’. The massive advantage here is that you don’t need an SLI-capable motherboard.


The two cards have also been manufactured in mirror image so that both chips face inwards, which enables nVidia to use a single large fan at the back of the card to suck air in, blow it across both GPUs, and exhaust it out the back. Unfortunately, because of the way the card’s outputs are configured the actual exhaust on the back plate is tiny, which is going to restrict airflow considerably, leading to potential overheating. Indeed our friends over at bit-tech found there were some other heat issues with the card that resulted in the north bridge chip of certain motherboards overheating. Basically, be aware that you’ll need a very well ventilated case to successfully use this card.

Display output options are a little different to what we usually expect, as joining the regular duo of dual-link DVI-I connectors is an HDMI v1.3 socket, which has replaced the component/composite/S-Video output we’ve come to expect. All three outputs support HDCP, for playing back protected HD content like Blu-ray discs, but only the HDMI connection allows you to carry a digital audio signal from your computer to your AV equipment. Like previous nVidia cards, this also requires you to run an internal S/PDIF connection from your sound card to a socket on top of the GX2, which is hidden under the small rubber plug that can be seen just near the back of the card in the shot below.


You’ll also notice another plastic tab next to the S/PDIF connector. This covers the two auxiliary power connections that are required to provide enough power to the card. One is a six-pin PCI-Express socket and the other is an eight-pin socket. So you’ll need a hefty modern power supply to run this card.


One final socket is hidden away along that top edge and it is, of course, an SLI connector that will enable you to run two of these cards in a quad-SLI configuration. Something we like to call the ‘more money than sense’ setup. We’re not just being jealous so and so’s when we say that, though. It’s well known that SLI performance doesn’t scale linearly as you add more and more graphics cards, as the overheads in calculating how to distribute the computing load across the graphics chips make it less and less efficient. You will get more performance, but it won’t be anywhere close to four times what you’d get from a single card, so the investment is not a sound one.


One feature we really like on this card is its use of coloured lights to help guide you when installing it. First up are the two auxiliary power connectors, which are backlit so they glow red if there is no connection (or if you’ve tried to plug a six-pin PCI-Express plug into the eight-pin socket) and turn green when the card is correctly connected. The display outputs have had a similar treatment, with the primary display illuminated by a blue LED and another LED on the back panel repeating the power status indication. This is very useful because it lets you quickly check you’ve plugged everything in correctly without opening up your PC.


The arrangement of the auxiliary power connectors is a little peculiar as, just like the PCBs inside the card, they face each other. The result is that the little clips that hold the plugs in are crammed in the middle making them a real pig to remove. It’s a small issue that will only be of concern when troubleshooting, upgrading, or, in our case, when you have to regularly swap out graphics cards for testing!


Another complaint we have is with the multi-monitor support. Essentially, you can’t run multiple monitors with the card in SLI mode, so surround gaming is impossible. By turning off SLI you can use multi-monitor setups for desktop work, but overall it isn’t exactly an elegant solution. In contrast, ATI has managed to enable multi-monitor support not just in its dual-GPU HD 3870X2 but in all Crossfire setups.

Up to this point we’ve talked little about the technology that powers the two graphics chips sitting at the heart of the 9800 GX2, and for one very good reason: there’s actually very little in the way of new features. The chips are based on the same G92 cores that power the 8800 GT and 8800 GTS 512, which in turn are based on the same basic architecture that powered the 8800 GTX. So, if you want to bone up on the details of how these chips go about doing their graphics processing business, we suggest you give those articles a read.


In a nutshell, on each chip you get 128 stream processors (split into eight shader clusters) running at 1,500MHz. Each shader cluster then has eight texture addressing and eight texture filtering units. These run at the core clock frequency of 600MHz, along with the 16 render backends, or ROPs. Each chip is then accompanied by a hefty 512MB chunk of memory, which runs at 1,000MHz (effectively 2,000MHz) and communicates over a 256-bit memory channel.
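One figure those specs imply but don’t state is per-chip memory bandwidth. As a rough sanity check (the review doesn’t quote this number, so treat it as derived rather than official), it follows directly from the 2,000MHz effective memory clock and 256-bit bus:

```python
# Per-chip memory bandwidth for the 9800 GX2, derived from the specs
# quoted above. Not a figure stated in the review itself.
effective_clock_mhz = 2_000   # 1,000MHz GDDR3, double data rate
bus_width_bits = 256          # per-chip memory channel

# bytes per second = transfers per second * bus width in bytes
bandwidth_gb_s = effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(f"{bandwidth_gb_s:.1f} GB/s per chip")  # 64.0 GB/s per chip
```

That works out to 64GB/s per chip, matching what the same G92 core delivers on the 8800 GTS 512.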


If all that went straight over your head, think of it like this: the 9800 GX2 is two 9800 GTX cards strapped together. When you combine the power of the two cards, you theoretically get 768 GigaFLOPS of shader processing power, a texture fillrate of 76.8 Gigatexels per second, and a pixel fillrate of 19.2 Gigapixels per second. These are undeniably impressive figures, but of course harnessing that power requires game and driver support to be top notch, and that’s certainly something that’s difficult to guarantee. However, what we can do is tell you how it performs with the games we have at our disposal. So without further ado, let’s look at the numbers.
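Those combined figures can be reproduced from the per-chip specs given earlier. A minimal sketch of the arithmetic (the shader figure assumes the usual 2 FLOPs per stream processor per clock for a multiply-add, which is an assumption on our part; counting the co-issued MUL would give a higher number):

```python
chips = 2
stream_processors = 128   # per chip
shader_clock_ghz = 1.5    # 1,500MHz shader clock
core_clock_ghz = 0.6      # 600MHz core clock
texture_units = 8 * 8     # 8 clusters x 8 filtering units per chip
rops = 16                 # render backends per chip

# 2 FLOPs per SP per clock (one multiply-add) -- assumed, not from the review
gflops = chips * stream_processors * shader_clock_ghz * 2
texel_rate = chips * texture_units * core_clock_ghz   # Gigatexels/sec
pixel_rate = chips * rops * core_clock_ghz            # Gigapixels/sec

print(gflops, texel_rate, pixel_rate)  # 768.0 76.8 19.2
```

All three results line up with the quoted totals, which is simply to say the GX2 is, on paper, exactly double one G92 card.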


'''Test Setup'''


* Intel Core 2 Quad QX9770

* Asus P5E3

* 2GB Corsair TWIN3X2048-1333C9 DDR3

* 150GB Western Digital Raptor

* Microsoft Windows Vista Home Premium 32-bit


'''Drivers'''

* nVidia: Forceware 175.16

* ATI: Catalyst 8.4


Due to some problems with our test bed (we think the hard drive was corrupting things) we weren’t able to test as many games as we did when we looked at the HD 3870X2, so we can’t say quite so comprehensively what compatibility is like with this card. However, in the games we did test, it’s pretty clear from our results that the 9800 GX2 is indeed the fastest card on the planet. The exception was our Counter-Strike: Source test, which obviously wasn’t taking any advantage of SLI – no doubt due to our offline version of the game being slightly out of date. Bearing this in mind, and the fact that some of our other games (in particular Crysis) didn’t work, we will certainly be retesting sometime soon and updating this article accordingly.


In the meantime though, it’s clear that if money is no object then this is certainly the card to go for. Yes, power consumption is quite high, but then you can take advantage of HybridPower to run things more efficiently. Also, buying two 8800 GTS 512 cards and running them in SLI will get you similar performance and save £50, but then you’re tied to an SLI motherboard. Finally, while compatibility may be a problem, even if SLI isn’t supported properly, just one of the internal ‘cards’ will still give impressive performance in most of today’s games. Indeed, what the 9800 GX2 does is, once again, highlight how graphics hardware is waiting for games to catch up, rather than vice versa. With the exception of Crysis, and unless you run games at the absolute highest resolutions, any card over £200 will serve you just fine.


'''Verdict'''


nVidia’s GeForce 9800 GX2 is undeniably a superbly fast card if and when it works. The only real kicker is the price, which is understandably high. Still, if you’ve got a few hundred pounds burning a hole in your virtual pocket then we’d recommend it.

'''Enemy Territory: Quake Wars'''

----

'''Call Of Duty 4'''

'''Call Of Duty 2'''

----

'''Counter-Strike: Source'''


Trusted Score


Score in detail

  • Value 7
  • Features 8
  • Performance 9

