With AMD’s recent acquisition of ATI, everyone has been wondering how the industry is going to be affected. In reality, we aren’t going to see any major differences until at least 2008, but obviously plans could well change.
The launch of the X1950 can be marked as the final true ATI launch, as the card was taped out and ready before AMD's buyout. Quite honestly, ATI couldn't have picked a better card to end on as an independent company.
ATI has now launched several cards – the X1950 XT-X, the X1900 XT (256MB Edition), the X1650 Pro, and the X1300 XT. Although I'll only be looking at the X1950 XT-X today, these are all important launches and we will look at the others at a later date.
Pictured above you can see the reference X1950 XT-X (below) next to a Sapphire X1900 XT-X. The cooler on the X1950 looks a lot fancier, although there is something about chromed plastic that reminds me of cheap children's toys – I'd sooner have had it without. The X1950 is quite a bit heavier than the X1900 thanks to its all-copper cooler, but heavy is good, heavy is reliable. You'll also notice the fan has been moved to the rear of the cooler and is of a slightly different design.
From the back, it’s hard to tell the two apart, as the PCBs are almost identical – and that impression isn’t far from the truth. The core is still an R580 and still operates at 650MHz, with 48 pixel shader processors, 16 texture units, eight vertex shaders and 16 pixel output engines.
There are two key differences. First, this is a slightly newer revision of the R580 with some minor improvements (which may or may not filter their way down to the X1900 range). Second, the memory is now GDDR4 instead of GDDR3 and runs at an effective clock speed of 2GHz.
GDDR4 is a welcome improvement over GDDR3 and is what the R580 core was designed to be paired with from the start. Aside from the improved frequencies, it is also considerably more efficient. There are several reasons for this, but one of the key ones is down to some clever ‘bit flipping’ – if you’d rather not know how this works, skip to the next page.
As you probably know, there are eight bits in a byte, which might look something like 00100110. It requires more power to transmit a zero than it does a one, so in an example like that, where zeros outnumber ones, the bits are flipped and a flag is set so the memory controller knows the data is inverted. The memory controller would receive 11011001 and flip it back to 00100110. If you think about it, using this technique a byte will never carry more than four zeros on the wire, whereas before it could carry as many as eight. That’s quite a clever way of reducing power consumption!
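The scheme described above can be sketched in a few lines of Python. This is a simplified illustration of the idea rather than GDDR4's actual signalling, and the function names are my own:

```python
# Sketch of the 'bit flipping' idea: if a byte contains more zeros than
# ones, invert it and set a flag, so at most four zeros ever hit the bus.

def encode(byte):
    """Return (byte_to_send, inverted_flag) for one 8-bit value."""
    ones = bin(byte).count("1")
    if ones < 4:                  # more zeros than ones: cheaper inverted
        return byte ^ 0xFF, True
    return byte, False

def decode(sent, inverted):
    """Undo the inversion on the receiving side."""
    return sent ^ 0xFF if inverted else sent

# The article's example: 00100110 has five zeros, so it is sent inverted.
sent, flag = encode(0b00100110)
# sent == 0b11011001, flag == True
assert decode(sent, flag) == 0b00100110
```

Running `encode` over all 256 possible byte values confirms that no transmitted byte ever contains more than four zeros.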
One of my biggest complaints with the X1900 series was the cooler. ATI actually sat a group of journalists down a few months ago and interviewed us for opinions on its products. The biggest complaints that came up, from myself and other journalists alike, were noise and drivers. The positive side is that both of these issues have now been addressed.
This cooler is a godsend. At boot it’s whisper quiet and although it does spin up during intense gaming it remains barely audible. From a noise point of view, this is by far the best card to be pairing with a Core 2 Duo. Ironic, considering ATI would sooner see us pairing it with an AMD AM2 system.
Catalyst Control Center also loads considerably quicker. Some of this can be attributed to its move from .NET 1.1 to .NET 2.0. This month also heralded a move on the Linux front, with considerably better drivers there too. I’ve got these running on my Fedora Linux office machine, which is driving a Radeon 9200 quite happily.
Quieter cooler? Decent Linux support? Better drivers? All I can say is, it’s about time – nVidia has had all of these for quite some time. But better late than never, and these are all things that would definitely sway my buying decision somewhat.
For testing this card, I used our reference Intel 975XBX “Bad Axe” motherboard with an X6800 Core 2 Duo, coupled with 2GB of Corsair CMX1024-6400C4 running at 800MHz with 4-4-4-12 timings. I used the WHQL Catalyst 6.8 drivers, and for the GeForce 7950 GX2 I used the WHQL 91.31 drivers.
Our “Spode Mark 3D” testing suite has had a little bit of an update, which involved updating all of the games to their latest versions. The only game that wasn’t updated was Quake 4, as I was having stability issues with the new SMP patch. I have also added Prey into the mix, an OpenGL game based on the Quake 4 engine. Finally, I’ve changed the resolutions we test at, replacing 1,920 x 1,440 with the widescreen resolution 1,920 x 1,200, which, thanks to the popularity of 24in monitors, is becoming increasingly common.
These changes mean that results aren’t directly comparable with previous tests.
I ran Call of Duty 2, Counter Strike: Source, Quake 4, Battlefield 2, Prey and 3DMark06. Bar 3DMark06, these all run using our in-house pre-recorded timedemos in the most intense sections of each game I could find. Each setting is run three times and the average is taken, for reproducible and accurate results. I ran each game test at 1,280 x 1,024, 1,600 x 1,200, 1,920 x 1,200 and 2,048 x 1,536 each at 0x FSAA with trilinear filtering, 2x FSAA with 4x AF and 4x FSAA with 8x AF.
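The three-runs-and-average approach above can be sketched in a few lines of Python. The frame rates here are invented purely for illustration, not actual results:

```python
# Each timedemo is run three times per setting and the mean is reported,
# smoothing out run-to-run variance. Numbers below are made up.

def average_fps(runs):
    """Mean frame rate across repeated runs, rounded to one decimal."""
    return round(sum(runs) / len(runs), 1)

# Hypothetical results for one game at one resolution/FSAA setting
runs = [88.2, 87.5, 88.9]
print(average_fps(runs))  # the figure that would appear on the graph
```

Averaging repeated runs like this is what makes small 5-10 per cent deltas between cards meaningful rather than noise.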
The X1900 XT-X ran perfectly without any issues, but both the GeForce 7950 GX2 and the new X1950 XT-X gave me some trouble.
At first, the 7950 GX2 would not boot in the machine unless I put it in the x4 PCI-E slot. From there, I had to update the BIOS in order to get the board to support it. This is a known issue with GX2 boards, but this is the first time I’ve experienced it. After the update, it ran fine. However, you might notice that Prey performance is less than spectacular. This is because the 91.31 drivers don’t have an SLI profile for Prey. Although there is a beta driver on nZone.com that adds such a profile, I don’t like to use beta drivers unless I have to. We can take a pretty good guess at Prey performance by looking at the Quake 4 figures, but this does illustrate a key problem with the 7950 GX2 – if there is no SLI profile, you’ll get considerably lower performance.
The X1950 XT-X had problems of its own. I would erratically get a solid green screen. Our testing suite runs timedemos for quite a few hours and switches in and out of 2D/3D modes a lot, and it seems the service that ATI uses for switching between 2D and 3D clocks was not happy with the speed at which we were doing this. I solved this by forcing the 3D clocks on as default and turning off the service. This is not the first time I’ve had issues like this, so I’m hoping ATI will take a look at the service.
So how was performance? The X1950 XT-X was generally 5-10 per cent faster than the X1900 XT-X. Ignoring the Prey results, which do little more than illustrate what happens when there isn’t an SLI profile, the 7950 GX2 was consistently faster than the X1950 XT-X, sometimes by considerable amounts. At lower resolutions things often went ATI’s way, but at the resolutions that matter, with FSAA and AF switched on, nVidia pulled away.
However, frame rates are not everything – we all know there is more to the story. Firstly, there is no denying that ATI’s filtering methods give considerably better image quality than the competition’s, and I’d sooner sacrifice a few frames per second for this. Secondly, ATI’s hardware supports full-precision HDR and FSAA simultaneously, something nVidia’s hardware can’t do. Granted, only three titles on the market can take advantage of this (Oblivion, Black & White 2 and Far Cry), but I’m sure the list will get longer.
ATI has announced X1950 XT-X CrossFire cards, which means there is an easy upgrade path that should reap considerable benefits. Although you can now add a second 7950 GX2 to a machine for Quad SLI, there are diminishing returns when scaling to four GPUs.
Finally, there is price. Overclockers.co.uk has several cards at around the £330-£350 mark including VAT, which is phenomenal for a top-end card. In comparison, the 7950 GX2 is around £360-£380 including VAT.
It’s almost silent, the drivers are better, and it has a better feature set, a better price and better image quality – and it’s ready to be paired with a CrossFire edition card. The 7950 GX2 may be slightly faster and doesn’t need a specific card to run in SLI, but if it was my choice, I know which one I’d be taking home to meet the parents.