
Today sees the launch of AMD's latest top-of-the-range DirectX 11 graphics card, the ATI Radeon HD 5970 (codenamed Hemlock). Like AMD's previous high-end cards, it uses two graphics chips housed on a single board, linked by an internal CrossFire interface, to deliver the ultimate in performance. Since AMD's current top-of-the-range single-chip card, the Radeon HD 5870, is already one of the fastest in the world, this new card is sure to take performance to the next level. Shortly we'll see if it does, but first, a bit of a history lesson.

For the last few generations of its graphics cards, AMD's strategy has been to create a chip that competes in what it sees as the more lucrative mid- to high-end market of £200-£300, rather than the ultra-high-end of £400+. It then uses two of these chips to create a single monster of a card to compete at the very top end. The same architecture is then scaled down to create lower-end cards. In contrast, nVidia has aimed its previous top-end chips much higher, creating much more expensive and faster single-chip cards. This is still nVidia's strategy, as its latest 'Fermi' architecture proves.

The result is that AMD has had to bring its dual-chip cards to market simply to compete with nVidia's top-end single-chip cards (not to mention nVidia's dual-chip cards). This time around, however, the situation is slightly different. While the HD 5870 still isn't the out-and-out fastest card - the nVidia GeForce GTX 295 and ATI Radeon HD 4870 X2 (both dual-chip cards) still trade places with it for the top spot - it's the clear choice of the three thanks to its support for DirectX 11, its lower power consumption, and the better compatibility of its single chip (while CrossFire/SLI support is getting ever better, some games still have issues with dual-chip solutions). As such, the pressure on AMD to release the HD 5970 is limited right now; it could easily sit back and let the HD 5870 reap its rewards for a while longer. Nevertheless, AMD has chosen to launch the card already, so let's take a closer look.

The HD 5970 uses two Cypress chips mounted on a single board, with 1GB of GDDR5 memory for each chip. The GPUs run at 725MHz while the memory is clocked at 1GHz (effectively 4GHz), which theoretically means this card's performance will sit somewhere between two HD 5870s in CrossFire and two HD 5850s in CrossFire - it has the same number of SIMDs as the 5870 but the clock speed of the 5850.
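That positioning follows directly from the arithmetic of peak shader throughput. A minimal sketch, assuming the published Cypress specifications rather than figures from this article (1,600 stream processors for the full chip, 1,440 for the HD 5850, an 850MHz core clock for the HD 5870):

```python
# Peak single-precision throughput: each stream processor can issue a
# fused multiply-add (2 FLOPs) per clock.
def peak_gflops(stream_processors: int, clock_ghz: float) -> float:
    return stream_processors * 2 * clock_ghz

hd5870 = peak_gflops(1600, 0.850)          # full Cypress, full clock
hd5850 = peak_gflops(1440, 0.725)          # cut-down Cypress, lower clock
hd5970_per_gpu = peak_gflops(1600, 0.725)  # full Cypress at the 5850's clock

# Per GPU, the HD 5970 lands between the two single-chip cards.
assert hd5850 < hd5970_per_gpu < hd5870
print(hd5870, hd5850, hd5970_per_gpu)  # 2720.0 2088.0 2320.0
```

Doubling the per-GPU figure gives the card's theoretical total, though in practice CrossFire scaling is never quite 100 per cent.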

Physically, the card is something to behold: at 12.2in (310mm) long and weighing 1.2kg, it is simply colossal. This immense length means the card overhangs the back of a standard ATX motherboard by 2.5in (6.5cm) and as such won't fit in many cases, so check there's space before opting for this card (incidentally, the ATX specification allows for cards up to 13.3in long, so AMD isn't breaking any rules here).

Conversely, by picking prime examples of its Cypress chips and keeping clock speeds down, AMD has managed to keep the HD 5970's total power consumption below 300W, so it requires only an 8-pin and a 6-pin auxiliary power connector rather than two 8-pin connectors. This means it stands a better chance of being compatible with your power supply.
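That 300W ceiling maps neatly onto the PCI Express power budget: the slot itself supplies up to 75W, a 6-pin connector another 75W, and an 8-pin connector 150W. A quick sketch of the sums, using those CEM specification limits (which the article itself doesn't quote):

```python
# Maximum power draw per source, in watts, per the PCIe CEM specification.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

hd5970_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # the HD 5970's layout
dual_8pin_budget = SLOT_W + 2 * EIGHT_PIN_W       # the layout it avoids

print(hd5970_budget, dual_8pin_budget)  # 300 375
```

Staying within the 300W envelope is what lets the card work with power supplies that offer only one 8-pin PCIe lead.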



November 18, 2009, 11:43 pm

that poor motherboard...


November 19, 2009, 12:00 am

FYI - Looks like the page numbering has gone a bit funny here, guys.


November 19, 2009, 2:08 am

@Tobeman: Not sure what you mean?

tom 6

November 19, 2009, 3:36 am

You have messed up the comparison table - the third column has the wrong figures in the wrong places.


November 19, 2009, 5:24 am

Thanks for including noise tests - looks like the Nvidia cards are a fair bit louder when idle.


November 19, 2009, 2:25 pm

@tom: Thanks, I've updated it now.

George 13

November 19, 2009, 5:09 pm

From your graphs I can see that the ATI HD 4870 X2 matches and even outperforms the GTX 295 in some cases. Is this due to new drivers from ATI?


November 19, 2009, 5:18 pm

@George: Not especially; it's long competed with the GTX 295 in our tests. There is a great deal of variation between games, though - something that's particularly well demonstrated by comparing Far Cry 2 to Crysis.

George 13

November 19, 2009, 10:45 pm

Thanks Ed. What is great, though, is the gain of 31fps at 2560x1600 with 4xAA in Call of Duty since the last test.


November 19, 2009, 10:54 pm

Yeah, obviously performance is going to have improved since over a year ago when that card was brand new.


December 4, 2009, 4:39 am

You gave a £500 graphics card a 7 for value. Are you kidding me! At university 70% is a first, the highest grade boundary you can get! That would be a B at A level, but you give it to the most expensive piece of equipment on the market!? This is insanity.


December 4, 2009, 9:51 am

@Kaiser202 - I completely understand where you're coming from on both counts. In defence I'll say that value isn't the same as affordability. We feel this card has exceptional performance and is therefore reasonably justified in its premium price.

That said, I also feel quite strongly about the notion of scoring. In my personal opinion 5 means average, and I'd certainly like to see our scores adjusted to represent that. The problem is it would create a massive inconsistency with our back catalogue of thousands of reviews over the years, so it may not be feasible. After all, the world would decimalise time if it were remotely practical!


December 5, 2009, 7:59 pm

@ Gordon

I understand, but then maybe the best response would be to completely revamp the ratings system. Have a different approach to other websites as well: maybe a graded system (like school A*-F) or a class-based system with quirky names to indicate levels of awesomeness. For example, you already have Editor's Choice, but you could add lower-grade ones like 'Wallet Buster' for the 'inexpensive but good' reviews, and 'PowerHouse' or 'Bragging Rights' for the expensive but awesome things - like this graphics card, which costs about the same as the computer I built two years ago (and that still plays most games at 1920x1200).

This way you could avoid inconsistency. Hope this is enlightening, I look forward to seeing the whole website change based on my recommendations! :P


December 5, 2009, 10:48 pm

@Kaiser202 - quite possibly ;) We've discussed these and many more. Scrapping all scores apart from an overall could well be the answer. Either way, many changes are coming in 2010!
