
ATI HD 4670 review




  • Recommended by TR





We were suitably impressed when ATI released its last range of high-end graphics cards. The HD 4870, HD 4850, and HD 4870 X2 all delivered impressive performance and, most importantly, did so at a decent price. However, relative value is one thing, but if you simply can't afford, or can't justify, spending more than £50 just to play the latest 3D games, then you'll be wanting something a little cheaper. Something like this ATI HD 4670, in fact.

It's based on the same basic Graphics Processing Unit (GPU) design that lies at the heart of the 48x0 series of cards but, as you'd expect considering its price, there's simply less of all the processing bits inside. So, rather than 800 stream processors, the HD 4670 features just 320, and instead of 16 ROPs it has only eight. It also uses the more conventional GDDR3 memory found on the HD 4850, rather than the super-fast GDDR5 used on the HD 4870, and the GPU talks to this memory over a narrower 128-bit interface. Core and memory clock speeds also differ, though these will vary depending on the exact configurations card manufacturers release - expect plenty of exotic overclocked versions.

As well as the HD 4670, there is the HD 4650, which uses the same GPU but pairs lower core clock speeds with slower DDR2 memory.
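To put these memory differences in perspective, peak memory bandwidth is simply the bus width multiplied by the effective transfer rate. The snippet below is a generic illustration of that arithmetic - the 128-bit/2,000MT/s figures are a hypothetical GDDR3 configuration used for the example, not a confirmed specification for any particular board, and real clocks will vary between manufacturers.

```python
def peak_bandwidth_gbs(bus_width_bits, effective_rate_mts):
    """Peak memory bandwidth in GB/s.

    bus_width_bits: memory interface width in bits (e.g. 128 or 256)
    effective_rate_mts: effective transfer rate in MT/s
                        (double the clock for GDDR3, quadruple for GDDR5)
    """
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_rate_mts * 1e6 / 1e9

# Hypothetical GDDR3 setup: 128-bit bus, 1,000MHz clock (2,000MT/s effective)
print(peak_bandwidth_gbs(128, 2000))  # → 32.0
```

Halving the bus width halves the bandwidth at the same clock, which is why a cheaper card's narrower interface matters as much as its memory type.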

In terms of features there's little to distinguish the HD 4670 from its more powerful brothers. As well as the 3D graphics processing elements, you get ATI's Universal Video Decoder (UVD) and Avivo video processing, which accelerate and improve the quality of video playback. There's also support for the latest edge-detect anti-aliasing mode, which offers by far the best image quality we've seen on any graphics card.

If you're looking to spread the cost of your graphics upgrade, you can buy one of these cards now, then in a few months' time buy another and take advantage of CrossFireX to use the power of both cards together - assuming you have a CrossFire-capable motherboard, of course.

We're always a little sceptical of these dual/multi-card configurations, due to their potential problems with compatibility and stability, and given the choice of two cards or a single one for the same price we'd always recommend going for the latter. However, in such a price-sensitive market this route makes sense. We'd certainly rather buy one card now and have some gaming ability in time for Christmas, then save for a few more months and improve performance further, rather than save the entire time.


October 20, 2008, 4:34 pm

another great review! any chance of seeing 3DMark06 benchmark results in future graphics card reviews?


October 20, 2008, 5:11 pm

Well, it was something we'd intentionally dropped as many think of it as irrelevant. However, if people would like to see it I can happily reinstate it.


October 20, 2008, 7:39 pm

just a question: i still have an old Pentium 4 (3.2GHz) with 1GB DDR RAM as my main fail-safe "in case anything goes wrong i'm still ok" pc. my motherboard has a single PCI Express slot. will i ever get similar frame rates in something like Race Driver: GRID, or is it just as dependent on the cpu as it is on the gpu?

just asking ;-)


October 20, 2008, 8:57 pm

You should be absolutely fine. Most games are more dependent on the graphics card than the CPU. Does your P4 have Hyper-Threading?


October 20, 2008, 9:09 pm

I have the Sapphire implementation of this card and it's loud; it's not a screeching noise, just a very loud whirring from the fan, and that's with just a web browser going - no gaming or strenuous work at all. The cooler also blocks the use of the PCIe x1 slot immediately next door.


October 20, 2008, 9:32 pm


yes. it's the old Prescott chip, too, so sometimes i feel it'll burn my house down.

i just splashed out just over a grand on a new MacBook Pro, and i've been on a laptop for a very long while. but every now and then i need a little more oomph, and my GeForce 6600 doesn't cut it anymore.

Martin Daler

October 20, 2008, 9:43 pm

excuse my slightly philistine outlook here, but what exactly are these graphics cards doing at idle in order to burn over 100W? I mean, a 100W light bulb gets pretty darn hot, so I guess having one of these cards in your PC is like having a 100W light bulb burning away inside. No wonder they need a fan. I just can't get my head around what (watt) goes on inside to expend all that energy when they are idle. Let's not talk about the 200W-plus when they are thinking...


October 21, 2008, 1:11 am


That 100W isn't just the card; it's the whole system, which uses a whole load of other high-end components. Also, because we measure the power draw at the plug socket, you can straight away discount 15-20 per cent of that figure, as it's lost through the conversion process in the power supply.

With a more modest system you'd be looking at more like (and this is completely off the top of my head) 60W.
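That wall-socket adjustment is straightforward arithmetic: multiply the measured draw by the supply's efficiency to estimate what the components actually receive. A minimal sketch, assuming the 15-20 per cent conversion loss mentioned above (i.e. 80-85 per cent efficiency - an assumption, since real PSU efficiency varies with load):

```python
def dc_power_estimate(wall_watts, psu_efficiency):
    """Estimate the DC power delivered to components from a wall-socket
    reading, given the PSU's efficiency as a fraction (e.g. 0.85)."""
    return wall_watts * psu_efficiency

# 100W measured at the socket, PSU assumed 80 per cent efficient
print(dc_power_estimate(100, 0.80))  # → 80.0
# The same reading with an assumed 85 per cent efficient supply
print(round(dc_power_estimate(100, 0.85), 1))  # → 85.0
```

So a 100W wall reading implies the card and the rest of the system are sharing only 80-85W between them.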

As for their power consumption under load: do you have any idea of the computing power required to perform real-time 3D rendering? It's phenomenal what these modern-day cards can do.


Sounds like this card could be a good option for you.


Sounds like the card you've got uses a dual slot cooler, which the card we tested doesn't. Not sure what the loud fan problem is though.


October 21, 2008, 1:42 am

Hi, I have a PCI Express 1.0 motherboard and I was planning to buy this card, but the review says it needs a PCI Express 2.0 slot to get all its power from there.

Do you think it will still work on a 1.0 slot?


October 21, 2008, 4:22 am

Usually a PCI-Express 2.0 card works fine in a 1.0 slot.

I'm using an ATI 4850 in a PCI-E 1.0 slot. Works fine.

Jeremy Betteridge

October 25, 2008, 3:23 am

I'm upgrading from an X600 on a Dell Dimension 9150 with 4GB RAM. Will my machine cope with this card okay, do you think, or should I look at the 3650 or 2600? Great reviews by the way.


July 20, 2011, 2:24 am

Can anyone give me a link to a shop with this exact model?
