nVidia GeForce GTX 470 Fermi Review

- Page 1: nVidia GeForce GTX 470 Fermi
- Page 2: Fermi Architecture
- Page 3: Fermi Architecture Cont.
- Page 4: The Cards
- Page 5: Test Setup
- Page 6: DirectX 9 and DirectX 10 Gaming
- Page 7: DirectX 11 Gaming
- Page 8: Power Consumption and Noise
- Page 9: Results Analysis and Conclusions

Review Price: £319.99
It’s a tale as old as the industry that the fortunes of one tech company wax while those of another wane. Trends come and go, one technology supersedes another, one left-field, blue-sky idea takes off while another crash lands. It’s no surprise, then, that the two biggest graphics card manufacturers of the last decade have seen their fortunes rise and fall. Of late, it’s been ATI that’s had a good time of it thanks to its Radeon HD 5000 series of graphics cards. For just shy of six months these have been the clear choice thanks to class-leading performance, features, and power consumption, and of course they’ve been the only DirectX 11-compatible cards on the market. Finally, however, ATI no longer has the DirectX 11 party all to itself, as nVidia has launched the GTX 480 and GTX 470, two cards based on its latest chip technology, codenamed Fermi.
On paper, both these cards look like they should be well on course to take performance to a new level. The GTX 480 has 480 stream processors, double that of nVidia’s previous top-of-the-range card, the GTX 285, while the GTX 470 has 448. Coupled with an upgrade from GDDR3 to GDDR5 memory and a host of architectural changes, these cards seem to have everything they need to catch up with, or overtake, ATI’s best.
Normally, when a new range of graphics cards arrives, we look at the flagship part first and take that opportunity to analyse the underlying architecture as well. However, because nVidia has a limited number of review samples, we’ll actually be looking at the slower GTX 470 in this review. We will still, however, take an in-depth look at the overall Fermi architecture that powers this card and will underpin nVidia’s entire range for the foreseeable future.