
nVidia GeForce GTX 295 Revealed


We all knew it was only a matter of time before nVidia launched a successor to the GeForce 7950 GX2 and GeForce 9800 GX2 based on its GT200 chip, the same chip that powers the GeForce GTX 280, to take on the AMD ATI Radeon HD 4870 X2 - currently the fastest graphics card available to buy. With the launch of the GTX 295, nVidia will be snatching back that performance crown.

On paper the GeForce GTX 295 looks like this:

  • Fabrication Process: 55 nm

  • Core Clock (texture and ROP units): 576 MHz

  • Shader Clock (Stream Processors): 1242 MHz

  • Memory Clock (Clock rate / Data rate): 2000 MHz

  • Total Video Memory: 1792 MB

  • Memory Interface: 448-bit per GPU

  • Total Memory Bandwidth: 224 GB/s

  • Processor Cores: 480

  • ROP Units: 28 per GPU (56 in total)

  • Texture Filtering Units: 160

  • Texture Filtering Rate: 92.2 GigaTexels/sec

  • Connectors: 2 x Dual-Link DVI-I, 1 x HDMI

  • Max Board Power (TDP): 289 W
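As a quick sanity check, the headline bandwidth and fill-rate figures above follow directly from the listed clocks and bus widths. A minimal sketch in Python (the function names are my own, not anything official):

```python
# Sanity check of the headline numbers in the spec list above.
# All inputs come straight from the bullet points; only the formulas are added.

def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_mhz: int) -> float:
    """Bandwidth = (bus width in bytes) x (effective data rate)."""
    return (bus_width_bits / 8) * (data_rate_mhz * 1e6) / 1e9

def texture_fill_rate_gtexels(units: int, core_clock_mhz: int) -> float:
    """Fill rate = texture filtering units x core clock."""
    return units * core_clock_mhz * 1e6 / 1e9

per_gpu = memory_bandwidth_gb_s(448, 2000)   # 112.0 GB/s per GPU
total = 2 * per_gpu                          # 224.0 GB/s for the whole card
fill = texture_fill_rate_gtexels(160, 576)   # 92.16 GigaTexels/sec

print(per_gpu, total, fill)
```

The quoted 224 GB/s is the sum across both GPUs, and the 92.2 GigaTexels/sec figure is simply 160 texture units multiplied by the 576 MHz core clock, rounded up slightly.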

The individual GPUs used to make the GTX 295 are a sort of hybrid between the GTX 280 and GTX 260, in that the clock speeds and memory interface match the latter, but the number of stream processors is the same as the former. The shrink to 55nm is an important one, as it is almost certainly the only reason nVidia was able to fit two GT200 chips in such close proximity, thanks to the lower heat output of the smaller dies. And, of course, it means that this two-in-one card draws less power than two separate GTX 280s in SLI, which it should be performance competitive with. The 55nm process also means nVidia can get more chips out of each silicon wafer it uses, meaning better profitability and, hopefully, not too expensive a price tag on the GTX 295.

Against the 4870 X2, nVidia has two advantages. First, nVidia's top-end single-GPU card is faster than AMD's, and second, nVidia's SLI dual-GPU performance generally scales better than AMD's CrossFire. The upshot should be a pretty decent performance advantage nVidia's way. We'll have to get a sample in to benchmark to see if that prediction holds true, though.

Add Quad SLI support and PhysX, if such things are of interest, and nVidia has a pretty compelling offering. I can only hope the lack of an MSRP so far isn't an ominous portent.


I've just heard back from our friendly local nVidia representative. The US MSRP is $499 (£325), which is really very good and about in line with the 4870 X2 over there. On that basis, it seems likely we'll be looking at the £400 region when the card is available here - not too bad, all things considered.


December 18, 2008, 7:30 pm

Just to stay consistent with the reported specs, it should state that the card has 56 ROPs. The card has 28 ROPs per GPU and thus 56 ROPs on the card in total.

The card had better be quite a bit faster than the 4870 X2; NVIDIA has had more than enough time to prepare a proper counterattack. As for pricing, rumors indicate an MSRP of $499, which is lower than what the 4870 X2 currently has. The price of a 4870 X2 will probably be lowered upon the GTX 295's launch - that's just my prediction, though.


December 18, 2008, 11:49 pm

*ati is unaffected by nvidia's attack*

*ati unleashes hd 4970/4870 x3/x4*

*nvidia fainted!*


December 19, 2008, 12:55 am

289W TDP! That'll take some cooling. Hope it's not the Nvidia Dustbuster mk2.


December 19, 2008, 5:30 am

Frankly, at 289W TDP, that's a lot less than the 4870 X2, and may even be less than my single 4870 512MB. Considering it's two GPUs on one card, 289W is a worthy achievement.


December 19, 2008, 4:07 pm

Any graphics card guzzling the best part of 300W is no achievement in my eyes. The fact that the 4870 X2 sucks up 265W flat out (X-bit Labs) is bad enough, but that it uses 80W idling is deplorable. Intel and AMD realised a while back that the 'CPU nuclear arms race' was reaching meltdown and have concentrated on efficiency as well as performance. Cool and quiet seems to have passed Nvidia and ATI by.


December 19, 2008, 7:37 pm

>The fact that the 4870 X2 sucks up 265w flat out (xbit labs) is bad enough but that it

>uses 80w idling is deplorable

I think the 265W flat out isn't a big problem, but on the 80W while idle I would agree.

If you don't want 265W flat out, then underclock the card. It's horses for courses, I suppose: do you want fast frame rates, or to save the planet? Maybe Nvidia can borrow Intel's 45nm manufacturing lab to bring the wattage down a tad.


December 19, 2008, 11:29 pm

Indeed. What I am really waiting for is a decent passive card from the current generation for my living room machine.


December 20, 2008, 1:56 am

I know Gigabyte do a passively cooled 1GB 4850 (model no. GV-R485MC-1GH).

As far as the 4870 X2's power consumption goes, I was going on Bit-tech's figures, where its load consumption was measured at a mind-boggling 483W, with the 4870 512MB coming in at 333W. That's why I thought Nvidia's 289W sounded so good. Must be different testing methodologies between Bit-tech and X-bit Labs, though.


December 23, 2008, 4:09 am

Were you factoring in the fact that Bit-tech takes the power reading at the wall socket and reports it as-is? They do this deliberately, as there's no sure-fire way of knowing what the power consumption without a graphics card is - you need a card of some sort to boot the machine far enough to take a reading! You could do a relative measurement, but then that wouldn't give you the card's total output!
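To illustrate why the two sites' numbers can both be right, here's a rough sketch of converting a wall-socket reading into a card-only estimate. The PSU efficiency and rest-of-system figures below are assumptions for illustration, not measurements from either site:

```python
# Rough sketch: why whole-system wall readings run far higher than
# card-only figures. PSU_EFFICIENCY and REST_OF_SYSTEM_W are assumed
# values for illustration, not measurements from Bit-tech or X-bit Labs.

PSU_EFFICIENCY = 0.82    # assumed AC-to-DC efficiency of a typical PSU
REST_OF_SYSTEM_W = 130   # assumed DC draw of CPU, motherboard, drives, etc.

def card_only_estimate(wall_reading_w: float) -> float:
    """Convert a whole-system wall reading into a rough card-only figure."""
    dc_total = wall_reading_w * PSU_EFFICIENCY   # strip PSU conversion losses
    return dc_total - REST_OF_SYSTEM_W           # strip the rest of the system

# Under these assumptions, Bit-tech's 483 W wall figure lands in the same
# ballpark as X-bit Labs' card-only 265 W measurement:
print(round(card_only_estimate(483)))  # ~266
```

With plausible numbers plugged in, a 483W wall reading and a 265W card-only figure aren't actually in conflict at all.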


December 26, 2008, 8:06 am

No, I wasn't - thanks for the explanation. Talking of Bit-tech, why no cheesecake reviews on TR this Christmas? Where do I turn to for an alternative opinion? Ah well, I doubt anyone will read this now with this news story being so old :)
