
The release of the G80-based nVidia GeForce 8800 GTX in November 2006 was, in hindsight, a paradigm shift in the computer graphics world. Not only was it the first DirectX 10 graphics card, but it also completely blew away the competition in DirectX 9 games before Windows Vista arrived, and it held this lead for an unprecedented amount of time even once Vista and DirectX 10 did finally turn up. Indeed, not until February of this year, when AMD released the ATI Radeon HD 3870 X2, which used two RV670 cores on one card, did the G80 in its various guises face any real competition.

Not that this competition lasted long. Within a few weeks nVidia released the dual-G92-based 9800 GX2, which used a similar two-chips-on-one-card approach to the HD 3870 X2, and comfortably regained the top performance crown - at least in the games it worked with. nVidia then followed this up with the 9800 GTX, which used a single G92-based chip to marginally extend nVidia's performance lead in the single-chip graphics card market. Of course, ATI still had many good cards and competed very fiercely in the mainstream sub-£150 market, but it just couldn't claim the top spot.

Still, while nVidia held onto its lead with the 9800 series cards, it didn't really push forward any frontiers. Performance was good but not overwhelming and, while new features like HybridPower are useful, the whole range felt a little disappointing.

Just a few months later, though, nVidia has launched a brand new graphics processor called GT200 that, at least on paper, looks like it should have all the performance necessary to be a true successor to G80. Made up of 1.4 billion (yes, that is billion with a 'B') transistors, packing in 240 stream processors, 32 ROPs, a 512-bit memory interface, and a whole host of other under-the-bonnet improvements, it is an absolute monster. In fact, as it is still made on the same 65nm process used for G92, it is not just a monster internally but externally too - at 24 x 24mm this is the biggest single die TSMC has ever commercially produced.

Indeed, if you look at a typical production silicon wafer, which is 300mm in diameter, there is at most room to produce 94 GT200 chips on each one. Compare this to something like Intel's Conroe CPUs, which are built on the same size manufacturing process but are only 143mm² in size, so 426 dies can be produced per wafer, and you get some idea of just how large and expensive GT200 is.
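For those who want to check the arithmetic, the standard first-order dies-per-wafer approximation (gross wafer area divided by die area, less a correction for partial dies lost around the edge) reproduces these figures reasonably well. Here's a minimal sketch in Python - this is a textbook approximation, not TSMC's actual yield model, so exact counts depend on edge-exclusion and scribe-line assumptions:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order approximation: gross wafer area divided by die area,
    minus a correction term for partial dies wasted around the edge."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

print(dies_per_wafer(300, 24 * 24))  # GT200 at 576mm²: 94 dies
print(dies_per_wafer(300, 143))      # Conroe at 143mm²: ~438 dies
```

The GT200 figure lands exactly on the 94 quoted above; the Conroe result comes out a little higher than 426, which presumably reflects a stricter edge-exclusion zone in the real production numbers.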

Two variants of the GT200 will be available at launch, and these will be the first parts to take on nVidia's revised branding. All the rebranding boils down to is a switching around of letters and numbers, so the new cards are called GTX 280 and GTX 260 rather than following the x000 GT/GTX/GTS sort of arrangement we're used to.

The GTX 280 will use the full extent of GT200, with its shader clock running at 1296MHz, 1GB of GDDR3 memory running at 1107MHz (2.2GHz effective), and the rest of the chip purring away at 602MHz. As the power requirement for all this will be 236W, the GTX 280 will need not only a conventional six-pin PCI-Express connector but an extra eight-pin one as well.

Meanwhile, the GTX 260, which will be released on the 26th of June (the GTX 280 will be available by the time you read this), has two SM clusters disabled (I'll explain more about this on the next page) and one ROP partition removed. As well as this, the clock speeds have been throttled, resulting in vital statistics of 192 shaders running at 1242MHz, 28 ROPs running at 576MHz, and 896MB of GDDR3 memory at 1000MHz. As a result of these speed and component cuts, the GTX 260 will draw less power - 183W to be exact - and consequently gets by with six-pin PCI-Express power connectors, with no eight-pin required.
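To put those memory clocks in perspective, peak bandwidth is just the effective (double-pumped) data rate multiplied by the bus width. A quick sketch of the sums - note that the GTX 260's 448-bit bus width is inferred from its 896MB/28-ROP configuration, as it isn't stated explicitly above:

```python
def peak_bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: GDDR3 transfers data twice per
    clock (DDR), moving bus_width_bits / 8 bytes per transfer."""
    effective_mt_s = mem_clock_mhz * 2  # mega-transfers per second
    return effective_mt_s * (bus_width_bits / 8) / 1000

print(peak_bandwidth_gb_s(1107, 512))  # GTX 280: ~141.7 GB/s
print(peak_bandwidth_gb_s(1000, 448))  # GTX 260: ~112.0 GB/s
```

That bandwidth deficit, on top of the shader and ROP cuts, is the main on-paper difference between the two cards.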

List pricing is as astronomical as you would expect, with the GTX 280 demanding £449 and the GTX 260 £299. What's more, early indications suggest stores won't deviate from this pricing much by running discounts or trying to undercut each other. Never mind, eh.

We'll take a proper look at the GTX 260 in a separate article, and we'll have a poke around the physical GTX 280 card in a few moments, but first let's look at what makes nVidia's latest graphical wonder tick.



June 26, 2008, 3:33 am

"What can we say about Counter-Strike: Source that hasn't been said before? It is simply the benchmark for team-based online shooters and, four years after its release, it's still the most popular game in its genre."

I would argue against that one; a quick look at the stats shows COD4 at about 9 million, COD2 at 5 million and CS:S at 2 million minutes played today.

Otherwise a very interesting article; the only thing it makes me sad about is the size of my wallet :)


June 26, 2008, 1:19 pm

Okay, it's obviously taken a hit in recent years. I'll amend that line. Still doing damn well for such an old game, though.


June 27, 2008, 6:20 am

Well, Call of Duty is a full production game, with a single-player mode and many different versions of multiplayer... CS is a mod; it is not a standalone game. And there is only one mode of play (i.e. there are no capture the flag or free-for-all modes).


June 27, 2008, 7:17 pm

Sorry, I'm not sure what your point is, Intex?


June 27, 2008, 11:16 pm

I thought this was a fantastic review, Ed, very in-depth and informative. Looking forward to a GTX 260 review, as there is no way I'm shelling out 400 quid on a graphics card! The 260s seem to be going for £250-300 at the moment... which is still very high in my book, but tempting given the potential performance gains over my current 7900GT KO.

P.S. Ed - I think Intex was replying to Exitialis, justifying the currently lower usage stats of CS:S.

Varis Vitols

July 3, 2008, 4:25 pm

life said on 27th June 2008

In that case, why don't you have a look at the Radeon HD 4870? It outperforms the GTX 260 in almost every case, particularly with AA enabled - by 20 percent. In many cases it stands very close to the GTX 280 with AA enabled, but costs only £185-230 at online stores, depending on the manufacturer.


July 3, 2008, 11:43 pm

What are your comments on the explosive heat and noise that the GTX 280 generates?

Flight Instructor

March 27, 2009, 6:49 pm

I like your reviews, in particular when they refer to games hardware, but I only use flight simulators. Is it possible to include this type of game when reviewing hardware?
