
nVidia Unveils 'Fermi' Next Generation GPU Architecture

Gordon Kelly



Looks like GPU technology is ready for another major leap...

Late last night, nVidia announced its first major architecture upgrade since the G80 nearly three years ago. It goes by the (unexplained) name of 'Fermi' (a dedication to physicist Enrico Fermi, perhaps?) and will feature a mind-blowing 512 cores and three billion transistors. Those cores are grouped into 16 streaming multiprocessors of 32 cores each, while GDDR5 memory will be its drug of choice.
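The arithmetic behind that headline core count is easy to sanity-check. A minimal sketch (figures as announced by nVidia; the constant names here are purely illustrative):

```python
# Fermi's announced layout: 16 streaming multiprocessors (SMs),
# each containing 32 cores.
SM_COUNT = 16
CORES_PER_SM = 32

total_cores = SM_COUNT * CORES_PER_SM
print(total_cores)  # 512
```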

nVidia's 'Parallel DataCache' technology and 'GigaThread' engine also make their debut, and it's the first GPU to support ECC (error-correcting code) memory. So while Fermi will form the heart of future GeForce and Quadro lines, if you haven't guessed already, we're not just talking pure gaming but GPU computing too.

"Fermi delivers supercomputing features and performance at 1/10th the cost and 1/20th the power of traditional CPU-only servers," nVidia proclaims on the Fermi official site page.

The first real-world graphics card to feature Fermi is likely to be the 'GT300' series, and while we don't yet know pricing or availability, it's safe to say ripping video and playing Crysis won't cause it any problems!


nVidia Fermi Architecture


October 1, 2009, 5:18 pm

*Cough* NVIDIA not nVidia *cough*


October 1, 2009, 5:32 pm

@Mik3yB - we don't write all caps company names unless they are acronyms. It'll always be nVidia to me ;)


October 1, 2009, 5:57 pm

Looks like a pretty weak paper launch to compete with the very real 5870 from ATi.

Though the actual content is decent enough, Nvidia definitely need to do this to stay relevant in the next 5-10 years. The trouble is that they need to stay relevant in the near term to ensure they survive. Difficult balancing act, but I'm sure they'll eventually merge with VIA and produce an x86 chip worth talking about. Whether that will be enough to compete with the AMD-Intel duopoly though is uncertain.



October 1, 2009, 10:15 pm

1/20 the power?

Easily misunderstood, methinks! ;¬)


October 1, 2009, 10:41 pm

It's nVidia, trust me. Also: ATi not ATI =)


October 2, 2009, 2:00 am

I've always said "En-Vidia" (nVidia). "Ne-Vidia" doesn't sound right to me.


October 2, 2009, 1:45 pm

@ Hugo. Last time I spelt it the old way, one of the boys from NVIDIA corrected me... I was only trying to prevent them from sighing at you too haha :D


October 2, 2009, 4:19 pm

Nvidia are playing a very clever game. With CUDA and PhysX they have me. When I played through Batman (it rocks) I wanted that tiny extra immersion that PhysX brings. I know only a few games support it, but added in with CUDA (I know ATi has something similar) it just means I'm sticking with them. So long as the price is competitive (I got my 260GTX with Batman for £125) I will buy Nvidia over ATi.


October 4, 2009, 6:14 am

I had one of those faulty NVIDIA GPUs, so as long as ATi make comparable GPUs, I'll always buy ATi over NVIDIA. Plus, I think the ATi cards look way better than NVIDIA's offerings.

Hamish Campbell

October 5, 2009, 2:15 pm

heh, I wanted to see what the name meant, and interestingly Wikipedia spells it Nvidia consistently while mentioning that the company themselves write NVIDIA in their material.
