
Nvidia VP Claims Moore's Law Is Dead

Gordon Kelly



It is well known - both commercially and legally - that Intel and Nvidia have been at each other's throats for some time now, and this isn't likely to improve relations...

Speaking out in a column on Forbes.com, Nvidia vice president Bill Dally has announced that "Moore's Law is now dead." For those in need of a backgrounder: Gordon Moore is Intel's much-lauded co-founder, who predicted in a 1965 research paper that the number of transistors on an integrated circuit could be doubled inexpensively roughly every year - an interval he revised to every two years in 1975. The law is often misquoted as the 'speed' of processors (similar, but not the same thing) doubling every 18 months.

Remarkably, Moore's Law has held largely true for the last 45 years and is spoken about with great reverence at Intel, so what has Dally been saying? In short: that we need to move into an era of parallel processing instead.

"We have reached the limit of what is possible with one or more traditional, serial central processing units, or CPUs," he explains, describing serial versus parallel operation as akin to one person adding up a word count versus many people each counting a paragraph and then adding those numbers together.
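Dally's word-count analogy maps directly onto code. A minimal sketch (the paragraphs and function names here are illustrative, not from the column): one worker walks every paragraph in turn, or a pool of workers each counts one paragraph and the partial totals are summed.

```python
from multiprocessing import Pool

PARAGRAPHS = [
    "the quick brown fox jumps over the lazy dog",
    "pack my box with five dozen liquor jugs",
    "sphinx of black quartz judge my vow",
]

def count_words(paragraph):
    # One person counting one paragraph.
    return len(paragraph.split())

def serial_count(paragraphs):
    # A single worker handles every paragraph in turn.
    return sum(count_words(p) for p in paragraphs)

def parallel_count(paragraphs):
    # Many workers count their paragraphs at once; the partial
    # counts are then added together, as in Dally's analogy.
    with Pool() as pool:
        return sum(pool.map(count_words, paragraphs))

if __name__ == "__main__":
    # Both strategies agree on the answer; only the wall-clock
    # time differs as the input (and core count) grows.
    assert serial_count(PARAGRAPHS) == parallel_count(PARAGRAPHS) == 24
```

The catch, as the rest of the article notes, is that the work must actually be divisible into independent chunks for the parallel version to pay off.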

"Going forward, the critical need is to build energy-efficient parallel computers, sometimes called throughput computers, in which many processing cores, each optimized for efficiency, not serial speed, work together on the solution of a problem. A fundamental advantage of parallel computers is that they efficiently turn more transistors into more performance. Doubling the number of processors causes many programs to go twice as fast. In contrast, doubling the number of transistors in a serial CPU results in a very modest increase in performance - at a tremendous expense in energy."

Uncannily enough, Nvidia's GPUs are well versed in parallel operation with Dally pointing out "Every three years we can increase the number of transistors (and cores) by a factor of four. By running each core slightly slower, and hence more efficiently, we can more than triple performance at the same total power. This approach returns us to near historical scaling of computing performance."
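The arithmetic behind "slightly slower but more than triple the performance" can be sketched under the commonly used assumption that a core's dynamic power scales roughly with the cube of its clock frequency (voltage scales with frequency, and power with voltage squared times frequency). The figures below are from that simple model, not from Dally's column:

```python
cores = 4      # transistor (and core) count quadruples every three years
budget = 1.0   # hold total power fixed at the single-core baseline

# Clock each core so that cores * f**3 == budget.
f = (budget / cores) ** (1 / 3)   # per-core clock, ~0.63 of the original
throughput = cores * f            # aggregate performance, relative, ~2.52x

print(f"per-core clock: {f:.2f}x, throughput: {throughput:.2f}x")
```

Under this crude model four slower cores deliver roughly 2.5 times the performance at the same power; Dally's "more than triple" additionally credits cores optimized for efficiency rather than serial speed.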

Of course, there are a great many barriers to making what Dally is talking about truly viable on a mass scale, not least re-educating programmers and moving on from ageing bedrock software that cannot take advantage of parallel processing. That said, the efficiency of GPUs has driven a trend in recent years to offload traditional CPU tasks such as video playback and reformatting to the GPU, while both Internet Explorer 9 and Firefox 3.7 have shown how even browsers can offload web page rendering onto the GPU for faster page loading.

How will Intel respond? One of the joys of covering the tech sector is waiting to find out...


Via Forbes.com


May 4, 2010, 12:22 am

He looks too grey to appreciate colour graphics.

Gareth 4

May 4, 2010, 2:55 pm

While parallel computing is where we're headed, I don't think it's quite a case of 'double the cores, double the power'. Haven't we seen with SLI and CrossFire configured graphics cards that you never get anywhere near as much of a performance boost as expected?
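The diminishing returns this comment describes are quantified by Amdahl's law: if only a fraction p of the work can be parallelized, adding processors speeds up just that fraction. A minimal sketch (the 90% figure below is illustrative, not a measured SLI number):

```python
def amdahl_speedup(p, n):
    # Amdahl's law: speedup from n processors when a fraction p
    # of the work is parallelizable and (1 - p) stays serial.
    return 1.0 / ((1.0 - p) + p / n)

# Even with 90% of the work parallelizable, doubling the
# processors falls well short of doubling the performance:
print(amdahl_speedup(0.9, 2))   # ~1.82x, not 2x
print(amdahl_speedup(0.9, 4))   # ~3.08x, not 4x
```

This is why Dally's "doubling the number of processors causes many programs to go twice as fast" holds only for workloads whose serial fraction is close to zero.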


May 8, 2010, 6:21 am

If these two companies do indeed start a war, it will be a very strange war between entities that need each other to survive.

We have seen the relative importance of the GPU in a computer system rise and rise since they first appeared. The raw processing power of a high-end GPU is now so large that for specific uses the choice of GPU is a deciding factor when choosing a system: we have seen that for years in notebook computing, and I always take it into consideration myself.

Parallel processing suits very specific tasks, and even as more and more software is prepared to take partial or full advantage of it, we will still need the more general capabilities of the CPU.

The way I see it, disregarding either of the two would be like having a library without a librarian, or a librarian without a library.
