
40 Years and Counting

It's Intel's 40th birthday this year and, as it's the world's largest integrated circuit manufacturer, you'd expect it to be putting on some big showy events to celebrate. Well, we're seven months into the year and we've not really seen much yet. Maybe Autumn's IDF in San Francisco will bring with it all the fireworks and Atom showers we've been hoping for. Time will tell.

In the meantime though, a selection of European journalists did just get to take part in a conference call with Intel's senior vice president and co-general manager of its Digital Enterprise Group, Pat Gelsinger. He talked to us for about an hour about where Intel stands today, what developments we will see in the near future and, most interestingly, what he feels are the four most important moments in Intel's history and the four biggest challenges the company has yet to face. So we thought we'd share with you the thoughts of a man who has been at the heart of one of the most influential technology companies for the last 25 years.
(Image: Pat Gelsinger at IDF Fall 2006)

Intel 80386 - The First 32-bit x86 Processor

The introduction of the 386 in 1986 was a true paradigm shift in the computing industry. Not until 2003, when AMD introduced 64-bit computing with its x86-64 enhancements as part of the Athlon 64, would there be such a significant change for the x86 processor family. Not that everyone saw it that way at the time.

Pat recalls how he was 'hounded' by the press at the time wanting to know what on earth Intel was thinking because 'What program could possibly require 32 bits of addressing?'. Of course, with hindsight we can all have a good laugh at the naivety of such questions, but then we only have to look at the current state of play with 64-bit computing and people's continuing scepticism of it. Even though it's been five years since AMD first introduced 64-bit hardware, and even though Windows Vista really wants at least 2GB of RAM, which puts mainstream PCs within sight of the 4GB ceiling that 32-bit addressing imposes, there are still people out there who see 64-bit as an unnecessary change.
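
As a quick aside, the maths behind that ceiling is easy to check. Here's a minimal C sketch, our own illustration rather than anything from the call, that prints the hard limit of a 32-bit address space and the pointer width of whatever machine it runs on:

```c
/* Back-of-the-envelope check of why 32-bit addressing runs out:
   2^32 bytes is just 4GB, a limit that 2GB Vista machines are
   already halfway towards. */
#include <stdio.h>

int main(void)
{
    unsigned long long limit32 = 1ULL << 32;   /* 4,294,967,296 bytes */
    printf("32-bit address space: %llu bytes (%llu GB)\n",
           limit32, limit32 >> 30);
    printf("Pointer size on this machine: %zu bits\n",
           sizeof(void *) * 8);
    return 0;
}
```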

As well as bringing with it a whole raft of new hardware, 32-bit computing also ushered in a new era of software, or at least that's the way round Pat sees it. Around this time the first version of Windows arrived and, although it didn't run on Intel processors, Apple's Mac OS was also in its infancy. OK, it wasn't until Windows NT arrived in 1993 that Windows became truly 32-bit, but the seeds of the 32-bit future were sown when Intel introduced this venerable CPU.

Pentium Pro - RISC vs CISC

One of the longest running debates in computing circles is that of Reduced Instruction Set Computers (RISC) vs Complex Instruction Set Computers (CISC).

In the early days of the computer industry, programs were written in assembly language or machine code. These low-level languages deal almost directly with the computer's underlying hardware, only adding a basic human-readable element to proceedings so programmers don't have to write code in straight 1s and 0s. As a result, complicated procedures can involve hundreds of lines of code. This is in contrast to modern high-level languages, which use a high degree of abstraction from the underlying hardware (relying on compilers to convert the code into something the machine understands) to enable programmers to write complicated programs using only a few lines of code.
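
To give a rough sense of that abstraction gap, here's a trivial C program of our own with comments sketching the sort of x86 assembly a compiler might emit for it; the exact instructions vary by compiler and settings, so treat the comments as illustrative only:

```c
/* One line of arithmetic in C versus the rough x86 assembly
   a compiler might produce for it (shown as comments). */
#include <stdio.h>

int main(void)
{
    int a = 6, b = 7;
    int c = a * b;      /* roughly:  mov  eax, [a]      */
                        /*           imul eax, [b]      */
                        /*           mov  [c], eax      */
    printf("%d\n", c);  /* ...plus several more lines to
                           set up and make the call     */
    return 0;
}
```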

Aware of the difficulties of programming with low-level languages, CPU designers started to implement complicated instructions on-chip, cutting down the amount of code that needed to be written. Retrospectively, this method of CPU design became known as CISC.

When high-level programming languages started to come to the fore, the need to reduce programmers' workload lessened, as complicated operations could be handled by the compiler and converted into simple code the CPU could understand. The emphasis thus shifted back to making CPUs that could power through a smaller set of generic instructions as quickly as possible. This method became known as RISC.
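
A single example makes the trade-off concrete. The byte-copy loop below is our own illustration: on a classic RISC design it compiles to a handful of simple load, store, and branch instructions per iteration, whereas on x86 a compiler may collapse the whole job into a single CISC-style 'rep movsb' instruction:

```c
/* Copying a block of memory: a job CISC can express as one
   instruction and RISC expresses as a loop of simple ones. */
#include <stdio.h>
#include <string.h>

static void copy_bytes(char *dst, const char *src, size_t n)
{
    /* On a RISC machine each iteration is roughly: load byte,
       store byte, advance pointers, decrement count, branch.
       On x86 a compiler may emit just "rep movsb" instead.   */
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i];
}

int main(void)
{
    const char *msg = "Hello, RISC vs CISC";
    char buf[32];
    copy_bytes(buf, msg, strlen(msg) + 1);
    printf("%s\n", buf);
    return 0;
}
```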

For a number of years the two design methods were seen as mutually exclusive but, with the introduction of the Pentium Pro, Intel proved this didn't have to be the case. By keeping the complicated, CISC-style x86 instruction set externally and breaking it down internally into a more basic, RISC-like set of micro-operations, the processor could maintain compatibility while delivering performance in a wide variety of situations.
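
To picture what that means, here's a toy C sketch of the idea, emphatically not Intel's actual decoder, showing how one complex x86-style instruction (an add that reads and writes memory) can be broken into a sequence of simpler micro-operations inside the core:

```c
/* Toy model of CISC-to-RISC decoding: one complex instruction
   becomes three simple micro-ops. Purely illustrative. */
#include <stdio.h>

typedef enum { UOP_LOAD, UOP_ADD, UOP_STORE } MicroOp;

/* Hypothetical decoder for "add [mem], reg", which reads memory,
   adds, and writes back -- three distinct steps inside the core. */
static int decode_add_mem_reg(MicroOp out[], int max)
{
    if (max < 3) return 0;
    out[0] = UOP_LOAD;   /* load the value at [mem] into a temporary */
    out[1] = UOP_ADD;    /* add the register to the temporary        */
    out[2] = UOP_STORE;  /* write the result back to [mem]           */
    return 3;
}

int main(void)
{
    static const char *names[] = { "load", "add", "store" };
    MicroOp uops[8];
    int n = decode_add_mem_reg(uops, 8);
    for (int i = 0; i < n; i++)
        printf("uop %d: %s\n", i, names[uops[i]]);
    return 0;
}
```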

Now all desktop CPUs use this same principle, and of course Pat was keen to cite Intel's change of approach as a key moment in shifting the industry. We'll concede he's probably not far off the mark.
