
Intel Cancels Larrabee

Gordon Kelly



It was meant to signal the start of a brave new dawn in the graphics industry, but Intel's Larrabee has ultimately fallen flat on its face.

The chip giant has confirmed to DailyTech that Larrabee has been cancelled as it struggled to keep up with the price/performance benchmarks being set by rivals ATI and Nvidia.

"Larrabee silicon and software development are behind where we hoped to be at this point in the project," said Intel in an email to DailyTech. "As a result, our first Larrabee product will not be launched as a standalone discrete graphics product, but rather be used as a software development platform for internal and external use. While we are disappointed that the product is not yet where we expected, we remain committed to delivering world-class many-core graphics products to our customers. Additional plans for discrete graphics products will be discussed some time in 2010."

Well. That's. Disappointing. Larrabee was a hugely ambitious project and something of a CPU/GPU hybrid, but the writing had been on the wall for some time. Most telling was Intel's declared intention to break the two teraFLOPS performance barrier at launch next year, when ATI has already passed five teraFLOPS with the Radeon HD 5970. Riyad had already expressed his Larrabee doubts back in September, when Intel gave the GPU a public demonstration.

Let's hope Intel's "additional plans for discrete graphics products... sometime in 2010" can get our juices flowing again...


via DailyTech


December 7, 2009, 11:11 am

Typo - break not brake.

Shame that Larrabee won't be around to shake things up - just goes to show how hard it is to stay in the GPI business.


December 7, 2009, 11:14 am

@Greg - agreed, and thanks!

PS, typo yourself ;)


December 7, 2009, 1:58 pm

Good spot, only I can't edit mine!

To expand a little on what I'd said before, it is quite telling that Intel have pulled out of this (or I suspect, mothballed the concept waiting for cheaper and smaller silicon). Intel have vast resources that they can throw at something like this - in some respects I have to commend them for stopping and not 'doing a Sony' and blindly pushing ahead indefinitely.

The reason I posted again was to highlight the potential problems with both ATI and Nvidia, particularly the latter as it doesn't have a large parent to look over it. Nvidia's bankers must be in a tough position - keep bankrolling the operation in the hope of the elusive future payback, or drop out with huge losses. Intel can afford to make the decision themselves - I wonder what will happen over the next 3 years or so in this sector.

prag fest

December 7, 2009, 3:18 pm


Very good points, I can't see how the high end GPU market makes money / justifies the development costs these days. I may well be wrong, but with the rise of the 360 / PS3 and more than good enough general GPU solutions, do they sell enough add-in cards these days to make any sort of real money?

Shame about the Larrabee mind, too much too soon perhaps. I still see multi core expansion as the future of computing; perhaps higher core count traditional CPUs coupled with a more 'standard' GPU architecture will be their future...


December 7, 2009, 3:28 pm

Wow, there are some serious misconceptions about the size of the GPU market here. Nvidia made $107 million last quarter and ATI is actually propping up AMD's CPU division. Yes, gaming is floundering a little, but it's by no means a dead duck yet.

prag fest

December 7, 2009, 3:39 pm


Well for once I'm pleased to say I was wrong :) I guess the point I was trying to make, though, is: over the coming years, how much R&D will go into developing and evolving high end performance parts, as has been the case for the last 10 years or so? I would guess a large percentage of NVIDIA's profits come from the chipset and lower end GPU (9400M etc.) sector.


December 7, 2009, 4:32 pm

They don't make much profit from their chipset business as far as I know. It is mostly their graphics unit. They have huge margins on their workstation class GPUs and Tesla also has some promising future in terms of profit margins.

I haven't done any extensive research on where NVIDIA gets their money from, but if I were to guess I'd say 70% of their revenue is from consumer graphics chips, about 20% is from their workstation class GPUs (including things like Tesla) and the rest is Tegra and chipsets. They have huge margins on their workstation class GPUs though, so that would probably account for 40 to 50% of their profits. Just my guess though, haven't done any proper research.

b o d

December 7, 2009, 5:29 pm

If all the hype is to be believed, 3D will be the big thing next year. Not sure how it will work for TV, but Nvidia already have 3D Vision supporting several hundred games and the results are stunning. It is the greatest leap in gaming since the switch from 2D to 3D. Risen, DA: Origins, Avatar and Batman in 3D are more immersive than playing in 2D. It won't make a bad game good and it isn't for everyone. Having to use glasses and minor ghosting may put some people off, but personally I don't really notice either.

I'd be more concerned about ATI. Nvidia already have a 65% market share, which is mostly low end cards, and should have the advantage in high end PC gaming with a 3D product already launched. ATI seem to be taking a multiple monitor approach with Eyefinity. Wonder if they will stick with this or also launch a 3D alternative?


December 7, 2009, 6:40 pm

Ed - I'm aware of the Nvidia profit, as well as ATI propping up AMD. But please, have a look at the mountain of debt underpinning that profit (harder in ATI's case now).

Intel thought they could crack into that market, and then shift the market towards parallel computing, attacking AMD on two fronts. The fact that such a cash-rich business decided to pull the plug is quite telling about how wafer thin the margins are once you actually get into this business.


December 7, 2009, 6:54 pm

I think the title is slightly misleading. Intel have said they have cancelled the first version of Larrabee.

I heard that the second version is also out of the window and all efforts are now focussed on version 3 which is a completely different design from 1 and 2 and shows a lot more promise.

Anyway, the original report came from Charlie at SemiAccurate which is something I would read with a very heavy pinch of salt as he trolls everything that isn't ATi/AMD.

Either way, Intel have produced their 48 core chip prototype, and Larrabee will eventually come. Computing is going that way as the GPU and CPU converge.


December 7, 2009, 7:21 pm

@Max Power - I'd argue your point on the title, as I suspect any subsequent versions of Larrabee won't be called Larrabee. There'd be little point making the association with a cancelled project.


December 7, 2009, 11:40 pm

But it isn't a cancelled project, they 'merely' decided to not commercialize it. The Larrabee project itself is still running, but the fruits of their labor will only be seen quite some time from now and not in 2010 as Intel had hoped. My guess would be something in 2012 or 2013.


December 8, 2009, 2:59 am

Just a quick reminder, because I'm confused - was Larrabee a discrete graphics solution, or the one Intel were planning on integrating onto the CPU (or have they done that already and I've just missed the launch)?


December 8, 2009, 4:24 am

I actually think Nvidia have been pretty smart lately. They seem to have focussed quite a lot on expanding their business by making their products more useful to the masses. Laptops are an increasingly large proportion of PCs purchased these days, and previously you had to hunt pretty hard to find one with anything other than Intel graphics. By bringing out CUDA and making their graphics chips desirable to non-gamers, they're essentially eating into Intel's territory. I'm not at all surprised that Intel has decided to do something about it, and they (Intel) won't be at all happy about this delay. Competition is good though, so I hope Intel do manage to pull something out of the hat.

On an unrelated note, I was brought right back to this page after signing in. I'm sure others have mentioned it already, but just wanted to give you guys a (non-patronising) pat on the back :)
