Wednesday, May 26, 2010

Intel just announced that Larrabee is officially, really, NOT going to be released as a discrete GPU any time soon. Or for the next 5 years. Or more.

Why, you may ask? Well, because they built a really weak GPU that would have cost too much to manufacture and returned bad margins. How could the mighty and wise Intel have spent over three years working on this project and not know it would suck as a GPU? Well, it could be because they were not reading my blog post of December 5, 2007, where I said:

"Oh, and when did we all decide that x86 was the most perfect instruction set for graphics? Hmmmm, it's not. The example where the x86 is the preferred instruction set for either graphics or high performance computing doesn't exist. The x86 instruction is an burden on the chip, not an advantage, for graphics. In fact, graphics instruction sets are hidden by the OpenGL and DirectX API, so the only people who care are the driver programmers and we are free to design whatever instruction set is most efficient for the task, while Intel has handcuffed themselves with a fixed instruction set. Not the smartest move for graphics. Intel's plans may work better for HPC though. Still, there are many alternatives to x86 that are actually better (shocking isn't it)!"

"Larrabee is based on a simplified x86 core with an extended SSE-like SIMD processing unit. In order to be efficient, the extended-SSE unit needs to be packed efficiently. How is Intel going to do that? The magic is all left to the software compiler, which is going have a tough time finding that much parallelism efficiently."

And that is exactly what happened. In addition, with many coherent cores, the coherency traffic and stalls ate up a lot of bandwidth and limited scaling across cores.
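You can see the coherency tax on a toy scale with a generic multicore sketch (mine, and nothing Larrabee-specific): four threads hammer counters that happen to share one cache line, so every write forces the line to ping-pong between cores, and the cores spend their time trading cache-line ownership instead of computing. Pad each counter to its own 64-byte line and the traffic disappears:

```c
/* Hypothetical false-sharing demo (compile with -lpthread). */
#include <pthread.h>

#define NTHREADS 4
#define ITERS    10000000L

long counters[NTHREADS];  /* adjacent longs: all in one cache line */

static void *worker(void *arg)
{
    long id = (long)arg;
    for (long i = 0; i < ITERS; i++)
        counters[id]++;   /* each write invalidates the line in the
                             other cores: pure coherency traffic */
    return NULL;
}

int main(void)
{
    pthread_t t[NTHREADS];
    for (long i = 0; i < NTHREADS; i++)
        pthread_create(&t[i], NULL, worker, (void *)i);
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(t[i], NULL);
    return 0;
}
```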

My prediction back then gave Intel too much credit:
"But you heard it hear first: Larrabee will not reach the potential performance capability (by a long shot) and will not displace NVIDIA as GPU leader."
Because Larrabee never even got to compete.

Oh and I still stand behind this quote:
"Let me also ask this question: when has Intel EVER produced a high performance graphic chip? (Answer: never)"

A hard lesson learned by the Intel guys. But I could have saved them all those many millions of dollars. If only they'd listened. Sigh.
