It's a bit late (actually you may note I tend to blog late at night anyway), but I've been wanting to take on a topic that's of interest to me from both an NVIDIA point of view and my Microprocessor Report analyst background. It's the chances that Intel's Larrabee will be a successful GPU.
I would like to offer the opinion that the deck is stacked against it, despite Intel's hype machine pushing it. For one, the design looks more like the IBM Cell processor or Sun's Niagara (UltraSPARC T2) than a graphics chip. Cell proved ineffectual as a GPU (which explains why the PS3 ended up with a dedicated graphics chip), and Niagara is aimed at Web serving (lots of threads to service).
Let me also ask this question: when has Intel EVER produced a high-performance graphics chip? (Answer: never)
Oh, and when did we all decide that x86 was the perfect instruction set for graphics? Hmmmm, it's not. There is no case where x86 is the preferred instruction set for either graphics or high-performance computing. For graphics, the x86 instruction set is a burden on the chip, not an advantage. In fact, GPU instruction sets are hidden behind the OpenGL and DirectX APIs, so the only people who care are the driver programmers. That means we are free to design whatever instruction set is most efficient for the task, while Intel has handcuffed itself to a fixed instruction set. Not the smartest move for graphics. Intel's plan may work better for HPC, though even there, many alternatives to x86 are actually better (shocking, isn't it)!
Larrabee is based on a simplified x86 core with an extended, SSE-like SIMD processing unit. To be efficient, that wide SIMD unit has to be kept packed with work. How is Intel going to do that? The magic is all left to the software compiler, which is going to have a tough time finding that much parallelism efficiently.
Larrabee will have impressive theoretical specs that will wow the press, but it's going to be very hard to drive the chip anywhere near its potential, which sounds exactly like Cell. And Cell has not lived up to its hype. So my corollary: Larrabee = Cell; Cell != hype; therefore Larrabee != hype.
Now we wait until sometime in mid-2008 to see some silicon. But you heard it here first: Larrabee will not reach its potential performance (by a long shot) and will not displace NVIDIA as the GPU leader.