Thursday, August 07, 2008

I don't know why I bother with this blog, but I've decided I need to keep writing because I want to polish my skills. That, and I REALLY want to get my opinion out in some sort of public forum. Every time I talk to somebody about processors, I find myself slipping into analyst mode, and that just seems to be where I feel most comfortable. My biggest problem as an analyst is that I can be too rational, and sometimes the success or failure of a product, or even a market, isn't based on rational reasons.

What I want to do in this post is make clear that I'm not completely against Larrabee, as my previous posts may have suggested. What I am completely against is the hype around Larrabee. Realistically, it's over a year away from introduction, most of the work to make it a graphics card involves software, and Intel has a terrible track record on graphics quality.

What intrigues me about Larrabee is the possibility of developing custom software APIs and renderers that don't use DX or OpenGL. The last graphics company that went up against Microsoft and its DX juggernaut was 3dfx with Glide. Intel is in a much stronger position than 3dfx was, but there's still going to be some tension with Microsoft.

Jon Peddie had an interesting perspective: he thinks Larrabee can expand the market by "validating" discrete graphics in the mainstream. I have another reason to encourage Larrabee - Intel will likely limit the power of its integrated graphics chipsets so they don't overlap with Larrabee. If Intel willingly throttles its own chips to make way for Larrabee, that also means NVIDIA, being unconstrained, will continue to lead Intel in integrated graphics.

2 comments:

Anonymous said...

Why would Intel want to sandbag its integrated graphics to protect its fledgling offering in the highly competitive discrete GPU market? Instead, Intel will try to quickly kill off the entire discrete GPU market by integrating some number of Larrabee-like cores onto every x86 chip it sells (but probably not until 2010 or 2011). Intel is going to make many-core chips anyway, so why not use some Larrabee-like cores and gain the ability to do graphics, too?

You could say, oh, well, NVIDIA will still reign in the market for users who want more performance than a single chip can provide. But instead of separate CPU and GPU chips, Intel will sell multi-socket systems that let two or more of its GPCPU chips talk to each other via coherent QuickPath. That takes away both the "discrete" GPU market and the emerging multi-socket multi-GPU market.

Such a future would be bad for NVIDIA. The question is, what is NVIDIA doing about it?

Kevin and Fely said...

Two responses. First, there's no way Intel will have Larrabee integrated into a multicore chip until at least 2012. Intel is only going to be shipping the discrete version in late 2009/early 2010, and an integrated version will likely be at least two years after that. The integrated Nehalem graphics is derived from the existing integrated graphics cores.

Second, you have to know Intel's history to understand why it would throttle integrated graphics. An excellent example: to make way for 64-bit Itanium, Intel refused, for years, to add 64-bit instructions to its Xeon server processors, even in the face of AMD's AMD64 campaign. Intel has a long history of building walls like that to clearly differentiate product lines.