Now for something close to my heart: microprocessors. I have a question for you: is microprocessor design dying? I ask because chips like Intel's Atom are perceived as innovative when, in fact, Atom is a throwback design. An analyst was asked by a publication if he thought Atom would be one of the innovations of the year. Atom is a dual-issue, in-order design where low power trumps performance. The architecture is stripped down and basic, not innovative. The innovation is in the marketing of it.
Another piece of evidence: MIPS is planning a layoff. It's been pretty obvious that MIPS ran out of ideas and has been coasting on the designs of Cavium and RMI to stay relevant, and on set-top boxes and the PS2 for revenue. It was destined to be squeezed by POWER from above and Arm from below, but MIPS gave in with a whimper, not a real fight.
The big iron guys like IBM and Sun keep moving along, but the excitement now comes from the promise of many smaller cores working together (what Sun calls Throughput Computing) with processors like Niagara and Larrabee. Only there's this problem of creating mainstream software for many-core processors. And if the goal IS more cores per mm2, those cores are going to be rather simple in design (like the x86 cores in Larrabee).
So, we dump the problem onto the programmers and sit back and yell: "What's the problem? Can't you programmers figure out how to split that code over a bunch of cores? What's so hard about that?"
Well, for graphics it isn't too tough, because there are all these pixels to work on in parallel. Parallelism is also reasonably well understood in databases, web serving, and some supercomputer programming. But it's still going to be a challenge for many other applications.
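To make that concrete, here's a minimal sketch (in C++; the function names are mine, not from any real renderer) of why pixels split so cleanly over cores: each output pixel is computed independently, so you can just interleave the rows across hardware threads with no locking at all.

```cpp
#include <algorithm>
#include <cstdint>
#include <thread>
#include <vector>

// Toy "shader": computes one pixel's value independently of all others.
static uint32_t shade(int x, int y) {
    return static_cast<uint32_t>((x * 31 + y * 17) & 0xFF);
}

// Carve the frame into interleaved rows, one stream per hardware thread.
// Because no pixel depends on another, the threads never synchronize
// until the final join.
void render(std::vector<uint32_t>& frame, int width, int height) {
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> workers;
    for (unsigned c = 0; c < cores; ++c) {
        workers.emplace_back([&frame, width, height, c, cores] {
            for (int y = static_cast<int>(c); y < height;
                 y += static_cast<int>(cores))
                for (int x = 0; x < width; ++x)
                    frame[y * width + x] = shade(x, y);
        });
    }
    for (auto& w : workers) w.join();
}
```

Contrast that with something like a spreadsheet recalculation or a compiler pass, where each step feeds the next and the carve-up is nowhere near as obvious.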
Then again, maybe we have to ask: are processors already good enough for typical client workloads? If so, we don't need particularly powerful client microprocessors, just low power and really good Internet connectivity to cloud computing. Gamers and professional content creators will be the only users of powerful client computers. Like with cars: everybody is worried about gas mileage today, and only a few crazies still care about performance.
I'm just asking.
Sunday, August 17, 2008
Life now has a renewed sense of urgency. A close family member of my wife (actually, the cousin of her ex-husband, but we're very close to him) was diagnosed with malignant brain cancer. The only symptom he had was difficulty speaking for the last two weeks. No other symptoms. After he insisted on getting it checked, they found a tumor, and today the biopsy came back malignant. He just turned 60 and was in excellent health. We're devastated.
The thing it does make you think about is how much time we have in this world. At 53, I'm thinking about what I'm doing that is not what I want to do. I've made recent career decisions based on monetary gain, not on my passions. That's just not going to work for me over the long haul. So changes will have to be made, and made soon.
Thursday, August 07, 2008
I don't know why I bother with this blog, but I've decided I need to keep writing because I want to polish my skills. That, and I REALLY want to get my opinion out in some sort of public forum. Every time I talk to somebody about processors, I find myself going into analyst mode, and it just seems to be where I feel most comfortable. My biggest problem as an analyst is that I can be too rational, and sometimes the success or failure of a product, or even a market, is not based on rational reasons.
What I want to do in this post is say that I'm not completely against Larrabee, as my previous posts may have suggested. I am completely against the hype around Larrabee. Realistically, it's over a year away from introduction, most of the work to make it a graphics card involves software, and Intel has a terrible track record on graphics quality.
What intrigues me about Larrabee is the possibility of developing custom software APIs and renderers that don't use DX or OpenGL. The last graphics company that went up against Microsoft and its DX juggernaut was 3dfx with Glide. Intel has a much stronger position than 3dfx had, but there's still going to be some tension with Microsoft.
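For the curious, this is roughly what "renderer as software" means at the bottom: a plain loop you own end to end, with no driver and no DX or OpenGL in sight. A minimal sketch (all identifiers are mine, and it assumes counter-clockwise triangles), rasterizing one triangle with edge functions; the kind of loop Larrabee's x86 cores would run tiled and vectorized.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

struct Vec2 { float x, y; };

// Signed area of the parallelogram (a->b, a->c): the classic edge function.
static float edge(const Vec2& a, const Vec2& b, const Vec2& c) {
    return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
}

// Fill one counter-clockwise triangle into a framebuffer: every pixel
// center inside all three edges gets the color. A real renderer would
// tile this loop and run the edge tests on vectors of pixels at once.
void rasterize(std::vector<uint32_t>& fb, int w, int h,
               Vec2 v0, Vec2 v1, Vec2 v2, uint32_t color) {
    int x0 = std::max(0, (int)std::min({v0.x, v1.x, v2.x}));
    int y0 = std::max(0, (int)std::min({v0.y, v1.y, v2.y}));
    int x1 = std::min(w - 1, (int)std::max({v0.x, v1.x, v2.x}));
    int y1 = std::min(h - 1, (int)std::max({v0.y, v1.y, v2.y}));
    for (int y = y0; y <= y1; ++y)
        for (int x = x0; x <= x1; ++x) {
            Vec2 p{x + 0.5f, y + 0.5f};
            if (edge(v0, v1, p) >= 0 && edge(v1, v2, p) >= 0 &&
                edge(v2, v0, p) >= 0)
                fb[y * w + x] = color;
        }
}
```

Everything above the loop, the API, the shading model, the scene structure, is up for grabs once you're not bound to somebody else's pipeline. That's the opportunity, and also the tension with Microsoft.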
Jon Peddie had an interesting perspective: he thinks Larrabee can expand the market by "validating" discrete graphics in the mainstream. I also have another reason to encourage Larrabee: Intel will likely limit the power of its integrated graphics chipsets so they don't overlap with Larrabee. Intel will willingly throttle its own chips to make way for Larrabee, which also means NVIDIA (being unconstrained) will continue to lead Intel in integrated graphics.