Sunday, December 14, 2008

At least this economic recession has had one benefit - it has flushed out a number of financial cheats and scoundrels. The greedy have been found out and in many cases dealt with. Companies that relied on financial tricks have run out of options and are going to have to make money the old-fashioned way - by creating something of value. I've never understood a system that rewards people for moving money around without creating any real value. But that's just the engineer in me.
As the agony of AMD's downward spiral continues, it couldn't have helped when San Jose Mercury News columnist Chris O'Brien put AMD on his list of companies in serious trouble. http://www.mercurynews.com/business/ci_11200791?nclick_check=1

Why did AMD fail to sustain its success after its triumph over Intel with Opteron and Athlon 64?

First off, AMD underestimated Intel's ability to rebound from the Pentium 4/Tejas debacle. Intel is a very resilient competitor with a lot of resources and a process and manufacturing group second to none. Intel, to some extent, was also lucky. It had a very sharp team in Israel that was working on lower power processors that could be enhanced for more performance. And so Conroe (Core 2 Duo) was born, and with it the start of the long slide down for AMD.

AMD's management conceived of a grand idea of merging the raw performance of GPUs with the general purpose CPU (Fusion). It's an elegant and exciting concept. It was designed to put AMD back ahead of Intel. The problem was that AMD grossly overpaid for the number 2 GPU vendor, ATI, and AMD's execution of the Fusion development concept has been slow and painful, with a number of canceled versions.

Meanwhile, Intel has embarked on its own GPU and its own version of Fusion. Right now, it looks like Intel might have a Fusion part in the market ahead of AMD. This was clearly a project that AMD couldn't manage, with too many new teams trying to work together. It just hasn't jelled.

So, AMD struggles to keep afloat by selling off anything it can. The latest part of the company to be jettisoned is the fabs. Before that it was Legerity, Spansion, Alchemy, and its PLD division.

Frankly, I think AMD's given up competing with Intel. AMD will try to hold onto enough market share to stay in business, but right now, there's very little upside until 2010 or 2011 when 32nm Bulldozer-based cores hit the market. No, instead, AMD management is going to try to eke out some minimal self respect by taking on NVIDIA as much as possible. But we're as resourceful and as adaptable as Intel, so I see no opportunity there. And, after all, ATI was rarely very profitable, so even a healthy ATI can't hold up a sick CPU business.

There are a lot of companies that don't want AMD to go out of business, but at the same time, I don't see these companies digging into their own pockets to save AMD.

Friday, November 14, 2008

I listened to parts of AMD's Financial Analyst Day (gotta keep up with our competitors). First off, I have to say Rick Bergman's attempt to claim a majority share of the graphics business by lumping in consoles was lame to the nth degree. Neither ATI nor NVIDIA controls or drives the sales of consoles. Also, those are contract designs from years ago that aren't even competitive with today's parts.

Other than that, as a former AMDer, I found Dirk's pitch on AMD's reason to play in the market, well, uninspiring. It came down to: well, the market needs an alternative to Intel, it might as well be us. How inspiring it is to say you're tired of losing money, or to have your soon-to-be ex-CFO say cash flow finally turned positive for the first time under his watch. The message was not about leadership, but about competence. AMD wants to be a competent x86 microprocessor supplier. Good for them, I guess. The days of talking about 30% market share and kicking Intel's butt are over.

I guess the only sign of competitive juices is their competition with NVIDIA. They caught NVIDIA flat-footed with an unconventional strategy. Remember how that played out with Intel? First AMD got the upper hand on Intel with Athlon 64/Opteron, then got cocky, didn't bring anything new, and then got creamed by Conroe. It could happen again. AMD has not been able to sustain leadership - it moves in short spurts of real innovation, followed by years of mediocrity. I think AMD lost some of the engineers willing to take risks - it became a customer-driven company and lost the ability to inspire passion.

Wednesday, November 12, 2008

I was reading AMD's Pat Moorhead's blog post on mini-notebooks (Netbooks in Intel-speak; research companies are using the mini-notebook name). Pat's become Mr. Gadget for AMD, which is funny considering AMD is not really in the gadget business.

Back to mini-notes - Pat divides the use model into two types: at home and away from home. Pat's ideal "at-home" mini-note is really just a small notebook like a MacBook Air. The "away-from-home" mini-note is also overkill. I kinda agree with the 1024x768 display, but actually I'd like 1280x768 like my dear departed Crusoe-based Fujitsu P2110 (I still have the parts; I tried to figure out why it died, but couldn't diagnose why it wouldn't power up for more than a second or two). I like having a solid state drive because it makes the mini-note more rugged to use - like a remote control or a handheld game console. If the content is going to be stored in the cloud or on my home network storage, I don't need a large drive. I would like the same amount as an iPod Touch - 16-32GB. To get to the aggressive price point, there are compromises that must be made. But it's clear that consumers won't pay a premium for small size.

Looking over at my gadget collection, you can see that I own the original ASUS Eee PC 701 with the Linux OS and have used it. It's just too difficult to use on a regular basis. I'm planning to keep our wine list on it, and it's still useful for casual browsing or chat.

I'm holding out for a mini-note with NVIDIA graphics :-)
On the eve of AMD's Financial Analyst Day, did anyone find it odd that Intel decided to release its lowered guidance? If that doesn't scare the bejesus out of the financial analysts, I don't know what will. That's just plain mean (of Intel).

Thursday, October 23, 2008

I pause from the "usual" processor technology talk to address another topic that's been a favorite interest of mine since grade school: space travel. Buzz Aldrin thinks that we should send the first explorers to Mars on a one-way trip to the red planet:
http://www.physorg.com/news143972922.html

I can't completely agree with him. Much like the early plot of Kim Stanley Robinson's Red Mars, the first explorers should return to Earth in order to be able to relate the experience to the world. Otherwise, I think the exploits of the Mars explorers will be too remote, too unreal, to the people who must fund supply missions. Only when we have the technology to make the "settlers" self-sufficient should we consider the one-way trip option.

What we should be investing in is better robotic AI ("Open the pod bay doors, Hal") and the underlying technology needed for colonization. Both these endeavors could have additional benefits here on Earth.

I hope I live long enough to see the pictures of Man on Mars. Just going there and not colonizing is a waste of money. Look at what happened when we went to the Moon - close to 40 years later and we haven't been back.

Sunday, October 12, 2008

Sorry for the lack of posts - work life and fun life have been busy. Frankly, there's not really much new to report on the processor front. The Larrabee watch continues - still no sign of real silicon.

The AMD "asset smart" split was announced and it didn't help the stock much. A lot has been written about it, although the present economic crisis has overshadowed it. I see it as mixed news. The infusion of more capital should help AMD compete with Intel - more money for the fabs will build the capacity AMD can't afford to fund itself. But you still have a foundry completely dependent on AMD for its revenue and AMD is almost completely dependent on the foundry company for its silicon. It will take at least 2 years for the foundry company to build up a reasonable portfolio of processes that will attract other customers (it needs bulk silicon). Most importantly, the foundry company will have to learn a new mindset as a foundry, not just a captive fab. IBM made that transition, but it is still not a flexible and cost effective as TSMC.

Tuesday, September 16, 2008

I'm on the organizing committee of a conference called Hot Chips. Our last conference included a very funny panel with a number of well-known chip designers, analysts, and a legendary educator. You can see a shortened version of the panel here:
http://www.spectrum.ieee.org/video?id=562

Tuesday, September 09, 2008

Well, it's not quite a month since my last post. For all you Catholics out there, there will be a familiar ring to that admission.

So I need to keep this blog posting going. The problem, as usual, is time, compounded by short posts on Twitter. Well, that plus the fact I have only one proven reader, maybe two. It's a classic chicken-and-egg situation - I might get more readers if I posted regularly and posted interesting things.

WRT interesting things, there's a lot of interest in the "Godson" processor from the Chinese Academy of Sciences. Previous versions were just MIPS-compatible processors for very low cost PC-like devices. But then everyone decided the low-cost PC model was actually a growth opportunity, so the market became more interesting (although the size is still questionable). On top of that, the Godson-3, introduced at Hot Chips 20, is designed with software emulation of the x86 instruction set in mind. The translation layer is still under development, but say the magic word "x86" and everybody wants to know about it and suddenly it's an Intel competitor. Not a place that I would strive for.
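For those unfamiliar with translation layers, here's a toy sketch of the basic dynamic-binary-translation idea - translate each guest instruction once, cache the result, and reuse it. This is purely illustrative (Python closures standing in for real code generation); I have no visibility into how Godson's actual layer works:

# Toy dynamic binary translation: translate each guest (x86-style)
# instruction to a host operation once, then reuse the cached result.
# Real translators work on whole basic blocks and emit native machine
# code, not Python closures -- this just shows the caching idea.
translation_cache = {}

def translate(guest_insn):
    op, dst, src = guest_insn  # e.g. ("add", "eax", "ebx")
    if op == "mov":
        return lambda regs: regs.__setitem__(dst, regs[src])
    if op == "add":
        return lambda regs: regs.__setitem__(dst, regs[dst] + regs[src])
    raise NotImplementedError(op)

def execute(program, regs):
    for insn in program:
        if insn not in translation_cache:   # translate on first use only
            translation_cache[insn] = translate(insn)
        translation_cache[insn](regs)       # run the cached translation

regs = {"eax": 1, "ebx": 2}
execute([("mov", "eax", "ebx"), ("add", "eax", "ebx")], regs)
print(regs["eax"])  # prints 4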

Well, it still isn't available for testing, so we'll have to wait to see how good it is.

Sunday, August 17, 2008

Now for something close to my heart: microprocessors. I have a question for you: is microprocessor design dying? I ask because chips like Intel's Atom are perceived as innovative when, in fact, Atom is a throwback design. An analyst was asked by a publication if he thought Atom would be one of the innovations of the year. Atom is a dual-issue, in-order design where low power trumps performance. The architecture is stripped down and basic, not innovative. The innovation is in the marketing of it.

Another piece of evidence: MIPS is planning a layoff. It's been pretty obvious that MIPS ran out of ideas and has been coasting on the designs of Cavium and RMI to keep it relevant, and on set-top boxes and the PS2 for revenue. It was destined to be squeezed by POWER from above and ARM from below, but MIPS gave in with a whimper, not a real fight.

The big iron guys like IBM and Sun still keep moving along, but the excitement now comes from the promise of many smaller cores working together (what Sun calls Throughput Computing) with processors like Niagara and Larrabee. The only problem is creating mainstream software for many-core processors. And if the goal IS more cores/mm2, those cores are going to be rather simple in design (like the x86 cores in Larrabee).

So, we dump the problem onto the programmers and sit back and yell: "What's the problem? Can't you programmers figure out how to split that code over a bunch of cores? What's so hard about that?"

Well, for graphics it isn't too tough, because there are all these pixels to work on in parallel. Parallelism is also reasonably well understood in database, web serving, and some supercomputer programming. But it's still going to be a challenge for many other applications.
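To make the graphics case concrete, here's a minimal sketch (Python for illustration, with a made-up per-pixel function) of why pixels parallelize so cleanly - each one depends only on its own coordinates, so rows can be farmed out to as many cores as you have with zero communication between them:

from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT = 1920, 1080

def shade_row(y):
    # Hypothetical shader: each pixel is a pure function of (x, y),
    # so no core ever has to talk to another core.
    return [(x * y) % 256 for x in range(WIDTH)]

if __name__ == "__main__":
    # Rows are independent, so they split trivially across cores.
    with ProcessPoolExecutor() as pool:
        frame = list(pool.map(shade_row, range(HEIGHT)))

Try decomposing a word processor or a compiler that neatly and you see the problem.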

Then again, maybe we have to ask: are processors already good enough for typical client workloads? If so, we don't need particularly powerful client microprocessors, just low power and really good Internet connectivity to cloud computing. Gamers and professional content creators will be the lone users of powerful client computers. Like with cars, everybody is worried about gas mileage today and only a few crazies still care about performance.

I'm just asking.
Life now has a renewed sense of urgency. A close family member of my wife (actually the cousin of her ex-husband, but we're very close to him) was diagnosed with malignant brain cancer. The only symptom he had was difficulty speaking over the last two weeks. No other symptoms. After he insisted on getting it checked, they found a tumor, and today the biopsy came back malignant. We're devastated. He just turned 60 and was in excellent health.

It does make you think about how much time we have in this world. At 53, I'm thinking about what I am doing that is not what I want to do. I've made recent career decisions based on monetary gains, not based on my passions. That's just not going to work for me over the long haul. So changes will have to be made, and made soon.

Thursday, August 07, 2008

I don't know why I bother with this blog, but I've decided I need to keep writing because I want to polish my skills. That, and I REALLY want to get my opinion out in some sort of public forum. Every time I talk to somebody about processors, I find myself going into analyst mode, and that just seems to be where I feel most comfortable. My biggest problem as an analyst is that I can be too rational, and sometimes the success or failure of a product, or even a market, is not based on rational reasons.

What I want to do in this post is say that I'm not completely against Larrabee, as my previous posts might have suggested. I am completely against the hype around Larrabee. Realistically, it's over a year away from introduction, most of the work to make it a graphics card involves software, and Intel has a terrible track record on quality graphics.

What I've become intrigued by with Larrabee is the possibility of developing custom software APIs and renderers that don't use DX or OpenGL. The last graphics company that went up against Microsoft and its DX juggernaut was 3dfx with Glide. Intel has a much stronger position than 3dfx had, but there's still going to be some tension with Microsoft.

Jon Peddie had an interesting perspective: he thinks it can expand the market by "validating" discrete graphics in the mainstream. I also have another reason to encourage Larrabee - it is likely that Intel will limit the power of its integrated graphics chipsets in order not to overlap with Larrabee. Intel will willingly throttle its own chips to make way for Larrabee, which also means NVIDIA (being unconstrained) will continue to lead Intel in integrated graphics.

Sunday, July 13, 2008

Well, it's been a long time since I last posted, but I couldn't resist this one "I told you so" post about AMD.

http://biz.yahoo.com/ap/080711/amd_writedown.html?.v=6

In a filing with the Securities and Exchange Commission, AMD said it recently discovered that its business units that make chips for cell phones and digital televisions, both acquired as part of the ATI transaction and considered "noncore" parts of AMD's operations, weren't performing up to expectations.


This was something I told anyone who would listen: AMD should sell off the cellphone and DTV divisions. Those divisions had NO synergy with the rest of AMD. They would have even been better off bundled with Spansion.

Wednesday, June 11, 2008

The week after Computex is a good opportunity to look at the hot topic - MIDs and small, cheap notebooks (Intel calls them Netbooks, others Laptots) and Intel's Atom processor.

The MID market is the newest battle line between the PC and the smartphone. Smartphone solutions are based on the ARM instruction set that dominates the smartphone business, including the iPhone. Leading chip suppliers include Broadcom, Qualcomm, and TI. NVIDIA released our version, called Tegra. These designs focus on ultra-low power for long battery life and the small, fanless form factors of handsets. For minimum space, these chips are system-on-a-chip (SOC) designs with multiple dedicated media and networking accelerators for maximum power efficiency. The higher-end chips are closely related to media player chips, so media functionality is quite good. The leading operating systems include Blackberry, Linux, Symbian, WinCE, and Windows Mobile.

Intel's Atom approach to the MID market is far different from the above. Intel is trying to scale an x86 processor and chipset to fit into a very small form factor. The system design is presently a direct scale-down of the standard PC architecture, baggage intact. The advantage of the scaled-down PC is that it is flexible enough to run full Windows, as well as Linux and other x86 embedded operating systems. The disadvantage is that it is on the order of 10 times less power efficient today than an ARM SOC.

Still, the Intel Atom is a work in progress. The first version has a separate processor and chipset, not optimal for power or size. The chipsets themselves are derivatives of existing Intel chipsets running on older process technology. It’s not until the next generation of Atom arrives next year, called Moorestown, that Intel achieves some level of integration, but even that processor doesn’t approach the level of integration of NVIDIA's Tegra SOC.

While Intel promises leading edge process technology, and the first Atom launched in a 45nm process, the chipset uses older technology (an n-1 or n-2 process), and by the time Moorestown ships in 2009, the 45nm process will be considered mature. Intel still looks to be years away from matching ARM designs in lowest power.

Ultimately it will be the consumers who decide the victor: the long battery life, media savvy and lower power of the ARM designs, or the PC-like flexibility of the Atom design.

Tuesday, May 20, 2008

Hi All,

And by "all" I mean anyone who stubbles accross this blog and a cousin or two (hey Den).

I promised more commentary on the PA Semi deal: I heard news from a reliable source that Dan and company are going to work on a new design for Apple and it's not PowerPC and it's not x86. So you go figure it out for yourselves. It'll be 2-3 years before we'll see the fruit of this deal in an Apple product.

As for Montalvo Systems - I'd be surprised if Sun can figure out how to effectively use most of the IP. Maybe it was the team Sun wanted, but it doesn't look like many (any?) of the senior people will go to Sun. Also, the team never did deliver a product to tapeout.

Tuesday, April 22, 2008

This is crazy! Forbes reports that Apple is buying P.A. Semi:
http://www.forbes.com/technology/2008/04/23/apple-buys-pasemi-tech-ebiz-cz_eb_0422apple.html
There is some bad speculation in the Forbes story, but it's really hard to imagine this turn of events.

It could be that Apple wants an in-house design team, and buying P.A. Semi gives them one. Dobberpuhl has designed chips for just about every instruction set except x86. Hmmmm. Naw, Apple wouldn't try to design its own x86 processor, would it? Not after Montalvo Systems just got sold for scrap.

More than likely Apple wants more control over its iPhone and media devices. And we know Steve likes control. There might even be something in the works for which Apple couldn't find a vendor with the exact right solution.


Here's part of the CNet take from Tom Krazit:
http://www.news.com/8301-13579_3-9926461-37.html?part=rss&subj=news&tag=2547-1_3-0-20
Apple acquires low-power chip designer P.A. Semi
Posted by Tom Krazit
Apple has reportedly made a rare acquisition, snapping up low-power chip company P.A. Semi one day before reporting its quarterly earnings.

Forbes reported late Tuesday that Apple has agreed to purchase the company for a middling $278 million, quoting Apple spokesman Steve Dowling as confirming the deal. P.A. Semi made its debut a few years back designing low-power chips based on Apple's old friend, the Power architecture.

Thursday, April 17, 2008

The AMD news today was pretty bad, but it was expected. It did raise some questions:
1. What happened to the Bobcat core? We were told Bulldozer is in development and is scheduled to ship in 2009, but Dirk dodged answering about Bobcat (the lightweight core for products that would compete with Intel's Atom).

2. Will we EVER see asset light? Hector has been invoking this notion for a year now with no visible movement. But let's be real - if AMD splits into two (a foundry and a fabless processor company), the two parts would still be inexorably linked to each other's fate.

3. It's about time AMD got realistic about the ATI media processor (DTV, cell phone, set-top box) business and sold it off for some badly needed revenue. I expect the sale of those business units will be part of the 10% personnel reduction.

4. I feel bad for those who will lose their jobs. I hope it's none of my friends still there.

5. Dirk, I think you should get back involved in fixing CPU engineering - it doesn't look like it's working well without you. Barcelona needs ~500MHz more; Bulldozer has to hit schedule; you better decide on your 32nm process ASAP; Fusion threatens to disappoint in practice; you can't rely on Intel going off track again (although Nehalem could still be too bloated to be cost-effective in mainstream CPUs).

Thursday, April 10, 2008

My CEO was all over Intel at today's financial analyst meeting. Intel has been just plain buttheads lately, saying things just to create FUD. The problem is that too many people just assume Intel will follow through, without any real understanding of Intel's strengths and weaknesses. Look at Intel's track record outside of x86 processors for PCs: DRAM (fail), flash memory (spin-off), i860 (fail), i960 (fail), 432 (fail), XScale (fail), Itanium (niche, never replaced x86 = fail), optical (sold off), supercomputers (fail), integrated graphics (crap to date), i740 discrete graphics (fail), bit slice (fail), LCOS (fail). The record speaks for itself - on a level playing field, Intel fails. Only when it has a proprietary volume product (x86 processors) does it succeed. Intel understands that now, so it's using x86 for everything - router SoC chips, applications processors (cell phones), embedded, and now graphics. The x86 is Intel's hammer, and every market now looks like a nail.

Monday, March 24, 2008

Barron's wrote a positive article about my employer, but Larrabee continues to be mentioned as a threat. I wish more analysts and reporters would take a wait-and-see attitude WRT Intel graphics. Let's face it, Intel's track record is poor in this area.

Monday, March 17, 2008

I really suck at posting to my blog. I just don't seem to have the time - or is it inclination? There's a lot of noise coming out of Intel on Larrabee and graphics, but so far it's just noise. Larrabee is the perfect chip - no benchmarks, only PowerPoint. Oh, I've been told they have these great simulators for Larrabee 2 and 3. Come on folks, why are you giving Intel so much credibility for something that doesn't exist yet? Sure, the silicon may work just fine; that is not the challenge. Intel has to take a completely non-traditional graphics architecture and make it work efficiently against NVIDIA and AMD/ATI architectures that have been finely tuned for years. Intel has a very long way to go in matching our experience in writing graphics drivers. Just look at the disaster that is Intel graphics today. Read the recently revealed Microsoft e-mails about the "Vista Capable" program, which allowed sub-standard Intel graphics to qualify as Vista compatible. Intel still can't ship DX10 drivers.

Here's a perfect example of why Intel is wasting its money and time building chipsets - we can do a much better job. But Intel wants to control the platform, even if that means consumers are stuck with inferior components. It's a crying shame.

Wednesday, February 27, 2008

It's nearly the end of February and I have been remiss in posting. It's been a busy month, along with a flu bug and some weekend trips.

I'll be catching up on all the action soon.

The most interesting news so far in 2008 has been the surprising Isaiah processor from VIA, and I really want to know more about Sun's Rock. AMD and Intel have been a bit boring of late, although some details of Intel's Silverthorne deserve further exploration (especially if it's evidence of Intel going backward on processor design). Remember folks, just because Intel has executed well over the last 2 years is no guarantee that it will continue to execute. Nehalem has a lot of potential to go off the track (big die, new interface, a more complicated microarchitecture, new on-die memory controller, etc.).

Sunday, January 27, 2008

Back again with some time to post. There's been some interesting news I'd like to cover.

First, Apple introduced new products at MacWorld. The most controversial product was the MacBook Air, the thinnest notebook shipping today (maybe the thinnest ever). I found the Air made too many compromises to be anything but an executive toy. I know something about sub-notebooks, having owned a Fujitsu P2040 with the Transmeta Crusoe processor and having recently purchased an ASUS Eee PC. Both of those laptops made compromises, but Apple chose style, keyboard, and a larger display over flexibility, connectivity, and expandability. While I will wait until I see one in person before casting my final opinion, I can't imagine that the one USB port on the Air is in any way sufficient for an external mic/headset, wired Ethernet, external storage, etc. Besides, the Air turns out to be Apple's slowest notebook and is using Intel's older 65nm processor, not the latest 45nm Penryn processor.

The other big news (as far as I'm concerned) is the announcement of VIA's "Isaiah" processor (also known as the CN processor). This is a major leap forward for VIA's Centaur design team. Glenn Henry announced the design at the 2004 Fall Microprocessor Forum, back when I was still at Microprocessor Report, and he gives a detailed description at ExtremeTech.com.

In looking over the wafer pictures, I figure the die is about 65mm2 in size, larger than I thought it would be back in 2004, but then I didn't expect VIA to put a 1MB L2 cache on die. I also expected VIA would put an on-die memory controller in the design, but they did not. I guess VIA prefers a clean division of labor - processors in Texas and chipsets in Taiwan.
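For anyone curious how that kind of eyeball estimate works, here's a back-of-the-envelope sketch (the 300mm wafer and the die counts below are illustrative assumptions, not my actual measurements):

# Rough die-size estimate from a wafer photo: count dies along the
# wafer's diameter and divide. Numbers here are hypothetical.
WAFER_DIAMETER_MM = 300
DIES_ACROSS = 37  # assumed count along the horizontal diameter
DIES_DOWN = 37    # assumed count along the vertical diameter

die_w = WAFER_DIAMETER_MM / DIES_ACROSS
die_h = WAFER_DIAMETER_MM / DIES_DOWN
print(f"~{die_w:.1f}mm x {die_h:.1f}mm = ~{die_w * die_h:.0f} mm^2")
# ~8.1mm x 8.1mm = ~66 mm^2, in the same ballpark as a 65mm2 figure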

Overall, the processor looks interesting, splitting the difference between the Core 2 processors (Penryn is 107mm2) and the upcoming UMPC/MID processor Silverthorne (at 25mm2). Performance should be right in the middle of those two Intel processor families. The question is: is there enough room for VIA to operate, or is it too tight a squeeze?

Sunday, January 06, 2008

I've had a very relaxing vacation with stays in San Francisco and Carmel Valley. Lots of movies, but none were really outstanding - just entertaining. Monday it's back to work.

CES kicked off with Bill Gates' keynote. The best part was the comic video of Bill's last day on the job done in the style of "The Office."

So far, none of the CES product announcements has excited me - plenty of new HDTVs, PCs, Cameras, media players, etc. I'll keep looking this week.

Friday, January 04, 2008

I just got back from a couple of days away from the Internet. While catching up on things, I found out that Om Malik had a heart attack last week. Get well Om - you'll have to make some changes in your lifestyle now.

With the holidays, I've been distracted and haven't posted. I'll be back soon as we get set for CES. I'm watching CES from a safe distance - San Jose. There will be enough reporters and bloggers and analysts there that I can cover more territory by reading their reports than I could on foot at the show.

I got an iPod Touch for Christmas and hope to spend more time with it. The interface is very interesting, but I see more work that needs to be done to make touch screens more usable - Apple's work is not done yet.