Thursday, December 16, 2010

Yes, I've been neglecting my blog, with Twitter and Facebook stealing my time away from it. But I have more time as I'm taking some time between jobs. I've left NVIDIA and will be returning to Microprocessor Report, now in the hands of The Linley Group.

As I've told many people, I've always loved writing for the Microprocessor Report, but it was the corporate owners that were strangling the life out of it. Under the new ownership, things are really looking up and I'm happy to be returning - technically for the 4th time.

In the meantime, I'm having some downtime. Although I did create a huge list of things to do... I'm already way behind on it.
More to come...

Saturday, November 13, 2010

It's clear that the future of processor design has much less to do with raw performance and everything to do with power efficiency. Computing is now bound by issues of power constraints - be it a cellular handset or a supercomputer data center.

In this area, ARM has the advantage over x86. The issue will be whether Intel's superior process roadmap can overcome the inherent inefficiencies of the x86 legacy.

Monday, August 30, 2010

While my blogging has been quite sporadic, I did have the opportunity to blog on the NVIDIA corporate blog "Inner Geek". In my post "The Collectors Edition", I showed off a part of my collection of PC graphics cards dating back to the mid-90s.

Sunday, August 29, 2010

Microprocessor Startup Apocalypse?

This past week, I was a participant in the annual Hot Chips symposium at Stanford University. As part of the organizing committee, I help the conference with PR, including press and analysts. The conference is a mix of academic and commercial presentations, but the general leaning of the conference is more commercial – especially after the demise of the Microprocessor Forum.

This year’s attendance was up sharply from 2009 and pretty much matched 2007 and 2008. That’s a healthy sign for the business – companies are spending money to send people to conferences. But there’s a dark side to this year’s conference – paper submissions were down. Many of the papers this year came from stalwarts AMD, IBM, and Intel. There were a few new companies, but one presenter’s company (Tier Logic) had already been declared dead by Microprocessor Report (July 19th edition - $$).

And on the same day the conference kicked off, Jim Turley, editor in chief of Microprocessor Report, released an editorial on the futility of new microprocessor startups. With the title “How to Blow $100 Million,” the editorial develops a rule of thumb that a new microprocessor start-up needs about $100 million in order to fully launch the company. After rattling the idea off to a few people, the general consensus was that if the new microprocessor requires extensive tools support (compilers, debuggers, etc.), $100 million is not ridiculously high.

And there is the problem – how can you raise $100 million from VCs these days, when the risks are high, the returns questionable, and many other “sexier” markets (like green tech, social media, location-aware services) are out there? A look at recent microprocessor start-ups is not encouraging: Raza Micro (where I spent a few months) was sold to NetLogic, PA Semi was sold to Apple, Intrinsity was also sold to Apple, Montalvo was shuttered, Stream Processors Inc. (SPI) is no more, and the list goes on.
I find it sad and disheartening that the microprocessor business is undergoing such major consolidation and that innovative new architectures will likely not get funded. The next new architecture may come from emerging powers such as China. We watched the Chinese “Godson” processor evolve rapidly, and now we are talking about a processor with a wide SIMD unit, running over 1GHz, that soon could be the foundation of a major supercomputer. They even talk about an x86 emulation software layer supported by special instructions. With the support of the government and access to a competitive foundry, I wouldn’t underestimate where their program could be in a few more years.

Wednesday, May 26, 2010

Intel just announced that Larrabee is officially, really, NOT going to be released as a discrete GPU any time soon. Or for the next 5 years. Or more.

Why, you may ask? Well, because it made a really weak GPU that would probably have cost too much to manufacture and returned bad margins. How could the mighty and wise Intel have spent over three years working on this project and not know it would suck as a GPU? Well, it could be because they were not reading my blog post of December 5, 2007, where I said:

"Oh, and when did we all decide that x86 was the most perfect instruction set for graphics? Hmmmm, it's not. The example where the x86 is the preferred instruction set for either graphics or high performance computing doesn't exist. The x86 instruction set is a burden on the chip, not an advantage, for graphics. In fact, graphics instruction sets are hidden by the OpenGL and DirectX API, so the only people who care are the driver programmers and we are free to design whatever instruction set is most efficient for the task, while Intel has handcuffed themselves with a fixed instruction set. Not the smartest move for graphics. Intel's plans may work better for HPC though. Still, there are many alternatives to x86 that are actually better (shocking isn't it)!"

"Larrabee is based on a simplified x86 core with an extended SSE-like SIMD processing unit. In order to be efficient, the extended-SSE unit needs to be packed efficiently. How is Intel going to do that? The magic is all left to the software compiler, which is going have a tough time finding that much parallelism efficiently."

And that was what happened. In addition, with many coherent cores, the coherency traffic and stalls ate up a lot of bandwidth and limited scaling across cores.
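To see why leaving SIMD packing entirely to the compiler is risky, here's a toy sketch in C (my own example, not Larrabee code): an auto-vectorizer can pack the first loop's independent iterations into wide SIMD lanes, but the second loop's loop-carried dependence forces it back to scalar code, one lane at a time.

```c
/* Two loops with identical arithmetic. The first has independent
   iterations, so an auto-vectorizer can pack many of them into one
   wide SIMD instruction. The second carries a dependence from each
   iteration to the next, so it must run serially -- exactly the kind
   of code a "leave it to the compiler" strategy chokes on. */

void independent(float *a, const float *b, int n)
{
    for (int i = 0; i < n; i++)
        a[i] = a[i] * 2.0f + b[i];      /* vectorizable */
}

void dependent(float *a, const float *b, int n)
{
    for (int i = 1; i < n; i++)
        a[i] = a[i - 1] * 2.0f + b[i];  /* serial dependence chain */
}
```

Both compute the same per-element arithmetic; only the data dependence differs, and that alone decides whether the wide SIMD unit sits idle.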

My prediction back then gave Intel too much credit:
"But you heard it here first: Larrabee will not reach the potential performance capability (by a long shot) and will not displace NVIDIA as GPU leader."
Because Larrabee never even got to compete.

Oh and I still stand behind this quote:
"Let me also ask this question: when has Intel EVER produced a high performance graphic chip? (Answer: never)"

A hard lesson learned by the Intel guys. But I could have saved them all those millions of dollars. If only they'd listened. Sigh.

Wednesday, February 03, 2010

Today, Linley Group posted their latest newsletter and it was pretty interesting.

In addition to the nice things the Linley Group has to say about Tegra 2 (which BTW IS actually in production now), they also have some interesting conjecture about the Apple A4 processor used in the new iPad.

This may prove to be the most likely scenario:
"A third idea is that the A4 uses the 1GHz Cortex-A8 CPU known as Hummingbird, which is designed by Intrinsity and manufactured by Samsung. This choice would allow Apple to continue working with Samsung, a long-time Apple supplier that makes the Cortex-A8 processor for the iPhone 3GS. Staying with the Cortex-A8 would also simplify software development. Samsung announced that the Hummingbird CPU had already been validated in silicon last July, putting it on track to be production-ready in time for the iPad launch."

People who handled the iPad remarked that it was very snappy - but well-optimized software on a 1GHz ARM Cortex-A8 processor with an integrated memory controller could run pretty fast. The other speculation around the graphics core also seems fair: "Samsung typically uses Imagination's PowerVR cores, and Apple is an investor in Imagination." But I'm surprised that Apple was showing native iPhone games on the bigger iPad screen using pixel doubling but without any antialiasing. Reports are that it looked pretty bad.
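For readers who haven't seen it, "pixel doubling" is just a nearest-neighbor 2x upscale: every source pixel becomes a 2x2 block with no filtering, which is why edges look blocky without antialiasing. A minimal sketch (my own illustration, assuming 32-bit pixels, not Apple's actual scaler):

```c
#include <stdint.h>

/* Nearest-neighbor 2x upscale: copy each source pixel into a 2x2
   block of the destination. No filtering is applied, so stair-step
   edges in the source become twice as large and just as hard. */
void pixel_double(const uint32_t *src, int w, int h, uint32_t *dst)
{
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            uint32_t p = src[y * w + x];
            int dy = y * 2, dx = x * 2;
            dst[dy * (w * 2) + dx]           = p;
            dst[dy * (w * 2) + dx + 1]       = p;
            dst[(dy + 1) * (w * 2) + dx]     = p;
            dst[(dy + 1) * (w * 2) + dx + 1] = p;
        }
    }
}
```

A smarter scaler would blend neighboring pixels (bilinear filtering, or a proper antialiasing pass), which is presumably what people were hoping to see.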

Tuesday, February 02, 2010

As I mentioned yesterday, the iPad has kicked off some interesting discussions. Here's one I found particularly interesting because it referenced some very interesting posts:

I have already envisioned a great app for wine cellars. But I could use that camera :(
I expect there will be (external?) camera/video options eventually.

For all the complaints about Flash - remember Flash 10.1 is coming soon with GPU acceleration which will help make Flash a much better experience - except for the iPhone and iPad.

Monday, February 01, 2010

Apple’s iPad Announcement

There has been no end to the conversation surrounding the Apple iPad, officially announced on January 27. For months, even years, before Steve Jobs walked out on the stage last Wednesday at the Yerba Buena Center and let the proverbial cat out of the bag, there had been rumors of an Apple tablet, a device that would fit between an iPhone and a Mac PC. That device is called the iPad and it will ship in 60 days (or so Apple promises).

For all the conjecture about what this device could do, Wednesday’s announcement was a letdown, a revelation, and a relief. A relief, because there was now something substantial to talk about; a revelation for some, because it could lead the way for a new class of computing device; and a letdown, because the iPad couldn’t live up to everyone’s expectations and some vendors were left off the parts list. The reaction by press, bloggers, and anyone who cares has been all over the spectrum. Some people consider it deeply flawed; some consider it the next personal computing revolution.

Clearly this is only the 1.0 version; there’s plenty of room for growth and adaptation. A lot of the nitpicking revolves around limitations of the iPhone operating system v3.2, such as no multitasking, and other limitations such as no Adobe Flash support. Other hardware limitations come from Apple’s unique aesthetic (no USB ports, dongles for the flash card reader) and possibly limitations of Apple’s first home-grown processor (no full 1080p HD support). Some of the hardware limitations were obviously cost driven (the 1024x768 display resolution and no camera). Still, people who have touched the iPad have been impressed by the overall design, even if it doesn’t stray far from Apple’s design language.

The IPS screen with its color quality, LED backlight, and aspect ratio indicates that the iPad will be good for web browsing and digital magazines. It will also make a killer digital picture frame when it’s not on the go. For video – it will do well with standard definition, but as noted earlier it lacks the 1280 by 720 display required for entry-level HD quality. HD movies will either be letterboxed or cropped. In addition, the iPad dock does not support HDMI HD output (unlike the Microsoft Zune HD).
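The letterbox numbers are easy to work out (my own arithmetic, not anything Apple has published about its scaler): scaling 1280x720 video down to the 1024-pixel screen width gives a 1024x576 image, which leaves 96-pixel black bars above and below on the 768-pixel-tall panel.

```c
/* Integer letterbox math: scale a video of (video_w x video_h) to fit
   a screen of width screen_w while preserving aspect ratio. For 720p
   HD (1280x720) on the iPad's 1024x768 panel this yields a 1024x576
   image, leaving (768 - 576) / 2 = 96-pixel bars top and bottom. */
int letterbox_height(int video_w, int video_h, int screen_w)
{
    return video_h * screen_w / video_w;
}
```

Cropping instead would keep the full 768-pixel height but throw away the left and right edges of the 16:9 frame.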

While the tablet is often thought of as a “tweener device,” between a smartphone and a laptop, the tablet could wind up standing apart from both. As a home Internet access device, it is light enough to carry about easily and has the nearly instant-on characteristic of an appliance. The iPad will certainly be embraced by certain verticals such as home automation and home theater controllers.

With the 3G and optional Bluetooth keyboard, it can be sufficient for a road warrior. The access to textbooks could be a boon to students. The price point is just above netbooks, so it can appeal to many netbook users. While the iPad lacks full Mac OSX support, Apple is porting a subset of iWork to it that should be sufficient for most people. This product has the best chance to jump-start the tablet market.
But let us not forget there are significant limitations. The iPad (like its predecessor the iPhone) lacks Adobe Flash support, so some video sites (like Hulu) and Flash-based web content will not play on the iPad; maybe that will change.

Some of the best thinking about the new iPad has come from developers who are beginning to envision extending existing applications and developing new applications.

With the iPad opening the door, there are also going to be a wide range of competitors, with different strengths and weaknesses. Certainly the two leading software contenders to the iPad’s iPhone OS will be Google’s Android and Microsoft’s Win7. Android is making a considerable splash in smartphones and is being extended to include tablets. The Android application store is two years behind Apple’s, but is making quick strides in adding new applications. The Microsoft Windows 7 operating system offers true desktop-level support and applications. With Android and Windows 7, multitasking and Flash are supported, and there will be many different hardware platforms at many different price points.

Now the clock is ticking – Apple announced the iPad would be available 60 days from the announcement, and a few competitors are beginning to come out. Soon, we will be awash in tablets. I am going to follow this very closely, partly because my company NVIDIA has a horse in this race (the very capable Tegra 2) and partly because the tablet could be the most innovative computing product since the laptop. The tablet also brings to mind science-fiction visions of the future of computing. I will definitely be buying one or two tablets this year to learn how they fit into my life. It should make for an interesting year. I can hardly wait.

Monday, January 25, 2010

This Wednesday, Apple will announce something new. The bulk of the conjecture is that it will be a Tablet/slate computer. Every pundit has an opinion. I'm feeling left out of all the punditry.

So, here's my thoughts (not that you asked):
If there is an Apple Tablet/Slate (BTW, I have not seen a good definition of the category), it will be more like a large iPod Touch - ARM-based, designed for media consumption and web browsing. But the difference is that it will also have a cellular 3G modem and GPS, along with WiFi and Bluetooth.

What I'm not sure of is whether it will have a camera on the back, front, both, or neither. I can make arguments for all. A camera on the back will be useful for videos and augmented reality; a camera on the front for personal videoconferencing and video blogging.

As such, the platform has a lot of possibilities - mount in a car dock for navigation w/realtime traffic, music, email. For the car, it should have voice commands and text to voice. It will have digital magazines, books, as well as play video and music.

My bet is that it will be ~10" in size to maximize screen area while not making it too big to carry anywhere. I also expect a nice high-resolution, multi-touch display (more than 1024x600).

While this version may not have the processor being developed by the former PA Semi team, eventually, it will.

I'd really like to see this product be the beginning of a new future in computer UI. Something like the future we've seen in Star Trek and Minority Report. And that's why Wednesday will have everyone on the edge of their seat - will we be transported to a shiny new future, or will we just get a big iPod?