EntroPC
Ruminations on technology by Kevin Krewell
Thursday, October 18, 2012
The reason I awoke this blog is that yesterday I pre-ordered the Microsoft Surface tablet, and I want to explain why I think Windows RT is an important development and a potential competitor to Apple’s iPad AND to Windows 8 laptops and tablets. Yes, Microsoft is competing not just with Apple; it is also competing with itself. I think this is a good thing.
I chose the Surface tablet for a number of reasons, one of which is that it runs on the Tegra 3 processor from my former employer Nvidia. It’s not that Tegra 3 is perfect – it’s built in an older semiconductor process, uses an older ARM core (Cortex-A9), and its GPU is barely capable of supporting Windows 8/RT requirements – but Nvidia does know how to build Windows drivers like few other companies can. I also expect to see some well-optimized games.
The main reason I chose Microsoft’s Surface is that Microsoft is building it. It HAS to be the premier example of what a Windows RT tablet should be. I expect updates will be more stable and comprehensive when everyone in the company has one (or so I’ve heard).
An exciting part of Windows RT is that it is a slimmed down version of Windows 8, without a lot of the legacy baggage the full version drags along. It’s a real reboot of the Microsoft OS legacy. The touch interface is new for Microsoft, so I expect it will evolve over the next few years, but it’s exciting to see touch become more ubiquitous.
From a user point of view, I hope it becomes my trusty ultra-mobile companion that can perform work functions and also entertain me. Its traveling companion will be either the MacBook Air, for more work-oriented requirements, or the Nexus 7 for personal travel.
At 1.5 lbs, it’s about the same weight as the original iPad, but it’s thinner and longer with a higher-resolution display. It’s not third-gen iPad Retina resolution, but then, what other tablet or PC offers that level of crazy-high screen resolution? Amazingly, it uses mostly open standards for I/O: USB 2.0 and microSDXC. Plugging in a microSDXC card gives users an easy path to storage expansion. How radical and useful is that! Why won’t Apple do the same? Unfortunately, it does require a proprietary dongle for VGA or HDMI, which I will probably get.
With the USB connection, I can plug in PC peripherals, and there might be a driver available (although not likely at first). But certainly I can connect a keyboard, a mouse, or even an Xbox controller (I’m not completely sure about the last one). And here’s another difference with the iPad – Windows RT supports native mouse control, while iOS does not. For Office applications, the touch screen may not be precise enough for some work, and a mouse is best. With Bluetooth support, I can use a cheap Bluetooth travel mouse.
I believe the price point is about right. At $499, the tablet is far cheaper than any Ultrabook, but it is still $200 more expensive than a netbook. Having used the MacBook Air, I can say the advantages of an SSD in the Surface tablet are worth the extra bucks. The closest competition will be the other Windows RT tablets and the Clover Trail-based Windows 8 tablets. So far the Windows 8 tablets are generally $100+ more, and they don’t include Office 2013 RT. The retail cost to buy Office for a Windows 8 tablet is ~$100 additional, although we could see some discount bundles. For companies with a corporate Office license, the bundled Office 2013 RT may not be considered worth anything, but to a small-business user, consumer, or student, it does bring real value.
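To make the math concrete, here’s a quick back-of-the-envelope sketch. The Windows 8 tablet price below is an assumption based on “generally $100+ more” – actual street prices will vary.

```c
#include <stdio.h>

int main(void) {
    /* Figures from the post; the Windows 8 tablet price is assumed
       ("$100+ more" than the $499 Surface), not a quoted price. */
    int surface_with_office = 499;  /* Surface RT, Office 2013 RT included */
    int win8_tablet = 599;          /* assumed Clover Trail tablet, no Office */
    int office_retail = 100;        /* "~$100 additional" at retail */

    int win8_total = win8_tablet + office_retail;
    printf("Surface (Office included): $%d\n", surface_with_office);
    printf("Windows 8 tablet + Office: $%d\n", win8_total);
    printf("Effective gap:             $%d\n", win8_total - surface_with_office);
    return 0;
}
```

Counting Office, the effective gap is closer to $200 than $100 – which is why the bundle matters.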
I hope you can understand why I’m excited about Surface and Windows RT. I hope others are as well, and that it does not turn into another Zune HD, where a superior piece of hardware was left to wither and die. I expect that within a year I will be regretting being an early adopter, because there will be better tablets with better processors next year, but I also learn much by getting in early, and who knows, this might be considered a seminal product years from now.
Tuesday, October 16, 2012
Well, after nearly two years, I’m back with some updates; I’ve updated my devices and introduction. I hadn’t updated EntroPC since I left Nvidia and joined The Linley Group. I guess Twitter and Facebook remain the enemy of blogging. But I keep Facebook for personal connections, and Twitter’s 140-character limit cuts short any long-form opinion.
My reason for revisiting is that I just pre-ordered the Microsoft Surface tablet and I wanted to talk more about it. Stay tuned.
Thursday, December 16, 2010
Yes, I've been neglecting my blog, with Twitter and Facebook stealing my time away from it. But I have more time now, as I'm taking some time off between jobs. I've left NVIDIA and will be returning to Microprocessor Report, now in the hands of The Linley Group.
As I've told many people, I've always loved writing for the Microprocessor Report, but it was the corporate owners that were strangling the life out of it. Under the new ownership, things are really looking up and I'm happy to be returning - technically for the 4th time.
In the meantime, I'm having some downtime. Although, I did create this huge list of things to do...I'm way behind on that list.
More to come...
Saturday, November 13, 2010
It's clear that the future of processor design has much less to do with raw performance and everything to do with power efficiency. Computing is now bound by issues of power constraints - be it a cellular handset or a supercomputer data center.
In this area, ARM has the advantage over x86. The issue will be whether Intel's superior process roadmap can overcome the inherent inefficiencies of the x86 legacy.
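To see why performance per watt, not raw performance, is the binding metric, here’s a minimal sketch under a fixed power budget. All chip figures below are made up purely for illustration – they are not real ARM or x86 numbers.

```c
#include <stdio.h>

int main(void) {
    /* A fixed power envelope, e.g. a handset's thermal budget. */
    double budget_watts = 2.0;

    /* Two hypothetical cores: one faster, one more efficient. */
    double fast_gops = 4.0, fast_watts = 4.0;  /* 1.0 GOPS per watt */
    double lean_gops = 1.5, lean_watts = 1.0;  /* 1.5 GOPS per watt */

    /* Under a power cap, deliverable performance is budget * (perf/watt),
       so the efficient core wins despite lower raw performance. */
    printf("fast core under cap: %.1f GOPS\n", budget_watts * fast_gops / fast_watts);
    printf("lean core under cap: %.1f GOPS\n", budget_watts * lean_gops / lean_watts);
    return 0;
}
```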
Monday, August 30, 2010
While my blogging has been quite sporadic, I did have the opportunity to blog on the NVIDIA corporate blog "Inner Geek". In my post "The Collectors Edition", I showed off part of my collection of PC graphics cards dating back to the mid-90s.
Sunday, August 29, 2010
Microprocessor Startup Apocalypse?
This past week, I was a participant in the annual Hot Chips symposium at Stanford University. As part of the organizing committee, I help the conference with PR, including press and analyst relations. The conference is a mix of academic and commercial presentations, but the general leaning of the conference is more commercial – especially after the demise of the Microprocessor Forum.
This year’s attendance was up sharply from 2009 and pretty much matched 2007 and 2008. That’s a healthy sign for the business – companies are spending money to send people to conferences. But there was a dark side to this year’s conference – paper submissions were down. Many of the papers this year came from stalwarts AMD, IBM, and Intel. There were a few new companies, but one presenter’s company (Tier Logic) had already been declared dead by Microprocessor Report (July 19th edition – $$).
And on the same day the conference kicked off, Jim Turley, editor in chief of Microprocessor Report, released an editorial on the futility of new microprocessor startups. Under the title “How to Blow $100 Million,” the editorial develops a rule of thumb that a new microprocessor start-up needs about $100 million in order to fully launch the company. After I rattled the idea off a few people, the general consensus was that if the new microprocessor requires extensive tools support (compilers, debuggers, etc.), $100 million is not ridiculously high.
And there is the problem – how can you raise $100 million from VCs these days, when the risks are high, the returns questionable, and many other “sexier” markets (like green tech, social media, and location-aware services) are out there? A look at recent microprocessor start-ups is not encouraging: Raza Micro (where I spent a few months) was sold to NetLogic, PA Semi was sold to Apple, Intrinsity was also sold to Apple, Montalvo was shuttered, Stream Processors Inc. (SPI) is no more, and the list goes on.
I find it sad and disheartening that the microprocessor business is undergoing such major consolidation and that innovative new architectures will likely not get funded. The next new architecture may come from emerging powers such as China. We have watched the Chinese “Godson” processor evolve rapidly; today we are talking about a processor with a wide SIMD unit, running at over 1GHz, that could soon be the foundation of a major supercomputer. There is even talk of an x86 emulation software layer supported by special instructions. With the support of the government and access to a competitive foundry, I wouldn’t underestimate where that program could be in a few more years.
Wednesday, May 26, 2010
Intel just announced that Larrabee is officially, really, NOT going to be released as a discrete GPU any time soon. Or for the next 5 years. Or more.
Why, you may ask? Well, because it made a really weak GPU that would probably have cost too much to manufacture and returned bad margins. How could the mighty and wise Intel have spent over three years working on this project and not know it would suck as a GPU? Well, it could be because they were not reading my blog post of December 5, 2007, where I said:
"Oh, and when did we all decide that x86 was the most perfect instruction set for graphics? Hmmmm, it's not. The example where the x86 is the preferred instruction set for either graphics or high performance computing doesn't exist. The x86 instruction is an burden on the chip, not an advantage, for graphics. In fact, graphics instruction sets are hidden by the OpenGL and DirectX API, so the only people who care are the driver programmers and we are free to design whatever instruction set is most efficient for the task, while Intel has handcuffed themselves with a fixed instruction set. Not the smartest move for graphics. Intel's plans may work better for HPC though. Still, there are many alternatives to x86 that are actually better (shocking isn't it)!"
"Larrabee is based on a simplified x86 core with an extended SSE-like SIMD processing unit. In order to be efficient, the extended-SSE unit needs to be packed efficiently. How is Intel going to do that? The magic is all left to the software compiler, which is going have a tough time finding that much parallelism efficiently."
And that was what happened. In addition, with so many coherent cores, the coherency traffic and stalls eat up a lot of bandwidth and limit scaling across cores.
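To make the compiler-packing problem concrete, here's a minimal sketch in generic C (not Larrabee code) of the two kinds of loops involved. The first vectorizes trivially; the second has the gather-and-divergence shape of real shader work, where auto-vectorizers struggle:

```c
#include <stddef.h>

/* Easy case: a compiler can pack this straight-line loop into SIMD lanes. */
void scale(float *dst, const float *src, float k, size_t n) {
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i] * k;
}

/* Shader-like case: data-dependent branches and gather-style indexing.
   Each "lane" wants to follow its own path, so the compiler must mask,
   blend, and gather -- and the packing efficiency falls off quickly. */
void shade(float *dst, const float *tex, const int *idx,
           const float *t, size_t n) {
    for (size_t i = 0; i < n; i++) {
        float v = tex[idx[i]];   /* gather: non-contiguous loads */
        if (t[i] > 0.5f)         /* divergent branch per element */
            v = v * t[i] + 1.0f;
        else
            v = v * 0.25f;
        dst[i] = v;
    }
}
```

And the coherency cost can be seen in miniature, too. In this hypothetical demo, two threads update independent counters that happen to share a cache line, so the line ping-pongs between cores and serializes what looks like parallel work; padding each counter onto its own line removes the traffic. Scale that effect up to dozens of coherent cores and you get the bandwidth and scaling problem described above:

```c
#include <pthread.h>
#include <stdio.h>

#define ITERS 100000000L

/* a and b are logically independent but sit on the same cache line,
   so each write forces the line to migrate between the two cores. */
struct { volatile long a, b; } shared;

static void *bump_a(void *arg) {
    (void)arg;
    for (long i = 0; i < ITERS; i++) shared.a++;
    return NULL;
}

static void *bump_b(void *arg) {
    (void)arg;
    for (long i = 0; i < ITERS; i++) shared.b++;
    return NULL;
}

int main(void) {
    pthread_t t1, t2;
    pthread_create(&t1, NULL, bump_a, NULL);
    pthread_create(&t2, NULL, bump_b, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    /* Inserting ~64 bytes of padding between a and b makes this run
       dramatically faster on most multicore machines. */
    printf("a=%ld b=%ld\n", shared.a, shared.b);
    return 0;
}
```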
My prediction back then gave Intel too much credit:
"But you heard it hear first: Larrabee will not reach the potential performance capability (by a long shot) and will not displace NVIDIA as GPU leader."
Because Larrabee never even got to compete.
Oh and I still stand behind this quote:
"Let me also ask this question: when has Intel EVER produced a high performance graphic chip? (Answer: never)"
A hard lesson learned by the Intel guys. But I could have saved them all those many millions of dollars. If only they'd listened. Sigh.