Tag Archives: intel

On Apple moving Mac from x86 to ARM

Bloomberg is reporting that Apple is considering moving the Mac computer line off of Intel x86 CPUs to in-house-developed ARM CPUs. I have my doubts.

I’m sure it is true that they are considering it, but that doesn’t mean it will happen any time soon. They considered moving to Intel, on and off, for the better part of a decade, and they didn’t make their move until their old CPU choice, the PowerPC line, had fallen behind and showed no signs of recovering.

For the time being, Intel seems to be on the right track. No one offers a faster or more power-efficient desktop/laptop CPU. Intel was apparently slower than Apple would have liked in driving power consumption down or graphics performance up, but they seem to have changed their tune. They’ve accelerated their pace of GPU performance improvement, a move that some have attributed to Apple’s demands. They’ve also become increasingly serious about low-power chips for both mobile devices and “ultrabooks” (the Windows version of the MacBook Air). Also, given that so much of their market these days is low-margin, low-price Windows PCs, I’d think they’d be eager for the opportunity to keep a customer who can afford to pay a premium for the best Intel has to offer.

It is true that Apple does have its own in-house chip-design talent working on ARM CPUs, and it might make sense to leverage their efforts over more of Apple’s product line, but right now Mac volumes are a fraction of iOS device volumes, so it isn’t clear it would be a big win, especially since their laptops and desktops currently occupy a performance and power-consumption envelope that is significantly different from that of iOS devices.

In the long term, though, I think it is pretty likely that Apple will move off of Intel. I think that in many ways, the writing has been on the wall for Intel for a long time. Their competitive position has been eroding, and the pace is going to accelerate if they fail to get some major mobile-phone and/or tablet design wins in the next year or so. It will be further hastened if high-performance ARM CPUs start putting pressure on their margins and volumes in the server space. At some point, Apple will have to switch horses, and if they time it just right, they might actually gain some competitive advantage by weakening competitors who are less prepared and less able to abandon Intel.

I’ve never been great at estimating when the inevitable will finally come to pass, but I can’t imagine it will be less than 3 years before Apple would make such an announcement, and probably more like 4-6 before they have to.

On Intel’s Decline

The latest installment in a theme I’ve been iterating on for a while now: The Decline of Intel.

This was posted as a comment on a recent post by Jean-Louis Gassée on the subject of CPUs and SoCs:

Intel’s strategic position has been eroding since 3dfx came along 15 years ago. From that point on, every PC sold included a bunch of transistors that were paying off the amortization on someone else’s big fab investment. Intel is finally clawing some of that territory back, but I have my doubts that it is going to be enough.

The bottom line is that Intel’s competitive position depends on its ability to invest ahead of competitors in next-gen fab technology. It can afford to do so because of its dominant position in desktop and server CPUs and the healthy margins that brings.

The problem for Intel is that each new fab generation is more expensive than the last and doubles the number of transistors they can produce. Together, these mean they have to sell more than twice as many transistors to maintain their margins. History suggests this has been a long-term challenge: their average selling price has declined over the last decade in order to balance the supply/demand equation.
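To make the scaling argument concrete, here is a toy back-of-envelope model. Every number in it (fab cost, cost growth, transistor output) is an assumption picked purely for illustration, not an actual Intel figure:

```python
# Toy model of the fab-economics argument. All numbers are
# illustrative assumptions, not real Intel figures.
fab_cost = 5e9        # cost of the current-generation fab, dollars (assumed)
cost_growth = 1.3     # next-generation fab costs ~30% more (assumed)
transistors = 1.0e18  # transistors the current fab can produce (assumed)

# Each new node roughly doubles the transistors a fab can turn out.
next_fab_cost = fab_cost * cost_growth
next_transistors = 2 * transistors

cost_now = fab_cost / transistors
cost_next = next_fab_cost / next_transistors

# Unit cost falls (0.65x with these numbers), but only if the fab runs
# full: the doubled supply has to be sold, so either demand more than
# doubles or the average selling price per transistor drops.
print(cost_next / cost_now)
```

The point of the sketch: the per-transistor cost advantage of a new node only materializes if the doubled output actually sells, which is exactly the demand problem described above.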

They’ve been clever about maximizing their ROI. They use lower-margin CPUs to keep the latest-generation fabs full, underwriting the cost of fabbing their higher-margin server CPUs. They maximize the productive life of their older fabs by using them to build support chips (this is why Intel encroached on the x86 chipset makers in the mid-2000s). Unfortunately, this model has its own limits. One of the ways they’ve been managing to schlep transistors is by integrating more and more features directly onto the CPU die, but by doing so, they undermine their own support-chip business, cutting into opportunities to continue to extract revenue from their older fabs.

Which brings us to mobile. Mobile devices can drive a lot of volume, but they don’t drive a lot of revenue. Apple’s new A6 SoC is roughly the same size as one of Intel’s low-end i3 CPUs, but Intel sells the i3 for 5-10x what Apple, or any other mobile vendor, is likely to pay for a cutting-edge ARM SoC.
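A rough wafer-economics sketch shows why that price gap matters: the same silicon area yields very different revenue. The die size, yield factor, and both prices below are assumptions chosen for illustration, not real pricing data:

```python
import math

# Back-of-envelope wafer economics; every number below is an assumed,
# illustrative value, not actual pricing or die-size data.
die_area_mm2 = 100.0                       # roughly A6-class / low-end i3-class die (assumed)
wafer_area_mm2 = math.pi * (300 / 2) ** 2  # standard 300mm wafer
usable_fraction = 0.85                     # edge-loss/yield fudge factor (assumed)

dies_per_wafer = int(wafer_area_mm2 * usable_fraction / die_area_mm2)

i3_price = 120.0      # assumed selling price of a low-end i3
arm_soc_price = 20.0  # assumed price a vendor pays per ARM SoC

# Same wafer, same die count -- the revenue gap is just the price ratio.
i3_revenue = dies_per_wafer * i3_price
arm_revenue = dies_per_wafer * arm_soc_price
print(i3_revenue / arm_revenue)  # 6.0 with these assumed prices
```

With these made-up numbers the gap lands at 6x, squarely in the 5-10x range claimed above: filling a fab with mobile SoCs instead of i3s forfeits most of the revenue per wafer.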

It appears that Intel may finally have achieved the power/performance ratio needed to play in mobile phones, but it will probably be at least another year before they even have a chance of having design wins that pay off in significant volume. And even if they do, their growth is limited. Samsung is a major player in phones, and they tend to favor their own SoCs. Apple is the other big player, and they have obviously made their own bet. That leaves Qualcomm’s market share for Intel. I expect that will be a tough fight: Qualcomm will integrate the SoC with the baseband, and they have a lot of patents to bring to that fight. And then there are all the other ARM licensees. It is crazy watching the evolution of the ARM SoCs going into cheap Chinese Android tablets.

I just don’t see a big opportunity for Intel. They have a narrow window to gain any sort of real foothold, and the territory they can gain is unlikely to be enough to hold back the tide of ARM licensees which will start eating into their server revenue.

Intel is vertically integrated around the design, fabrication and marketing of CPUs and related components. The advantages of that strategy are in decline. For chips, that seems to be giving way to merchant fabs, which can get the best ROI on their fab investment by leaving the design and marketing of chips to other companies, like Apple, who are vertically integrated around their end users, and for whom designing their own SoCs allows them to best serve their customers and drive economies of scale.

Intel’s Foundry Business

Intel is starting to offer 3rd parties the opportunity to build chips on their cutting-edge “fabs.”  I’m not surprised to see this happening. With each new generation of fab, Intel has more transistors they have to sell in order to recoup their costs and get an ROI. Intel themselves haven’t been that great at creating products to sell those additional transistors. If they are going to keep investing in their process leadership, they have to find a way to pay for it. If they don’t keep investing in their process leadership, they are going to be disrupted by ARM and its licensees.

Intel is opening up its manufacturing facilities to third parties, as it takes the further tentative steps toward building a chip-to-order foundry business. The microprocessor giant announced last year that it would build FPGAs for Achronix Semiconductor, and on Tuesday a second FPGA designer, Tabula, said that it would have its chips built by Intel.

In its announcement, Tabula emphasized that it would be using Intel’s cutting-edge 22nm process with 3D trigate transistors. Intel’s manufacturing capabilities are world-leading, with none of the established microprocessor foundries—including TSMC, UMC, and AMD spin-off GlobalFoundries—able to match the company’s process.

Compared to the 28 and 32nm processes offered by the competition, Intel’s 22nm process should offer higher speeds with lower power usage, at lower cost. The company will start shipping its first 22nm x86 processors, codenamed Ivy Bridge, in the coming months.

via Ars Technica.

Whether they can actually sell enough of their capacity without opening their fabs to competitors remains to be seen. At some level FPGAs already compete with Intel’s products, in that they take a different approach to creating general-purpose chips that can be used for a variety of applications.

“Real” OS X Apps on the iPad, Are You Crazy?!?!?

I’m posting this from my iPhone, so I can’t be bothered to name names, but trust me when I tell you, there are a lot of people criticising Apple for basing the iPad off the iPhone OS, rather than making it capable of running Macintosh applications by using a touch-enabled version of the OS X that runs on Mac desktops and laptops. I understand the impulse, but these people are either crazy, or they just aren’t thinking things through.

Let’s get one thing out of the way right up front. There is just no damn way to profitably sell a device that matches the price, performance and portability of the iPad that also runs existing Mac apps. This will change in the future, but it’s just not something that can happen this year.

There are many reasons for this, but it really starts with the fact that modern Macs use Intel CPUs, and Intel CPUs just aren’t as power-efficient for a given level of performance as the ARM CPUs in the iPad, iPhone and other mobile devices. So, an iPad that runs existing Mac apps would have to have an Intel CPU, and so would have shorter battery life, which hurts portability. This could be compensated for with a larger battery, but a larger battery would be heavier, which hurts portability, and would add expense to a device that would already cost more, because Intel CPUs cost more than ARM CPUs.

So on the hardware side, price and weight are two strikes against an iPad that runs Mac apps, but that is only half the problem. The bigger problem is software. Mac apps are built to run on fairly powerful computers that are capable of multitasking and that people interact with using a mouse and keyboard. There are a few implications of this.

First, there is a good chance that existing OS X apps, designed to run on powerful hardware, would be both slow and power-hungry on an iPad, because performance and efficiency that are acceptable on an iMac or MacBook could be completely unacceptable on an iPad. This is on top of the inherent power-efficiency disadvantages of the compatible hardware.

An even bigger problem is that the user interface would be ill-suited to the iPad. There are no apps for the Mac designed for the type of interaction the iPad supports. On the other hand, there are hundreds of thousands of iPhone apps that will run on the iPad. They were designed for touch interaction, so their UIs are more likely to translate to the iPad, which is in many ways an iPhone with a larger screen, than UIs designed for mouse and keyboard interaction. Perhaps less obvious, but at least as important, developers of iPhone apps are likely to be further down the learning curve for multitouch UIs than developers of “desktop” Mac applications, so they will likely be able to release versions of their apps that take advantage of the possibilities of the iPad’s larger screen sooner than Mac developers could.

Further, all of those apps were designed for a device that is even more constrained in terms of power consumption and performance than the iPad. If anything, they should run better on the iPad than on the iPhone.

I understand why people would like the iPad to run OS X apps. People conflate running Mac OS X applications with an “openness” that we don’t get when iPhone and iPad apps all have to go through the iTunes App Store approval process. But let’s face it, an iPad that ran Mac apps would come into this world some combination of slower, more expensive, less portable (in terms of both weight and battery life) and with far fewer good-quality multitouch apps. I don’t think anyone really wants that, so rather than asking for it, let’s focus narrowly on the real issue, because while it may be unlikely that Apple opens the iPad to apps distributed through channels other than the App Store, it is at least possible. An iPad that ships in 60 days, runs Mac apps, and doesn’t suck is pretty much impossible.

ARM vs Intel-lovers

Fascinating to see how myopic people are.  I was going through the forums on Ars Technica and came across a thread titled “why are upcoming ARM netbooks hyped so much? Noone wants ARM.”  The thread kicked off with this:

This is getting really silly. Why does the IT media seem so obsessed with hyping upcoming ARM-based notebooks? You’d think that the netbook manufacturers actually learned and took a few hints from their first few releases, most notabling that: the market at large wants to run full blown Windows on their notebooks. Linux is a niche and a small one at that. ARM CPUs can run Linux / WinCE / Android, neither of these 3 are “full blown Windows”, therefor ARM based netbooks are doomed to fail by default.

There were arguments both ways, but I was struck, again, by how invested some people get in defending their captors.  I ended up posting the following as a comment.

“Noone” wants ARM? How about this: I know a bunch of people who don’t care if their netbook has Intel or Windows inside, because everything they use it for happens in a web browser. Cheap, compact, and long battery life? That’s something they actually want.

And don’t forget all the people in developing markets who either do no computing at all, or are already ARM users because their primary access to computing is through a mobile phone. An ARM netbook would be a nice step up. Not that they have any particular affection for ARM, unless, of course, it’s made by a local ARM licensee.

As for all you “IT” guys who feel your pants tighten when you think about upgrading your ESXi cluster to some new Nehalems, there are guys who are responsible for more virtual-machine instance hours in a day than you will be in your whole lifetime, swapping fantasies with guys whose applications won’t run on the biggest, baddest Nehalem box, much less a long aisle of racks stuffed full of Sandy Bridges. Those guys are fantasizing about how much power and money they can save with assloads of systems built on upcoming high-volume, low-cost ARM SoCs. Those guys are going to put nine out of ten of you Windows-cleaners out of work if you cling to the old way of doing things for too long.

And “chipguy,” don’t forget, Intel got its start in the microprocessor business supplying chips for desktop calculators. Read some Clayton Christensen. Modest beginnings have led to big things, and toppled old empires in the tech business again and again. No reason to think it won’t happen again.

Intel has to invest a lot of capital to hold on to its lead. I don’t know about you, but in the long run, I’d bet on a competitive ecosystem over a command-economy for the efficient allocation of capital.