The pros and cons of an Apple-Intel divorce

There's no need for an OS X-iOS merger, but Apple could ditch Intel in its Macs -- or adopt Intel in the iPhone and iPad

Page 3 of 5

Until 2011, ARM didn't even have a 64-bit architecture. ARM's first 64-bit chips are still under development, expected to reach the market this year. They are intended for lower-power servers, which will allow comparisons with competing x86 processors. No company can defy the laws of physics; as ARM processors grow more powerful, they will inevitably need more juice. The critical factor is the power/performance ratio -- the amount of processing performed per watt. If there's hope for ARM to crack the desktop, it will come from delivering more performance per watt than Intel.
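The performance-per-watt metric described above is easy to make concrete. The sketch below uses invented chip names and numbers (not real benchmarks) to show how a slower chip can still win on efficiency:

```python
# Toy illustration of the power/performance ratio. All figures are
# hypothetical, chosen only to illustrate the arithmetic.

def perf_per_watt(score, watts):
    """Benchmark score divided by power draw, in points per watt."""
    return score / watts

# A fast but power-hungry x86-style part vs. a slower, frugal ARM-style part.
x86_efficiency = perf_per_watt(score=1000, watts=80)  # 12.5 points/watt
arm_efficiency = perf_per_watt(score=400, watts=20)   # 20.0 points/watt

# The chip with less raw performance still delivers more work per watt --
# the edge ARM would need to crack the desktop.
print(arm_efficiency > x86_efficiency)  # True
```

The point is that raw performance and efficiency are different contests; the hypothetical ARM part loses the first and wins the second.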

Intel's ace is superior manufacturing technology, which is about four years ahead of everyone else's. In the semiconductor industry, that's huge. Besides its lead in process geometry -- Intel is comfortably mass-producing chips in a 22-nanometer process while the rest of the industry is adopting 28nm -- Intel is using trigate (3D) transistors called FinFETs, whereas everyone else is still using planar (2D) transistors.

This manufacturing lead is an enormous advantage that ARM-based processors must try to overcome with superior design and efficiency. Yes, the ARM architecture is more efficient than x86 in some ways, but that advantage is slimmer than Intel's manufacturing edge.

Another factor is Intel's relentless design pipeline. The company introduces new or improved processors every year and rarely misses an announced production date. Any company hoping to compete with Intel must not only create a superior design, but also follow that chip with even better designs on a similar schedule. As AMD can tell you, it ain't easy. Although Apple has much more cash to spend than AMD, it's doubtful that Apple employs enough chip-design expertise to match Intel's aggressive pace.

Over time, switching the Macintosh from x86 to ARM could doom the computers to inferior performance. Apple remembers full well how the vaunted PowerPC chips it adopted in 1994 lost their lead over Intel's within a few years, as IBM and Motorola fell further and further behind Intel's aggressive x86 improvements. As a result, Apple dumped the PowerPC for x86 in 2005. (That's about when Apple stopped using "Power" in most of its product names and switched back to just "Mac.")

The fact is that Apple was able to move from Motorola 680x0 chips to the PowerPC in 1994 and from the PowerPC to x86 in 2005. Apple proved twice it could make a drastic forklift upgrade to its Mac hardware without wrecking its software ecosystem. So, today's speculators reason, why not again?

In those two big chip transitions, Apple prevailed by using nearly transparent emulation and other clever tricks, like "fat binaries" that bundled two versions of the same program in a single executable package, one for each processor. True, each transition took software developers on a wild ride, but most users found the switch tolerable and often seamless.
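The fat-binary trick can be sketched in miniature: ship one package that carries a payload per CPU architecture, and let the loader pick the matching one at launch. The Python sketch below is purely illustrative -- the function names and byte strings are invented, and real fat binaries were Mach-O files containing per-architecture code slices, not dictionaries:

```python
import platform

# Hypothetical "fat binary": one package, one code slice per architecture.
# The byte strings are placeholders, not real machine code.
fat_binary = {
    "ppc":    b"\x7c\x08\x02\xa6",  # stand-in for PowerPC machine code
    "x86_64": b"\x55\x48\x89\xe5",  # stand-in for x86-64 machine code
}

def select_slice(fat, arch=None):
    """Pick the slice matching the running CPU, as the OS loader did."""
    arch = arch or platform.machine().lower()
    if arch in fat:
        return fat[arch]
    raise RuntimeError(f"no slice for architecture {arch!r}")

# A PowerPC Mac would load the ppc slice; an Intel Mac, the x86_64 slice.
print(select_slice(fat_binary, arch="ppc") == fat_binary["ppc"])  # True
```

The design choice is what made the transitions tolerable: one download ran natively on either architecture, so users never had to know which chip their Mac contained.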

Still, the difficulty of swapping CPU architectures should not be underestimated. Although Apple can undoubtedly do it again, switching the Mac to ARM may not gain it the same advantages as previous switches. Remember that in the two previous switches, the old architecture was falling way behind the performance curve, but that's not so with the x86 today. Indeed, Intel is still leading the curve.
