By Doug Ennis

Splitting from Intel: Is MacBook just a Big iPad now?


It’s no secret that Apple has wanted to use its own silicon in MacBooks, as it already does in the iPhone, iPad, and its other mobile products. Apple’s legacy has always been steeped in the proprietary realm - does anyone remember Lightning cables? And that is just recent memory. Apple has always loved proprietary connectors, hamstrung hardware upgrades, and software updates that send older hardware into oblivion. Yet, 15 years ago, Steve Jobs ditched PowerPC processors for Intel-based ones… and Apple was not the only one…


In those 15 years, Intel-based processors started popping up everywhere. The Cloud, a term we’re all familiar with, is just a computer running elsewhere - and that computer is overwhelmingly running on Intel processors. Azure, AWS, Google? Doesn’t matter; they’re all built on the Intel processor stack. IBM, Sun, HP, and others that had been making their own processors have since ditched them in favor of x86 architectures. If you see the “SD” (Software Defined) label that’s popping up all over the place, it’s an indicator that something is virtualized. The history of virtualization, the Cloud, and Intel (x86) processors is all linked.


The 8086 processor was created in 1978, and its descendants today represent roughly half of all processors made. The x86-based systems are CISC (Complex Instruction Set Computing), have upgradable processors, and support a wide range of memory, graphics, and storage. The range of these processors is vast, covering low-power systems to high-capacity servers. This flexibility drove market share, market share made x86 the de facto standard, and standardization fueled the software-defined age. This age birthed virtualization and the cloud, and made our computers Swiss Army knives of capabilities and purposes.


So why opt out now?


Occupying 2nd place in the processor market are ARM (Advanced RISC Machine) based processors. RISC (Reduced Instruction Set Computing) is a younger sibling of CISC: smaller, highly optimized, and excellent at low power consumption. ARM chips, unlike the x86 architecture, are integrated. They’re also called SoCs (System on Chip), meaning the CPU, memory, interfaces, and other components are all integrated in the silicon. This provides limited upgradability, yet better power consumption. This integration and lower power draw have made ARM ideal in mobile devices and in GPUs (Graphics Processing Units), where multiple processors are packed densely for graphics work. This is where Apple is putting its chips. ARM itself is a UK-based technology provider that designs and licenses processor technology to a variety of companies.
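The x86-vs-ARM split above is something software actually has to deal with: an app (or installer) often needs to know which family it’s running on to pick the right native binary. A minimal sketch of that check, assuming a hypothetical `arch_family` helper of my own (the mapping below is illustrative, not exhaustive):

```python
import platform

def arch_family(machine=None):
    """Classify an OS-reported machine identifier into a coarse
    architecture family: x86 (CISC) vs ARM (RISC).

    `machine` defaults to this host's identifier as reported by
    platform.machine(); the mapping is an illustrative assumption.
    """
    machine = (machine or platform.machine()).lower()
    if machine in ("x86_64", "amd64", "i386", "i686"):
        return "x86 (CISC)"
    if machine.startswith(("arm", "aarch")):
        return "ARM (RISC)"
    return "other ({})".format(machine)

# An Intel MacBook reports x86_64; an M1 MacBook reports arm64.
print(arch_family("x86_64"))  # → x86 (CISC)
print(arch_family("arm64"))   # → ARM (RISC)
```

This is the same distinction Rosetta-style translation layers have to bridge when x86-only software runs on an ARM machine.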

Apple’s M1 chip is ARM (RISC) based, like its mobile devices. The first M1 MacBooks have been on the lower end of performance, memory, and features, leaving the higher-end MacBooks still utilizing Intel i7 processors. Early reports have shown that battery life is improved, yet performance and compatibility are limited. 2021 will bring the next phase of M1 chips, promising more performance to challenge the i7, yet with improved power utilization. While short-term gains in power consumption are desirable, the long-term effects won’t be known for several years.


Is there a need to move off an old standard? 42 years is a pretty good run, right? As of today, the x86 architecture is the glue that connects our businesses, organizations, and groups. However, Apple’s M1 is only one ARM challenger. Microsoft is co-developing an ARM-based processor with Qualcomm to be utilized in some Azure services and its Surface laptops. Amazon has developed its own ARM processor, though it’s only a small part of its infrastructure currently. These are a couple of big names, but the apex predator of the ARM game is Nvidia. You might know them as the graphics card company synonymous with video gaming. Nvidia is in the middle of acquiring ARM, a deal that is being protested by Qualcomm and Intel - shockingly.


Should I avoid an M1? The short answer is no; however, I’d set your expectations realistically. Realize that you’re trading compatibility, upgradability, and top-end performance for battery life. Personally, I’ve made this tradeoff before on phones and with a Microsoft Surface. I was about to write “laptop” or “notebook” next to “Surface,” yet that Surface was neither of those, as it’s based on Qualcomm’s Snapdragon processor. Portable, lightweight, great battery life - but there was something missing. It didn’t perform as well as expected; was that my fault? Maybe my hopes were too high. It didn’t support a peripheral or two that I had; was that a deal breaker? No, it just wasn’t the Swiss Army knife notebook that I had come to expect. Have I thrown it away? No, not even close. But it didn’t replace my notebook either.

It comes down to use - what is your intended use case? If you’re utilizing web-based applications and some well-established desktop applications, and battery life is key, then this is a very easy sacrifice to make and a worthwhile choice. But if you need the performance and versatility that modern computing has to offer, you would be better off sticking with the tried-and-true Intel x86 platform. Apple is taking a gamble by moving off the x86 platform, and time will tell whether the risk pays off. In the meantime, we can’t help but question the logic behind making the MacBook more similar to the iPad in terms of functionality.
