Before the iPad Pro in 2015, I felt that ARM wouldn't be able to seriously compete with x86 in any reasonable amount of time. Most ARM CPUs that I'd seen up to that point were ... struggling to compete with a Pentium 3 in terms of felt performance. Of course, this hadn't been true for some time, but the nature of mobile devices meant that I couldn't really see the performance that ARM chips could produce in any real way (except for handling modern web). The iPad Pro started getting some serious creative applications ported to it, and all of that changed.

I then bought a non-pro iPad, and I helped my son create a video for his theatre class with it. Editing and rendering were far faster than on my PC; outrageously faster. Dedicated logic for a task can make a very noticeable improvement in any machine.

The M1's 8-wide decode is a large advantage in terms of perceived compute power. Likewise, the massive out-of-order buffer helps quite a bit. Add to that ARM's fixed-length instructions, which decode simply and mostly execute with single-cycle throughput, and you get even more performance. Still, even with those design wins, x86 is competitive with the M1, and x86 can still beat it in raw compute power. Where Apple gained the bulk of its advantage is the SoC as a whole. Having so many dedicated ICs sharing the same package cuts latency, improves throughput, and offsets the inefficient complex instruction decode that CISC chips carry.
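A toy sketch of the decode-width point above (hypothetical illustration, not actual decoder logic): with a fixed-length encoding, every decoder slot can compute its instruction's start address independently, while a variable-length encoding like x86's creates a serial dependency, since instruction N's start depends on the lengths of all prior instructions.

```python
def fixed_width_boundaries(code: bytes, width: int = 4) -> list[int]:
    """Fixed-length ISA (e.g. AArch64): slot i starts at i * width,
    so all boundaries are known in parallel, with no decoding needed."""
    return list(range(0, len(code), width))

def variable_width_boundaries(lengths: list[int]) -> list[int]:
    """Variable-length ISA (x86-style): each start position depends on
    the decoded lengths of every earlier instruction, a serial chain."""
    starts, pos = [], 0
    for n in lengths:
        starts.append(pos)
        pos += n
    return starts

# 16 bytes of fixed 4-byte instructions: boundaries fall out immediately.
print(fixed_width_boundaries(bytes(16)))       # [0, 4, 8, 12]
# Four x86-style instructions of 1, 3, 2, and 5 bytes: must walk the chain.
print(variable_width_boundaries([1, 3, 2, 5]))  # [0, 1, 4, 6]
```

This is why going 8-wide is cheap for the M1's front end but expensive for x86, where wide decoders must speculatively guess instruction boundaries.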

In the end, I think that x86 may last a very long time indeed, but it will need to adopt some of the M1 SoC design cues of insanely high integration. I personally feel that the future will hold many incompatible SoCs for which code will need to be optimized, or for which specialized binary translators like Rosetta 2 will be needed. Apple's M1 is not a pure ARM play, and x86 need not be a pure play either (it actually already isn't, given on-die GPUs).

Given recent changes to ARM's licensing model, I actually think that RISC-V will see more investment, more optimization, and wider adoption, and it may ultimately win the race for ISA dominance. It still won't matter too much, though. The platform is no longer merely a CPU but instead many special-purpose ICs plus a CPU smashed into a single package.

Thanks for the article! I'm curious where the number 60 years came from in the line "I think it’s safe to say that we will still be able to buy x86 compatible machines in at the least sixty years time." (If I'm understanding correctly, it's only been around 44 years, and I would have naively guessed that computer hardware will change a lot more in the next 40 years compared to the previous 40.)