Before the iPad Pro in 2015, I felt that ARM wouldn't be able to seriously compete with x86 in any reasonable amount of time. Most ARM CPUs I'd seen up to that point were ... struggling to compete with a Pentium 3 in terms of perceived performance. Of course, that hadn't been true for some time, but the nature of mobile devices meant I couldn't really experience the performance ARM chips could produce in any real way (except in handling the modern web). Then the iPad Pro started getting serious creative applications ported to it, and all of that changed.
I then bought a non-pro iPad and helped my son create a video for his theatre class with it. Editing and rendering were far faster than on my PC; outrageously faster. Dedicated logic for a task can make a very noticeable improvement in any machine.
The M1's 8-wide decode is a large advantage in terms of perceived compute power. Likewise, the massive out-of-order buffer helps quite a bit. Then, if we add that ARM's fixed-length instructions are simple to decode in parallel, keeping that wide front end fed, you get even more performance. Still, even with those design wins, x86 is competitive with the M1, and x86 can still beat it in raw compute power. Where Apple gained the bulk of its advantage is the SoC as a whole. Having so many dedicated ICs share the same package cuts latency, improves throughput, and can even mask the inefficient decode of complex instructions that CISC chips suffer from.
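To make the decode point concrete, here's a toy Python sketch (my own illustration, not from the article or any real decoder): with a fixed-length encoding, the starts of the next eight instructions are known immediately and can be decoded in parallel, while a variable-length encoding like x86's forces each instruction's start to wait on the previous instruction's length.

```python
# Toy illustration (not a real decoder) of why fixed-length instructions
# make a wide front end, like the M1's 8-wide decoder, easier to build.

FIXED_WIDTH = 4  # ARM64 instructions are always 4 bytes

def decode_starts_fixed(pc, width=8):
    # With fixed-length encoding, the starts of the next `width`
    # instructions are known immediately, so all of them can be
    # handed to parallel decoders in the same cycle.
    return [pc + i * FIXED_WIDTH for i in range(width)]

def decode_starts_variable(code, pc, width=8):
    # With variable-length encoding (x86 instructions run 1 to 15 bytes),
    # instruction N+1's start isn't known until instruction N has been
    # at least length-decoded: a serial dependency in the front end.
    starts = []
    for _ in range(width):
        if pc >= len(code):
            break
        starts.append(pc)
        pc += max(code[pc], 1)  # toy convention: first byte is the length
    return starts

print(decode_starts_fixed(0x1000))
print(decode_starts_variable(bytes([2, 0, 3, 0, 0, 1, 4, 0, 0, 0]), 0))
```

Real x86 front ends attack this with predecode logic and length markers in the instruction cache, but that serial dependency is why going 8-wide is so much harder there.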
In the end, I think that x86 may last a very long time indeed, but it will need to adopt some of the M1's SoC design cues of insanely high integration. I personally feel that the future will hold many incompatible SoCs for which code will need to be optimized, or for which binary translation layers like Rosetta 2 will be needed. Apple's M1 is not a pure ARM play, and x86 need not be a pure play either (it already isn't, given on-die GPUs).
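Since Rosetta 2 came up: it's an ahead-of-time binary translator rather than a compiler in the usual sense. Here's a minimal Python sketch of the idea, with made-up instruction names on both sides (nothing here reflects Apple's actual implementation):

```python
# Toy sketch of ahead-of-time binary translation in the spirit of
# Rosetta 2. The "ISAs" are invented; a real translator must also
# handle registers, flags, and memory-ordering semantics.

# One complex source instruction may expand into several simple ones.
TRANSLATION_TABLE = {
    # a memory-to-register add becomes a load plus a register add
    "add r1, [r2]": ["ldr tmp, [r2]", "add r1, r1, tmp"],
    "mov r1, r2":   ["mov r1, r2"],
}

def translate(program):
    out = []
    for insn in program:
        out.extend(TRANSLATION_TABLE.get(insn, ["; untranslated: " + insn]))
    return out

# Translate once, up front, then run the result natively.
print("\n".join(translate(["mov r1, r2", "add r1, [r2]"])))
```

The point is that translation happens once, before execution, and one complex instruction can expand into several simple ones; the hard parts in practice (flags, memory ordering, self-modifying code) are exactly what this toy ignores.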
Given the recent changes to ARM's licensing model, I actually think that RISC-V will see more investment, more optimization, and wider adoption, and may ultimately win the race for ISA dominance. It still won't matter too much, though. The platform is no longer merely a CPU but many special-purpose ICs plus a CPU smashed into a single package.
Thanks for a great comment! I absolutely agree that x86 could in principle remain competitive for a very long time. I think the biggest challenge will come in the cloud, from firms looking to lower costs and power consumption. Once there is less revenue to support x86 development, the architecture will start to lose ground.
I think your point about customers buying an SoC is really good. Perhaps x86 will ultimately need to be licensed to third parties; Intel offering more flexibility in its foundry business is a step in this direction.
As for RISC-V, I'm a bit less bullish short term, but long term there are so many advantages to an open-source, license-free ISA.
AMD is doing well at lowering power consumption, but Intel isn’t. If AMD can keep cutting power and raising performance with their advanced packaging schemes, I think they could do well.
As for licensing, AMD has already done so in China. Intel may do likewise if they continue to slip on market share and profits.
RISC-V is already showing up in non-consumer-facing applications (even within Apple), and I expect that to continue, and to accelerate. ARM’s licensing changes will, imho, only increase the RISC-V adoption rate.
Thanks for the article! I'm curious where the number 60 came from in the line "I think it’s safe to say that we will still be able to buy x86 compatible machines in at the least sixty years time." (If I'm understanding correctly, it's only been around 44 years, and I would have naively guessed that computer hardware will change a lot more in the next 40 years than in the previous 40.)
Hi, and thanks so much for the comment. It's really a guess, based on the fact that we can still buy 6502s today, 40 years after it was a leading CPU. Even if it's not a mainstream product, people will still want to run their x86 code on real hardware, so I think there will still be a market.
That's always assuming that something doesn't make all our computers obsolete in the meantime!
The 8088 was available new from Intel until 1998, and the 486 only recently stopped manufacture. In industrial use, hardware lifetimes are crazy long; the same goes for deployments in data centers, power plants, and so on. With the unbelievably large install base of x86, I would expect x86-compatible chips to be in use for at least 40 more years, possibly 100. The determining factors are (1) when the last large new install/deployment of an x86 chip happens, (2) what the support contract dictates, (3) whether or not the contract requires a second source, (4) how many enthusiasts still want x86, and (5) whether or not there is some breakthrough in materials, logic, or fabrication in the meantime.
If we’re being completely honest with ourselves, while we nerdy folk crave new ISAs and chips for our own nerdy delight, the greater market doesn’t much care as long as the thing performs and runs software people care about. Given that, x86 could be around until CMOS ends.