8 Comments

Thanks for this. One thing you didn't touch on is how consumer demand affects creating a new ISA. Case in point: the transition from x86 to HP and Intel's EPIC/IA-64/Itanium ("Itanic") would have meant customers losing much of their software investment (the previous-investment trap?) or having that investment relegated to an x86 compatibility mode on the Itanium chip. Customers did not want that. AMD64 was released and widely (wildly?) adopted instead of Itanium because its backward compatibility let customers keep their existing software.

Sales of Itanium were meager at best; it is an ISA that flopped hard. The last Itanium systems were released in 2017 and vendor support ended in 2021. The last Oracle SPARC systems were also released in 2017, although support for existing SPARC systems is projected to run until 2034.

Great comment! Absolutely agree. Over time we're accumulating more and more software for the mainstream architectures - most especially x86/AMD64 - so it gets harder and harder to abandon them.

In that context, I'm really intrigued by what Apple has done with Rosetta 2 and its ahead-of-time (AOT) compilation of x86-64 code to AArch64. It seems like such a good product - it even copes with JIT-generated code, for example - that it now looks like an obvious answer to the challenge of moving on from x86.
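For anyone curious what that looks like in miniature, here's a deliberately toy sketch in Python. The register mapping and the trap to a runtime translator are invented for illustration (Rosetta 2 works on real machine code, not text mnemonics, and its actual pipeline is far more sophisticated), but it shows the shape of AOT translation with a runtime fallback for JIT-generated code:

```python
# Toy sketch of ahead-of-time binary translation, x86-64 -> AArch64.
# Everything here is illustrative: a real translator decodes machine
# code, rebuilds control flow, and handles flags and ABI details.

# Hypothetical register mapping (not Rosetta 2's actual scheme).
REG = {"rax": "x0", "rdi": "x1", "rsi": "x2"}

def translate_insn(insn: str) -> str:
    """Translate one simple x86-64 instruction if we can, else trap."""
    op, _, rest = insn.partition(" ")
    regs = [r.strip() for r in rest.split(",")] if rest else []
    if op == "mov" and len(regs) == 2 and all(r in REG for r in regs):
        return f"mov {REG[regs[0]]}, {REG[regs[1]]}"
    if op == "add" and len(regs) == 2 and all(r in REG for r in regs):
        # AArch64 add is three-operand: destination, source1, source2.
        return f"add {REG[regs[0]]}, {REG[regs[0]]}, {REG[regs[1]]}"
    if op == "ret":
        return "ret"
    # Anything we can't translate statically - e.g. indirect jumps into
    # pages a JIT will write at runtime - traps to a runtime translator.
    return f"bl runtime_translate   // was: {insn}"

def translate_static(x86_code):
    """The AOT pass: translate everything visible before execution."""
    return [translate_insn(i) for i in x86_code]

print("\n".join(translate_static(
    ["mov rax, rdi", "add rax, rsi", "jmp [r11]", "ret"])))
```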

Loved your thoughts! What do you think about the Mill computing architecture? I'm sure they plan on licensing it rather than selling chips, which would make it very difficult for them to get a foothold in the market, but it certainly has some unique design ideas.

Thanks so much. I think the Mill is interesting - it's on my (very long!) list of architectures to write about (possibly 2026!?). I do think RISC-V will make it harder and harder for new architectures to get traction - a bit like Linux did for new operating systems - unless they offer demonstrably better performance in a new area such as machine learning (like the TPU, etc.).

That makes a lot of sense, as I guess most of the benefit of an architecture comes from it being widely used and supported - not from it being technically superior.

I don't think it would only find success in new areas, though: if it's significantly faster it would be great for HPC (where non-standard ISAs are already in use), but I could also see it in an ML chip like you mentioned.

Interesting article, however... by paragraph eight I was practically screaming in my head, "Have you never heard of quantum computing?!"

Talk about a radically new future computing paradigm that's guaranteed to be the genesis of a raft of new ISAs.

In fact, I'm thinking this will start the cycle all over again, with each player creating a new ISA that will fight it out in the marketplace with the others until they're whittled down to fewer than a handful.

Who knows? By then we may be working on the next thing beyond that and starting the cycle over once more.

Hi, you're right, but I had quantum in the category of accelerators, like GPUs. Is it really a replacement for the general-purpose von Neumann CPU? Interested to hear your thoughts!

Not right now, no. But in the future, who knows? I'd be surprised if quantum computer development doesn't continually push it into more and more areas of traditional computing.

I'm not sure I'd eliminate GPUs either, since they are full-fledged processors in their own right, albeit with a design skewed towards floating-point and vector-math performance.
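As a tiny illustration of that skew towards data-parallel floating-point work, here's the shape of computation GPUs are built for, sketched with NumPy: one operation applied across a whole vector at once versus a scalar loop. The analogy is loose - NumPy here still runs on the CPU - but the "one operation, many elements" shape is the point.

```python
import numpy as np

# The workload GPUs are skewed towards: the same floating-point
# operation applied independently across a large vector of data.
n = 100_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)

# Scalar style: one element per step, the shape of a classic CPU loop.
out_scalar = np.empty_like(a)
for i in range(n):
    out_scalar[i] = a[i] * 2.0 + b[i]

# Data-parallel style: one expression over every element at once,
# the shape of work a GPU spreads across thousands of lanes.
out_vector = a * 2.0 + b

assert np.allclose(out_scalar, out_vector)
```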

Your overall idea is interesting, but I don't think we're there yet, or that RISC-V will be that final ISA. Don't get me wrong: I'm a huge fan of RISC-V; I just don't see it as the final ISA.

I think we're more likely to see a future with an open standard that software is written to, with each CPU architecture translating or recompiling that standard to native code. Think POSIX meets OpenCL meets Java bytecode meets Transmeta meets JIT.
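That last idea is easy to sketch. Below is a minimal stack-based bytecode interpreter in Python - the opcodes and encoding are invented here for illustration, a miniature stand-in for something like Java bytecode. The bytecode is the stable, architecture-neutral contract; any vendor could interpret it as below, JIT it, or AOT-compile it to their native ISA, Transmeta-style.

```python
# Minimal stack-based bytecode: an architecture-neutral contract that
# each platform is free to interpret, JIT, or recompile to native code.
# Opcodes are invented for this sketch.

PUSH, ADD, MUL, PRINT, HALT = range(5)

def interpret(program):
    """Reference interpreter: runs identically on any host ISA."""
    stack, pc = [], 0
    while True:
        op = program[pc]
        pc += 1
        if op == PUSH:
            stack.append(program[pc])
            pc += 1
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == MUL:
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == PRINT:
            print(stack[-1])
        elif op == HALT:
            return

# (2 + 3) * 4 -> prints 20, the same on every host architecture.
interpret([PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT, HALT])
```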
