I love the Chip Letter, but confess to a sizable backlog. Nevertheless, I read the recent MIPS post immediately. How could I not? I joined MIPS as VP of Engineering in 2004, not long after the company exited its custom and semi-custom 64-bit CPU business and refocused on 32-bit processor IP.
The problem had been that games and printers are classic “razor blade” business models - making money on game or ink cartridges almost regardless of the CPU cost. During MIPS’ extended focus on 64-bit, well after its spin-out from SGI, the upstart Arm moved steadily along their 32-bit roadmap until they dominated the far broader embedded market and, crucially, won the smartphone (which created a wide moat once the third-party app store developed).
When we (MIPS) refocused on 32-bit IP we could beat, sometimes easily beat, Arm’s performance metrics, but by then that wasn’t enough to dislodge Arm; the unit-shipment and revenue gap grew inexorably. Today, I’m delighted to see MIPS’ outstanding RTL retargeted and reincarnated as RISC-V. As many of you know, the base RISC-V ISA is MIPS-I with minor tweaks; however, the vastly different RISC-V business model offers real opportunity at MIPS’ new home, GlobalFoundries.
The MIPS history deserves a full book. Meanwhile - shameless plug - you might enjoy the story of my time at MIPS: Section 5 of Silicon Planet (Amazon).
—Pat Hays
Or RISC-V is RISC-I with minor tweaks. Potato potahto.
Granted, the original RISC (and SPARC) had those ill-fated register windows, but still.
That's good insight. As a software guy on the razor-blade side of the game consoles, I didn't even stop to question whether MIPS being so far out in front with 64-bit was anything other than an advantage, but in hindsight it makes total sense that it was a mistake. 64-bit was great for SGI, but had a lot of knock-on costs for anything else.
Another important ARM advantage was the higher code density of Thumb mode, with its 16-bit instructions, an advantage shared with the Hitachi SH processor line. The same C source code compiled for the SH-2 (Sega Saturn) was around 70% of the size of the R3000 PlayStation version, so the 2 MB of work RAM in both machines went farther with the SH. Nintendo rode that to the limit with the ARM7TDMI in the Game Boy Advance, even using a 16-bit bus to the cartridge slot for big cost-of-goods savings. As long as you avoided "traditional" 32-bit-wide ARM instructions, it was all good.
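As a rough sanity check on that ~70% figure (my arithmetic, not measurements from either console): 16-bit encodings halve the bytes per instruction but usually cost some extra instructions. Assuming, purely for illustration, 1.4x as many 16-bit instructions for the same C code:

```latex
\frac{\text{SH-2 size}}{\text{R3000 size}}
  \approx \frac{2\ \text{bytes} \times 1.4N}{4\ \text{bytes} \times N} = 0.70
```

The 1.4x inflation factor is back-solved to match the observation; anything in the 1.2x-1.6x range keeps the compressed build at roughly 60-80% of the 32-bit one.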
Yes! The ARM7TDMI (with Thumb) is an important case in point. It was introduced in 1993, but it wasn’t until 1995 that the MIPS camp (LSI Logic) reacted with its own 3-stage pipeline. Another year passed before LSI’s TinyRISC introduced a compressed mode (remarkably similar to Thumb, both using Jump-and-link to effect the mode switch).
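For readers who haven’t met these mode switches: in both families the compressed/full-width state rides along with the branch target, with bit 0 of the address selecting the instruction set (ARM’s BX works this way, as do the MIPS16 jump-register forms). A minimal sketch of that convention, with made-up addresses:

```c
#include <stdint.h>
#include <stdio.h>

/* Bit 0 of a branch target selects the compressed instruction set
 * (Thumb on ARM, MIPS16 on MIPS); the actual fetch address is the
 * target with that bit cleared. Addresses below are illustrative. */
int main(void) {
    uint32_t targets[] = { 0x00400120u, 0x00400121u };

    for (int i = 0; i < 2; i++) {
        uint32_t fetch = targets[i] & ~1u;        /* where fetch resumes */
        int compressed = (int)(targets[i] & 1u);  /* 1 = 16-bit mode */
        printf("jump to 0x%08X in %s mode\n", (unsigned)fetch,
               compressed ? "compressed (16-bit)" : "full-width (32-bit)");
    }
    return 0;
}
```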
5-stage pipelines have better performance per unit of area and power than 3-stage pipelines, BUT if the small additional power required by the 5-stager simply wasn’t available, the customer had to choose the ARM7. By the time MIPS spun out of SGI in 1998, Arm had moved to 5 stages (ARM9) and meanwhile had mopped up a huge percentage of the embedded processing markets.
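To see why stage count matters so much, a crude cycle-time model helps: splitting roughly the same logic across more stages shortens the critical path, at the cost of extra pipeline registers. With assumed, illustrative delays (not MIPS or Arm data) of 10 ns total combinational logic and 0.5 ns register overhead per stage:

```latex
t_{\mathrm{cyc}} \approx \frac{T_{\mathrm{logic}}}{n} + t_{\mathrm{reg}},
\qquad
n=3:\ \tfrac{10}{3} + 0.5 \approx 3.8\,\mathrm{ns}\ (\approx 260\,\mathrm{MHz}),
\qquad
n=5:\ \tfrac{10}{5} + 0.5 = 2.5\,\mathrm{ns}\ (400\,\mathrm{MHz})
```

The roughly 50% frequency gain costs two extra ranks of clocked registers plus hazard/forwarding logic, which is exactly the "small, additional power" an ARM7-class power budget often couldn't spare.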
I’m passionate about this topic because I think MIPS’ focus on 64-bit through their first four or more years as an IP licensing company sacrificed enormous advantages: their computer-science-driven architecture, >100 patents, and >$100M in cash from the 1998 IPO. During my tenure, outstanding people worked their asses off but were only able to delay the inevitable.
“The CPU would version of NEC’s…”
Should that be “was a” or “would be a”?
Fascinating, and I love the history here. MIPS was actually the first (and only) instruction set I played around with, back when I begged a computer-architecture engineer to help me learn low-level assembly language.
He suggested it since everything else (x86, ARM, RISC-V, whatever) was more complicated... and I was never going to program in any of these professionally anyway, so it didn't matter how industry-practical my choice was.
I played around with it for a few weeks; it never ended up being practically useful, of course, but it was still fun to understand conceptually what's under the hood.
Last semester I took a graduate-level Computer Architecture course. It happened to be the last semester the class would use MIPS. Going forward, it will be taught using RISC-V.
The department had no MIPS hardware for us to use, so we ran everything on an emulator. That won’t be the case going forward with RISC-V.
I think, when you get down to it, MIPS ultimately failed because they weren't really doing anything that unique. Even pretty early on they had competitors in the same spaces with similar efforts, like SPARC, POWER, and Alpha, not to mention many other less successful workstation-oriented RISC architectures.
ARM on the other hand made an early push in mobile/low-power in a very big way.