I read this soon after it was published. I started university, doing computer science, in February 1981. I saw the book in the "new arrivals" section of the university book store, glanced through, and instantly bought it. I've had to buy several more copies over the years when I've loaned mine to people and it hasn't come back.
We had DEC machines at university: a PDP-11/70 was the main computer when I arrived, supplemented in 1982 by a VAX-11/780 and then, I think the year after, by a VAX-11/750. There was a PDP-11/34 with 256 KB RAM, 22 terminals, and two DECwriter dot matrix printers in the 1st year students' lab. In 1980 I was still at high school, but I visited a friend who had skipped 7th form and gone straight to university.
At the end of my 4th year (halfway through a two-year Master's) I got a summer holiday job working jointly for the business school's finance professor and one of his star ex-students, then heading the research department at a large (by NZ standards) stockbroking company. I split my time between the university in Hamilton, using an MV/8000 at the nearby technical institute (via modem on a leased line), and the stockbroking company in Wellington, which had the bigger MV/10000. I also spent time in the DG office, evaluating their database products and whether we should buy (for ~$20k, I think) their COBOL, FORTRAN, or PL/I compiler for my work. I settled on PL/I as it was, if you squinted just right, able to substitute for Pascal or C.
I ended up not going back to university and worked at that and other stockbroking firms, on DG computers (and Macs connected to them), for the next decade.
Skip forward twenty-five years: since 2017 I've been heavily involved in the new RISC-V microcomputers and microcontrollers. I've been working with CPU core designers in a kind of 'Hardy Boys' and 'Microkids' experience, except that instead of microcode (which does not exist in RISC-V) I've been on the system software and runtime library side. RISC-V software such as the SBI, running in Machine mode (more privileged than the operating system), is often used to do the same kinds of things that microcode was used for on older designs ... but using normal RISC instructions.
I worked on the committees designing the RISC-V Vector extension (RVV), the BitManip extension, and the cache-control extensions. Designing these new instructions involves a lot of the same trade-offs between doing something in hardware vs doing it in software that you see in "Soul". As one simple example, suppose you want to preload data in a range of addresses into cache from main memory. You could have an instruction that takes an upper and lower limit and have the hardware loop over all addresses between them. Or you can provide an OS or SBI function with the same arguments, but have it do a software loop, with each iteration loading one cache line of data.
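To make the software side of that trade-off concrete, here is a minimal C sketch of such a range-preload loop. The 64-byte line size is an assumption, and GCC/Clang's __builtin_prefetch stands in for whatever per-line cache-control instruction the platform actually provides; a real SBI or OS routine would look much the same.

```c
/* Sketch: preload [lo, hi) into cache one line at a time.
 * CACHE_LINE and __builtin_prefetch are illustrative stand-ins;
 * an SBI/OS version would issue one cache-control or load
 * instruction per line instead. */
#include <stdint.h>

#define CACHE_LINE 64u  /* assumed line size */

void preload_range(const void *lo, const void *hi)
{
    /* Round down to a line boundary, then touch each line once. */
    for (uintptr_t p = (uintptr_t)lo & ~(uintptr_t)(CACHE_LINE - 1);
         p < (uintptr_t)hi;
         p += CACHE_LINE) {
        __builtin_prefetch((const void *)p, 0 /* read */, 3 /* keep */);
    }
}
```

The hardware-instruction alternative buries this same loop inside the machine; in "Soul"-era terms, the question is whether the loop lives in microcode and silicon or in privileged software.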
One interesting thing on the RVV committee was that one of the other members was Steve Wallach, who had been "Manager Advanced Development - Eclipse Systems" at Data General from 1975-1980 ... and is featured in "Soul".
So, in general, I'd say "Soul of a New Machine" was an inspiration to me and got me interested in low-level computer design, which I've read about and studied ever since (e.g. the comp.arch group on Usenet, from as soon as I had access to the internet in 1989) -- though it took until thirty-five years later for me to find work designing computers and instruction sets rather than merely using them.
Hi Bruce! That's terrific, thanks so much for sharing. It's wonderful that you've been able to get so involved in RISC-V after reading 'The Soul' when it first came out. Interesting that Steve Wallach was also still involved in RISC-V development. It might be interesting to trace the impact of those who featured in the book.
Yup. You can check the paragraph just above this bookmark:
https://github.com/riscvarchive/riscv-v-spec/blob/master/v-spec.adoc#changes-from-v1-0
I was able to slip a casual "we don't want to put a bag on the side of ..." into RVV committee conversation at one point. I don't know how many got the reference but Steve picked it, as hoped.
Thanks so much. Gosh that's a distinguished list. 😊
I think that's a great strength of RISC-V. It's not just half a dozen people in a company in a smoky back room making something up and pushing it out the door under deadline pressure. RISC-V is attracting a huge breadth and depth of knowledge and experience from both industry and academia around the world. A great early example was the RVWMO memory model published in 2018.
Some of the specs (including Vector and Hypervisor, for example) have come frustratingly slowly. We thought we were virtually done with RVV in May 2019 when draft 0.7.1 was published (two months later T-Head announced the C906 and C910 cores implementing it), but it ended up taking another two and a half years. XTHeadVector (aka RVV 0.7.1) is a fine vector ISA, one of the best there is, but there is no question that RVV 1.0 is a significantly better ISA.
I think it's very interesting to go back and revisit the early things that Krste and co were saying, and to compare them with how things have gone so far.
Here, for example, is a talk Krste gave at Stanford in late 2014:
https://www.youtube.com/watch?v=vB0FC1DZqUM
This panel discussion in November 2018 is also worth watching (I was fortunate enough to be there on a visit to the US):
https://www.youtube.com/watch?v=xoHsl2p2R_c
Thanks! I've seen Krste's talk before but not the panel discussion. I'll watch with interest.
I worked for Tom West when Soul was published. He was quite a character. When I first met him, he was firing his secretary because she had done his expense report for his last trip to Japan, had gotten the yen/dollar conversion backwards, and had him owing DG hundreds of thousands of dollars. He didn't suffer fools gladly. I was a software manager in a hardware group, and we got along well. Those were fun times.
Thanks Dave! That’s quite an error so I’m not surprised he was mad. Do you remember what his reaction was to the book when it came out?
It's been a long time, but if I remember correctly, the Atlantic ran a long-form article based on the book prior to the book's publication. I remember him commenting that it was like a public psychoanalysis. I think one phrase that was used was "Machiavellian Prince of Darkness", which bothered him. He was a fairly private person and wasn't comfortable with the fame in the beginning.
I joined DEC's TLE (Technical Languages and Environments) compiler group in mid-1978, just as the first release of the VAX-11/780 and VMS V1.0 were wrapping up. Three years later, "Soul of a New Machine" came out. Nearly everyone I knew at DEC's Spit Brook Rd facility, where I worked, read it. When I read it, my reaction was "Yup, this captures much of the spirit of the world I choose to inhabit". Since my parents had so little sense of that world, I gave them a copy of the book that Christmas.
Hi John, That's fascinating. I did wonder whether there was anything in the book that might have been useful for competitors, which I guess meant DEC really.
Also not sure what it did for DG's recruitment program in the following years.
When I arrived at DEC in Summer '77, DG was a failing company. (At least that was DEC's in-house attitude.) The VAX-11/780 shipped two and a half years ahead of DG's MV/8000 and was an instant success, so much so that the company was too busy shipping 11/780s as fast as it could make them, and developing the follow-on 11/750 and 11/730, to worry about anything DG might be doing. We knew we had a "tiger by the tail". The further we got from Fall 1977, the less we looked over our shoulders. Tom West's project was a skunkworks effort that I doubt anyone at DEC knew existed.
I know nothing about DG recruitment.
Thanks for the blast from the past! It was a good book, and it spawned some imitations. Kidder also did some more tech journalism.
At the time I had been using the Nova and Eclipse for a couple of years, though now all I can remember of them is that the Eclipse had a load instruction which automatically followed pointer indirections; I think it was intended to optimize LISP. The book was more of an enduring lesson about machine design and teams. It was an interesting contrast to The Mythical Man-Month. Good times!
Glad it brought back some memories. Do you remember any of the imitators? Might be interesting to dig out.
Really enjoying the thread below too!
There were a bunch of companies with their own minicomputer designs in the 1970s. CTL was a UK peer of DEC; Telefunken in Germany had one, and I think France did as well, though I do not remember the name. None of them were imitators: in those days it was relatively easy to build machines, lots of academics published good ideas, and most of these designs were independent. Machines popped up for niche markets: Plessey had a nice fault-tolerant machine to control central switches in the telephone network, and there were several designs used in aircraft. I was really interested in CPU designs back then and did not notice anyone copying another wholesale, though many good ideas got borrowed. Lots of ego involved. There were a few licensed manufacturers, and of course behind the Iron Curtain they tried copying IBM several times. I worked for a while with a Bulgarian computer architect who had defected; he said it was terrible because all the bright engineers were allowed to do was reverse-engineer the documents the spies brought in, so their talents were wasted.
Even IBM had a minicomputer in the 1970s. As an intern at IBM I went to a meeting where everyone was coached on why the S/7 was better than the PDP-11, so I suppose there was market chasing if not actual copying. They never made a 32-bit version; even the later Series/1 was pegged at 16 bits.
IIRC the Nova could do infinite levels of indirection on jumps, but the Eclipse added it on data loads and stores.
I never really understood what you'd want it for, but yes, it could be useful for temporary forwarding pointers from oldspace to newspace in a copying GC, while a collection was in progress and an object had been moved but not all references to it had been updated yet.
But that doesn't need infinite levels, just one.
Ah, it was jumps on the Nova. Thanks for clarifying. I was remembering that it ended when you reached a zero, and could not figure out how that worked for data. The Eclipse version with data must have been slightly different. After 45 years the headline remains but the details need some prompting.
Lisp back then was built on pointer-value pairs and a lot of fetches. A lot of algorithms made sense back then, when memory was in short supply but reading memory was as fast as the CPU and pipelines were short. Lisp free space was just another list and fragmentation was not a worry; objects were not contiguous and did not need to be moved. Now it is second nature to do things like copying collectors and to compile Lisp (or Julia, which is Lisp in very nice disguise) cleverly so that objects are contiguous. There is no going back; it is just a historical curiosity. Variations in how technology scales have ripped that fabric apart, so the old approach makes no sense now.
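To illustrate the "free space was just another list" point, here is a minimal C sketch of that style of heap, where every object is one fixed-size cell and allocation is a pop from the free list. The names and sizes are illustrative assumptions, not from any particular Lisp implementation.

```c
/* Toy old-style Lisp heap: cells are pointer pairs, and free space
 * is itself just a linked list of cells threaded through the cdr
 * field. Allocation is "pop one cell"; since every object is exactly
 * one cell, nothing fragments and nothing needs to move. */
#include <stddef.h>

typedef struct cell {
    struct cell *car, *cdr;   /* the pointer-value pair */
} cell;

#define HEAP_CELLS 4096
static cell heap[HEAP_CELLS];
static cell *freelist;

void init_heap(void)
{
    /* Thread every cell onto the free list via its cdr. */
    for (size_t i = 0; i + 1 < HEAP_CELLS; i++)
        heap[i].cdr = &heap[i + 1];
    heap[HEAP_CELLS - 1].cdr = NULL;
    freelist = &heap[0];
}

cell *cons(cell *car, cell *cdr)
{
    cell *c = freelist;       /* allocation is just a pop ... */
    if (!c) return NULL;      /* (a real Lisp would GC here) */
    freelist = c->cdr;
    c->car = car;
    c->cdr = cdr;
    return c;
}
```

Because every allocation is the same size, fragmentation never arises, which is exactly why those early systems got by without compacting or copying collectors.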
As I recall, the indirect bit (the high bit) in a pointer tells you whether the thing pointed to is your data (to load/store) or another pointer to load. If you need to load a pointer, then again its high bit tells you whether it points to your actual load/store data, or to yet another pointer.
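For anyone who hasn't met this addressing mode, here is a toy C model of the chain described above. The 16-bit word and exact bit positions are illustrative assumptions, not the precise Nova/Eclipse encoding.

```c
/* Toy model of multi-level indirect addressing: if the high bit of
 * a word is set, the low 15 bits name another word to follow; the
 * chain ends at the first word with the high bit clear. */
#include <stdint.h>

#define IND_BIT   0x8000u   /* "this is another pointer" flag */
#define ADDR_MASK 0x7FFFu   /* 15-bit address field */

uint16_t effective_address(const uint16_t mem[], uint16_t word)
{
    while (word & IND_BIT)            /* keep chasing pointers ... */
        word = mem[word & ADDR_MASK];
    return word;                      /* ... until real data's address */
}
```

And as noted upthread, a GC forwarding pointer only ever needs one link of this chain.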
love this book
It’s a great book, even 40 years later!
Love this article! While I was only a young lad in 1978, I do remember 'Soul' being recommended many times over the years as reading for a tech nerd like myself. I even HAVE THE BOOK on my shelf at home (and have for years, I'm embarrassed to say) and really need to sit down and take it all in.
In the meantime, totally unrelated: I'm finally looking into why my work PC is so slow, and it turns out that it "only" has 8GB RAM... so doubling it is only $29.00, delivered tomorrow. I remember when 16GB was only a pipe dream and, if such a monstrous amount of memory could even have been utilized (it couldn't), it would have cost thousands of dollars.
I'm enjoying my peek into the storied past of computing while, at the same time, marveling at how much the prices for everything have come down. :) Good times.
Thanks so much. I remember reading about 1MB 68k systems and thinking what would I do with all that memory. Enjoy your 16,384 x that when it arrives!
What a great article and such a joy to read. Thank you very much for this!
That's very kind indeed, you've made my day. Thanks so much Pedro.
Nice retrospective. I reread "Soul" just a few weeks ago, worth it. I no longer remember my reaction when I first read it, about the time it came out, though I'm sure part of my interest was that I'd been programming the Nova and Eclipse recently.
Thanks Mark. Do you have any memories of programming the DG machines?
The company I worked for did workstations for paper mills, in particular where the rolls of paper needed to be wrapped for shipment and a large label printed to be pasted onto the roll. A paper mill is hot and humid, so the computer and printer were enclosed in air-conditioned cabinets. Data General Nova 3, programmed in Fortran, running on top of whatever DG's OS was called. One small system I worked on was diskless, with the program initially loaded via paper tape! Core memory, of course, so the program was retained through power-off. Others were disk-based, sized at 5 MB I think. There was also a home-grown code switcher, which let you swap different parts of the program in and out of memory to get around the limited memory. This was the late 70s. One can contemplate just how differently one would solve the problem today. This was Diamond Engineering Corp., in Redmond, Washington. Long gone, of course, and our former building is also gone, consumed by the Microsoft campus.
That's terrific, thanks for sharing. I (just) remember the era of paper tape! I guess the Nova got replaced by a PC quite soon after this. These machines were so useful in their era, but micros changed everything.
My comment from a couple of years ago on Minimalist Computing:
I've just finished "The Soul Of A New Machine" by Tracy Kidder. It's one of the few books that I've read several times over the years. My copy was lent and never returned, so I picked up a fresh copy. For a 40-year-old book on CPU and computer design, it reads like a thriller and is nearly impossible to put down.
Highly recommended.
Thanks for the article. Some great insight as always.
Thanks so much Paul. Completely agree with your assessment. It really does read like a thriller and a great thriller at that!
Yeah, that was a great book. It made me feel like I was with them during their development progress. I wonder what Jim Keller thinks of the book as he worked at DEC around 1982.
How Jim Keller got his job at DEC
" ... so i read about Digital and i literally read the computer architecture manual for the VAX 780 on the plane on the way to the interview and then i interviewed them with a whole bunch of questions because i just read this this architecture spec which i didn't know that much about to be honest but i was kind of a wise ass as a kid and um they hired me because they thought it was funny ..."
https://youtu.be/1TmuJSbms9c?si=odFSTJdnTccb2dOH&t=227
Such a great STORY! Unbelievable. 😁
Very 'on brand' for Jim too!
I began my job as a junior technical writer at Data General, a job for which I was only barely qualified, in April 1980, just as the company was preparing to announce the Eagle (MV/8000) to the world. In fact my first assignment was to help prepare manuscripts of the first manuals (including the Principles of Operations shown in your post) for publication. I didn't understand a word in those manuals, but I could tell if the pages were upside down or not.
Four years later I was writing Principles of Operations manuals (among others), and working closely, for a while, with Tom West. Decades after that I wrote a few novels set in the world of hackers and computer engineers, and Tom West was an unavoidable influence on them. In fact one of my books has a character named Tom Best.
When Tom West passed away in 2011 I wrote an essay for my 'Wetmachine' blog: "Remembering Tom West, the original geek rock star". Tom's ex-wife and one of his daughters separately came upon my essay, and they invited me to a memorial service held a few weeks later at Tom's home -- the one described in Soul of a New Machine. There I met Tracy Kidder and Edson de Castro, DG's founder & president, whom I had never met during my four years at DG.
It's been years since I last read Soul of a New Machine. Your essay has motivated me to buy a copy and read it again. I think it's time.
Interested readers can find my essay about Tom here:
https://wetmachine.com/my-thoughts-exactly/remembering-tom-west-the-original-geek-rock-star/
You've inspired me. I just bought the book!