A long time ago (which is to say, the late '90s, so not that long ago…) I did part of a degree in computer science. At the time I think it was called “systems analysis” or something like that. It was from the New Jersey Institute of Technology (NJIT), which was in the news for giving computers away to all incoming freshmen and for its pioneering online distance degree program.
It turned out that I didn’t think very much of the education I was receiving there (in retrospect this was probably due in about equal parts to my attitude and to the curriculum not actually being that great), and so I quit school to play the violin for a living. That’s what I’ve been doing ever since, and what I still ostensibly do, except that COVID-19 has given me and my colleagues an unplanned gap year.
I never lost my interest in computers, but you might say I lost my interest in keeping up with computers - both intellectually and financially. Unlike my brother, who always has a cutting-edge video card and processor, I’m chugging away happy as a clam on my 11-year-old MacBook Pro. I’m amazed, but not enticed, by modern processors and their ever-growing core counts. I’m not at peace with the current state of the industry, with its layers and layers of complex abstraction and bloated codebases. My most modern system has a Phenom II X3 with 4GB of RAM and some kind of Nvidia card that I keep around mostly for playing games. We could go to the moon with the Apollo Guidance Computer; my machine, hundreds of thousands of times more powerful, can barely run its own operating system (Windows 10).
When I was around 15 my dad was giving trumpet lessons to an adult student who happened to be a software developer. I remember telling him that what I was really interested in was how computers worked right down to the hardware. He suggested that I might be interested in writing device drivers. I laughed that off - to me back then a device driver was a little part of windows that made your modem not work as fast as it should. I wanted no part of it. Now I kind of wish I’d paid more attention.
Anyway, I enjoy collecting and reading vintage UNIX books. In his “Commentary on UNIX 6th Edition,” John Lions presents the idea of UNIX being sophisticated enough to do real work (i.e., not a toy) but sufficiently simple for one person to fully comprehend it. When AT&T cracked down on the dissemination of UNIX source code, Andy Tanenbaum took the idea that 10,000 lines of code is roughly the limit of what one person can manage and gave us MINIX.
I’ve been a big fan of this idea ever since I encountered it, and I’d really like to extend it to hardware. The idea of a kind of sandbox where every part is open to my understanding is very appealing, and since I’ve had a lot of free time on my hands, I’ve been following up on it a little more aggressively than before. I really want to study the machine. Books like OSDI focus on the developer-facing virtual machine aspect of operating systems; I want the bare-metal I/O. Plauger’s book on implementing the C library, if you trace the I/O functions through the various files, eventually says, more or less, “this part is hardware dependent. You will probably have to write it in assembly language, or copy it from a working system. Good luck.” I’d like a machine where I can pretend I’m Dennis Ritchie. Maybe I’ll never be as smart as he was, but I can get an idea of what it’s like to write a barebones assembler with a hex editor, use that to write a better assembler, use that to write a simple compiler, use that to write a library, a bootloader, I/O - a basic operating system, etc. You get the idea.
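To give a feel for that very first bootstrap step, here’s a toy sketch (in Python, purely as an illustration - the real thing would of course be raw bytes typed into a hex editor). The idea is that once you’ve hand-encoded a handful of instructions from the processor manual, that lookup table *is* your stage-zero assembler. The mnemonics and opcode words below are for the 68000 (checked against the instruction encodings: MOVEQ is 0111 rrr0 dddddddd); everything else here is my own made-up scaffolding, not anyone’s actual toolchain.

```python
# Stage-zero "assembler": a hand-built table of 68000 opcode words,
# the kind of thing you'd accumulate while poking bytes in a hex editor.
OPCODES = {
    "nop":         0x4E71,  # no operation
    "rts":         0x4E75,  # return from subroutine
    "moveq #1,d0": 0x7001,  # MOVEQ: 0111 rrr0 dddddddd (reg=0, data=1)
    "moveq #0,d1": 0x7200,  # MOVEQ: reg=1, data=0
}

def assemble(lines):
    """Turn exact-match mnemonics into big-endian machine-code bytes."""
    out = bytearray()
    for line in lines:
        word = OPCODES[line.strip()]
        out += word.to_bytes(2, "big")  # the 68k is big-endian
    return bytes(out)

# A two-instruction "program": load 1 into d0, return.
prog = assemble(["moveq #1,d0", "rts"])
print(prog.hex())  # -> 70014e75
```

Obviously a real first-stage assembler would also need labels and operand encoding, but the point is that nothing here is magic: each step of the ladder only has to be good enough to build the next rung.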
Earlier this year I installed MINIX 1 in an emulator. I have to say that I find emulating to be unsatisfying. Even though I understand the functional equivalence of hardware and software, all those layers of abstraction just seem like more barriers between me and what’s really happening.
I think there’s a pretty hard cutoff in hardware complexity for this project. I have a couple of old Compaq laptops (the same model that, until recently, McLaren used to maintain the F1, in fact: https://www.theverge.com/2016/5/3/11576032/mclaren-f1-compaq-laptop-maintenance). As modern machines go, they are not potent: Pentium processors, 32MB of RAM, and so on. But they also have an army of custom chipsets, proprietary connectors, a modified BIOS, and so forth. It seems to me that something like the PC XT 286 or an early Amiga is about the limit of what one person [*] can expect to master.
[*] I guess I should say “one hobbyist.” Linus obviously did just fine with his 386.
It just so happens that I have an A500 that I inherited from a friend when he upgraded to a 486. It presents its own set of problems. In fact, I think it perfectly typifies the barriers real old computers present to my project: connecting to a modern display, and non-volatile storage. The A500’s floppy drive writes disks in a track format that PC floppy controllers can’t read, so getting software from the internet onto the A500 involves a lot of fussing around with serial cables. The A500 also outputs a 15 kHz RGB video signal that is incompatible with modern computer displays, so getting a decent picture out of it involves a lot of intermediate voodoo. All of this infrastructure required to connect the A500 to the modern world is expensive, often unavailable, and sometimes involves invasive modification of the machine itself. Moreover, the A500 is built around several custom chips and requires a proprietary Kickstart ROM to boot. Although the A500 with its 68k processor initially seems like a good choice, its personal quirks may make it a particularly bad one. (I suppose an actual PDP-11 would be worse…)
So, the point of this long-winded introductory post is that I could use some advice from the perspective of more experienced users of vintage hardware. Out of interest (and hence the slightly humorous topic title), which retro computers are particularly bad in this regard? I have a Tandy Color Computer (not sure if it’s a 1, 2, or 3) in my folks’ basement that I salvaged from my grandma’s basement when they sold her house. Next time I’m back home for a visit I’ll retrieve it. I wonder if it will be better or worse than the A500 at talking to the modern world!
Thanks for reading all this way!