I want to understand the whole machine. Which machine?


A long time ago (which is to say, the late '90s, so not that long ago…) I did part of a degree in computer science. At the time I think it was called “systems analysis” or something like that. It was from the New Jersey Institute of Technology (NJIT), which was getting press for giving computers away to all incoming freshmen, and for its pioneering online distance degree program.

It turned out that I didn’t think very much of the education I was receiving there (in retrospect this was probably in about equal parts due to my attitude, and to the curriculum actually not being that great) and so I quit school to play the violin for a living. That’s what I’ve been doing ever since, and what I still ostensibly do, except that COVID-19 has given me and my colleagues an unplanned gap year.

I never lost my interest in computers, but you might say I lost my interest in keeping up with computers - both intellectually and financially. Unlike my brother, who always has a cutting edge video card and processor, I’m chugging away happy as a clam on my 11-year-old MacBook Pro. I’m amazed, but not enticed, by modern processors with their thousands of cores. I’m not at peace with the current state of the industry with its layers and layers of complex abstraction and bloated codebases. My most modern system has a Phenom II X3 with 4GB of RAM and some kind of nVidia card that I keep around mostly for playing games. We could go to the moon with the Apollo Guidance Computer; my machine that is hundreds of thousands of times more powerful can barely run its own operating system (Windows 10).

When I was around 15 my dad was giving trumpet lessons to an adult student who happened to be a software developer. I remember telling him that what I was really interested in was how computers worked right down to the hardware. He suggested that I might be interested in writing device drivers. I laughed that off - to me back then a device driver was a little part of Windows that made your modem not work as fast as it should. I wanted no part of it. Now I kind of wish I’d paid more attention.

Anyway, I enjoy collecting and reading vintage UNIX books. In his “Commentary on UNIX 6th Edition” John Lions presents the idea of UNIX being sophisticated enough to do real work (i.e., not a toy) but sufficiently simple for one person to fully comprehend it. When Bell Labs cracked down on the dissemination of UNIX source code, Andy Tanenbaum took the idea that 10,000 lines of code is roughly the limit of what can be managed by one person and gave us MINIX.

I’ve been a big fan of this idea ever since I encountered it. I’d really like to extend it to hardware. The idea of a kind of sandbox where every part is open to my understanding is very appealing, and since I’ve had a lot of free time on my hands, I’ve been following up on it a little more aggressively than I have done before. I really want to study the machine. Books like OSDI focus on the developer-facing virtual machine aspect of operating systems. I want the bare metal I/O. Plauger’s book about implementing the C library (The Standard C Library), if you trace the I/O functions through the various files, eventually says, more or less, “this part is hardware dependent. You will probably have to write it in assembly language, or copy it from a working system. Good luck.” I’d like a machine where I can pretend I’m Dennis Ritchie. Maybe I’ll never be as smart as him, but I can get an idea of what it’s like to write a barebones assembler with a hex editor, use that to write a better assembler, use that to write a simple compiler, use that to write a library, a bootloader, I/O - a basic operating system, and so on. You get the idea.

Earlier this year I installed MINIX 1 in an emulator. I have to say that I find emulating to be unsatisfying. Even though I understand the functional equivalence of hardware and software, all those layers of abstraction just seem like more barriers between me and what’s really happening.

I think there’s a pretty hard cutoff in hardware complexity for this project. I have a couple of old Compaq laptops (the same model that, until recently, McLaren used to maintain the F1: “McLaren needs a 20-year-old Compaq laptop to maintain its F1 supercar,” The Verge). As modern machines go, they are not potent. They have Pentium processors, 32MB of RAM, etc. But they also have an army of custom chipsets, proprietary connectors, a modified BIOS, and so on. It seems to me that something like the PC-XT 286 or an early Amiga is about the limit of what one person [*] can expect to master.

[*] I guess I should say “one hobbyist.” Linus obviously did just fine with his 386.

It just so happens that I have an A500 that I inherited from a friend when he upgraded to a 486. It presents its own set of problems. In fact, I think it perfectly typifies the barriers real old computers present to my project: connecting to a modern display, and non-volatile storage. The A500’s floppy drive uses a disk format that PC drives and controllers can’t read. This means that getting software from the internet onto the A500 involves a lot of fussing around with serial cables. The A500 also outputs a 15 kHz RGB video signal that is incompatible with modern computer displays. Getting a decent display out of it involves a lot of intermediate voodoo. All of this infrastructure required to connect the A500 to the modern world is expensive, often unavailable, and sometimes involves invasive modification of the machine itself. Moreover, the A500 has several custom chips, and requires a proprietary Kickstart ROM to boot. Although the A500 with its 68k processor initially seems like a good choice, its personal quirks may make it a particularly bad one. (I suppose an actual PDP-11 would be worse…)

So, the point of this long-winded introductory post is, I could use some advice from the perspective of more experienced users of vintage hardware. Out of interest (and, hence the slightly humorous topic title), what retro computers are particularly bad in this regard? I have a Tandy Color Computer (not sure if it’s a I, II, or III) in my folks’ basement that I salvaged from my Grandma’s basement when they sold her house. Next time I’m back home for a visit I’ll retrieve it. I wonder if it will be better or worse than the A500 at talking to the modern world!

Thanks for reading all this way!


An interesting account - thanks! (And welcome!)

I’m with you - I too like to understand the whole machine, or as much as I can. And the idea of 10,000 lines of code as a limit is a nice one - but it must be the case that each of us will have a different limit, and that limit might go up as we learn, and go down as we age.

I was lucky to start a bit earlier than the PC and Windows, so for me the whole machine was one I soldered together, with a 6502, 2K of monitor ROM, and 8K of BASIC. Those 8 bit micros were - and are - simple enough to understand pretty well. It’s not too hard to be productive in programming them, but the usual comment applies: it can take a lifetime to master. And someone somewhere will always be more proficient than you are.

Of course an 8bit machine will be simpler than an A500! But the A500 isn’t too complicated to get your head around, if you have the time and energy to study and practice. It might even be a help to get started with a simpler machine first.

I feel that the A500, and the 68k within it, are pretty nicely structured (and I had one, back in the day, and still have it.) And this helps, in understanding. There might be some machine which has about the same amount of ROM and the same number of transistors, the same number of lines of code in the OS, but which is unstructured and very difficult to learn. I don’t know of one - and I’m happy to remain ignorant!

I might guess that the 8 bit machines are about as complex as they can be and still be OK to program without a high level language. One attraction of the 68k and the Amiga, and probably most machines of that generation, is that you have high level languages and they are moderately comfortable to use. And so, programmers can be more productive, write bigger and more impressive programs, and then those programs take more effort on the part of the user to understand. (Naturally, I’m fully aware that one can program a Z80 in C, and program a 68k in assembly language, and people did both!)

Here is, I think, the trouble with the late 80s/early 90s era of post-8-bit machines: they tend to have complex ROMs and custom silicon. This is certainly true of the Amiga, the Atari line, the Macintosh, etc. It is also true of the Apple IIgs and the IBM PC, AT, and clones, though maybe less so, given their simpler ROMs.

The earlier 8-bitters often had (as you also mention) painful I/O facilities by modern standards. The Apple II line is not terrible in this regard, as it has a usable high speed serial port and simple composite video, and reliable floppy drives are easy to come by. Other machines vary, but all of them (including the Apple II) have their compromises. As Quinn Dunki said, “[a 1980s computer] was a crazy mad scientist method of generating video based on the fever dream of a hardware engineer who had spent a little too much time watching The Last Starfighter, and maybe not quite enough time reading reference circuits for the parts in her drawers. Every 1980s computer was an experiment in a new way to generate graphics, and everything else (CPU, RAM, etc) was an accessory to that effort.” Understanding the entire system is often rather straightforward once you understand the arcane rules of video generation, but building an operating system for such a machine can be quite difficult. There are exceptions (such as the many 8080/Z80-based CP/M machines that delegated their console I/O to a serial terminal), but certainly this view of the Apple II, C64, Atari 8-bitters, etc. is not without merit.

Honestly, if you want that “retro computing I control the entire machine and it’s all there to serve my needs” experience with much less need for arcane trivia, and you’re willing to give up on the technically retro hardware end of things, a modern Cortex-M development board (which can be purchased from any of the major ARM embedded houses, such as ST, TI, and friends, or via third party boards such as the Blue Pill, Feather, etc.) can give you a LOT of that flavor with a much more satisfying environment. You have RAM and ROM capabilities in the ballpark of late 80s/early 90s machines (say, a meg of Flash and a quarter to half meg of RAM?), a sane but very retro-credentialed architecture in ARM, and easy serial port console access. I have ported several “know the whole machine” operating systems (such as Xinu) to small ARM boards, and have my own from-the-ground-up system in the works, and have greatly enjoyed the process.

I think the first thing to decide is, are you going for the flavor of such an experiment, or the particulars of the old hardware? If the flavor is your concern, check out a modern microcontroller dev board. If you want to wrestle with a TI VDP or bit-banged composite video, get the real thing.

(I do both!)


Let me actually say that I think both the 386 and the PDP-11 are very reasonable targets for a from-the-ground-up implementation.

The 386 has some unfortunate pain relating to the segmented addressing modes used by BIOS, and the interface complexity if you want to use a more straightforward addressing mode and still access BIOS resources, but it is manageable (particularly if you simply decide to use, e.g., a serial console!).

The PDP-11 is a beautiful and straightforward architecture that makes it very easy to write standalone bare metal applications with sophisticated I/O capabilities. The trouble with the PDP-11 is that they are very large and the available storage options are difficult to manage in 2020.


Speaking of storage, something which has revolutionised retrocomputing is solid state storage. Much more reliable and usually much faster than tape or disc, and also making it easy to import files from the outside world. I recommend getting a suitable gadget, if you can afford to. Unless you are taking a very determinedly retro approach, in which case fair play.


I saw it in a different light here in Canada in the mid-1980s. You had a computer (an Apple II or an S-100 bus machine), a games machine (a C-64), or you wanted to emulate an IBM PC. Sadly, the PC turned into a game console for the most part. Almost any computer back then was easy to understand, because you could get the documentation on everything. Only after Windows did hardware and software knowledge become something you had to pay for.
As for the UK computers, it was a case of seen on TV.

The 386 got rid of 64K segments. The PDP-11 still has them. Why everyone thought they were a good idea I will never know. You could have a huge (for the time) address space, but you could not DIMENSION A(100,100) COMPLEX in Fortran. If you can find the chips, you can get a PCB to put a PDP-11 on the S-100 bus.
The other way to understand a whole machine is to build your own.

Indeed, and Grant Searle has long provided some designs which form a great basis, for various 8 bit micros. That is, if PCBs and soldering are your thing. I think it’s quite important to consider what it is you like to do - we are all different. For some, programming on a machine they’ve soldered is worth the trouble, for others, developing on a modern machine and then deploying to run on an emulator is what they want to do. And both are absolutely fine.


Wow, lots of replies already! :smiley:

That is a great quote, and also a cool article. I hadn’t seen “Veronica” before, so thanks for pointing it out.

This is something I’m thinking about. When I was growing up this stuff was all stuff my friends had but that I didn’t, either because I couldn’t afford it ($8 a month allowance didn’t go far, even in 1988, and I had Tolkien books to buy :wink: ) or because I wasn’t allowed to. So there is an element of “that old equipment is cool to me, and now that I can have it, I will!” Having the A500 hooked up to a 42 inch flat screen TV (the only thing in my house with a composite video input) is pretty neat, and not something I ever imagined when I was 12 years old!

That pretty well scratches that itch though; from a utilitarian perspective what I’m mainly interested in is “shrinking the practice space.” Simplifying the target makes the project manageable; if the A500 makes the project more difficult, I’m willing to let go of it and just use it as a nostalgic gaming platform. The truth is, I think I’d be perfectly happy with something like David Murray’s Commander X16. It has just one serious disqualification: it doesn’t exist yet!

The barrier to entry with the ARM dev boards is that there are so many of them it’s hard to know where to start. The nice thing is that they’re not very expensive so if I got one it won’t mean limiting the project to just that one vector.

Interesting that you mention XINU: earlier this year (in April, I think) I finished my second go-round of OSDI, and got the 2nd edition of it from a used bookseller. I rapidly came to the conclusion that, while POSIX is no doubt a worthy initiative, achieving POSIX compliance is not on my personal to-do list. One reviewer of OSDI suggested the XINU book as an alternate, and better, approach to studying operating systems, so I picked up a copy. It’s pretty neat! A possible long term idea for my project was to port XINU to the A500.

Building kit computers, just like Woz in his garage! I do not have soldering skills, or tools, unfortunately. This is something I think I would really enjoy, but I also think I would need a lot of practice before I was ready to try something like the PE6502. The kits themselves seem pretty expensive, and I’d have to assemble an electronics toolbox, so I’m going to keep that one on the back burner until work (hopefully!) resumes in January.

I have, however, been working my way through nand2tetris, which is a lot of fun. I’ve just about finished the hardware chapters.


I’m going to suggest a few of the machines that I find the most interesting and the best documented.

The first is the ZX Spectrum. There are books from Melbourne House that contain the entire, commented source code of the machine’s ROM. There is also the excellent book “The ZX Spectrum ULA: How to Design a Microcomputer”, which covers the most custom piece of the ZX Spectrum.

The second machine is the Atari ST, simply because Hatari (an emulator) and EmuTOS (a clean-room implementation of the OS) are both released under the GPL license. You can look at the source code for both and have a clear understanding of what’s going on in there.

Good luck!


A machine that may be rather easy to understand and which is also fun is the “Model T”, the Tandy TRS-80 Model 100, as well as its sisters, the NEC PC-8201A, the Olivetti M10 and the Kyotronic 85 (all 1983). Being portable machines, they are fun and self-contained, the 8-bit Intel 80C85 CPU is much the same as the Z80 (actually, it’s rather the other way round), and you can connect to them simply by a serial cable. There’s an emulator for all major platforms, VirtualT, which also comes with ROM listings for all those machines. There have been broad communities about them, especially for the Model 100, with posts still archived. Moreover, especially the Tandy machine is well documented and there are all the service manuals available, including descriptions of the hardware. Last, but not least, the machines all feature a very good Alps keyboard with mechanical switches and are really nice to work with. Battery backed-up storage memory is also of help for quick editing sessions.


If you want to go the development board route, given that you enjoy vintage UNIX kind of things, I’d recommend getting one of the boards that can run RetroBSD. RetroBSD is a port of BSD UNIX 2.11 to MIPS. And 2.11 UNIX is reasonably capable but not all that far removed from Unix v6 complexity-wise, at least in the kernel. No MMU-based memory management, for example. All the source code is available on GitHub.

In particular, I’d recommend the DuinoMite Mega from Olimex as I’ve used it to run RetroBSD. It’s about 30 euro / 35 USD. (I have no affiliation with Olimex.)



I can see the attraction of knowing everything there is to know about a real computer that people actually used. But there is also value in learning an educational computer designed specifically for that purpose, like the really great NAND2Tetris you mentioned. That is a simulated computer running on your actual computer.

A similar project which can run on an FPGA board is Project Oberon. It is a compiler, an operating system and some applications which all run (in this version of the book) on its own RISC processor and small computer. It is very different from Unix, so probably not what you are looking for.


Hi Paganini,

I too was given an unexpected gap year, and also have a huge hole in my computer science knowledge.

My approach was to do a broad study of historical machines, starting with EDSAC (1949) and looking at how each new generation of processor provided more resources and more computational speed.

Study the older machines like the PDP-8 and PDP-11, but don’t get too bogged down in the specifics of more modern devices like the 6502, Z80, and 80x86.

The biggest lesson I learned is that almost any cpu can be directly emulated in software running on a different machine - albeit somewhat slowly.

I started 7 years ago with the popular Nand to Tetris course. This showed me how a simple 16-bit cpu could be made with fewer than 1,600 two-input NAND gates. It also illustrated the successive layers of software required to turn a cpu into a modern computer with a high-level language and operating system.

I then invested in a Gigatron TTL computer kit. Fewer than 36 simple TTL ICs capable of emulating a 6502 cpu and running Microsoft BASIC - at about 1/8th of the speed of a 1MHz 6502.

After that I decided that I was ready to design my own 16-bit cpu. I chose a 3 pronged approach to maximise the educational benefit:

  1. Define the instruction set and architecture and simulate it in C code. A simple cpu with about 30 instructions can be simulated in about 60 lines of C code.

  2. Using the C simulation as a guide - port the design to verilog hardware description language so that it can run as a soft core on a low cost FPGA board.

  3. Purely for nostalgia - implement the cpu as a TTL computer using readily available, low cost ICs.

Step 1, I achieved with a low cost $20 Teensy 4.0 target board - which can simulate my cpu at about 20 million instructions per second.

My main advice is don’t be too ambitious - even a very simple cpu with 8 fundamental instructions (Gigatron) will teach you all you need to know.


I must agree that most early-1980s home machines were a desperate “mad scientist” attempt to create a video signal using about $20 of hardware.

The upshot of this is that a considerable percentage of cpu cycles were spent servicing the video display.

If you unshackle the cpu from this burden, and offload this overhead to specialist video hardware (such as the Gameduino Dazzler), you will be surprised at what you can achieve with a 2MHz 6502 or a 4MHz Z80A.

@monsonite - I just ran across the Gigatron today. What a cool machine! Looks like I’m on the right track with nand2tetris; I’m just about to embark on chapter 5, where you put all the pieces together into a working CPU.

Along the idea of offloading the video overhead, I suppose I have a bad habit (probably a modern disease) of assuming video output and keyboard input are required. But when I think about it, my project is not video games, but operating system stuff, like low level I/O and memory management. I can just as easily work on that with a terminal connection and it really will be like the old days at Bell Labs! :smiley:

I was thinking about getting one of these: W65C816SX SBCs. No soldering required, thoroughly documented architecture, and it seems to hit a kind of sweet spot between retro and modern. It seems like they’ve had a price drop, so they’re also significantly less expensive than some of the other SBC kits.

@codesmythe - That Duinomite board is SLICK. That is exactly the kind of thing I’ve been poking around for, and all my googling had not turned it up.


There are various threads on the 6502.org forums about the various WDC offerings: you can pick up some advice there before buying.

I’m also inclined towards textual computing, and it is quite liberating, not worrying at all about pixels or sprites or sounds.



I’m a bit of a minimalist. I wanted to understand the cpu right down to its lowest level, and the Nand to Tetris course certainly gave me an insight into how little was actually needed to have a functioning cpu.

The less sophistication that you have in hardware, the more complexity that you need in software.

For example, the PDP-8 minicomputer of 1965 could offer similar performance to a 6502 of 1976, and did it using about 50% fewer transistors. However, the PDP-8 cost $18,000 in 1965 and was the size of a small fridge, compared to the $25 6502, which fitted into a 40-pin DIP package.

The 65C816 is a direct descendant of the 6502 and a popular choice amongst retrocomputing fans, as it will execute 6502 code at much greater speed. With a 16-megabyte addressing range, 16-bit-wide registers, and a wide supply-voltage range, it’s easy to see why.

You might want a look at “Drogon’s” Ruby 65C816 board


You might also consider using it to create a virtual machine. One good example is Steve Wozniak’s “Sweet-16” which was written to make the 8-bit 6502 more useful for some 16-bit operations.

Learning how one cpu can appear to look like a completely different machine is an important technique. Woz’s Sweet-16 explains this approach in a very understandable manner.


It should be remembered that the behaviour of any historical cpu can always be emulated on a modern PC. It’s kind of a “try before you buy.”

If you want the “hardcore” experience of “talking” to a cpu over a serial terminal, you can easily replicate this using an Arduino or $20 Teensy, that is emulating a vintage CPU.

I found a PDP-8 emulator that runs on Arduino - and had a great time running the Lunar Lander game - which is entirely text based.

I have a soft spot for the PDP-8 - it’s amazing what you can do with 1,500 transistors (and 10,000 diodes) and a tiny but well-thought-out instruction set.

Although it lacked subtraction, OR and XOR, these could be synthesised from the available instructions using short macros. It lacked any real registers, apart from the accumulator, but this deficiency was addressed by clever use of zero page memory, some of which could be auto-incremented.

The PDP-8 is a close cousin of, and a definite influence on, the “Hack” computer that forms the basis of the Nand to Tetris course.

Even though the teletypes and punched paper tape of that era were a bit clunky and noisy, there is very little a PDP-8 could not do compared to the early microprocessor-based home computers. The only trick they brought to the party was cheap video-generation hardware.

Moving on 10 years, the 6502 was absolutely groundbreaking in its day - and led to the advent of low cost microprocessors - it was about 1/10th of the cost of its competitors.

Chuck Peddle intended it to be built into industrial controllers, traffic signals, petrol pumps and cash registers. He never expected that it would trailblaze the home microcomputer revolution.

Its legacy is that it is still available today, in a wide variety of forms, some 44 years after its introduction.

For $20 you can buy a 600MHz ARM Cortex-M7 on a breadboard-friendly PCB that can emulate any of the historical machines, talk over a serial connection to a serial terminal, and generate VGA graphics in its spare time.

As far as retro-computing goes, we live in a wonderful world!


I know you said you don’t like emulation, but take a hard look at From Nand to Tetris: https://www.nand2tetris.org
