I want to understand the whole machine. Which machine?

An interesting account - thanks! (And welcome!)

I’m with you - I too like to understand the whole machine, or as much as I can. And the idea of 10,000 lines of code as a limit is a nice one - but it must be the case that each of us will have a different limit, and that limit might go up as we learn, and go down as we age.

I was lucky to start a bit earlier than the PC and Windows, so for me the whole machine was one I soldered together, with a 6502, 2k of monitor, and 8k of Basic. Those 8 bit micros were - and are - simple enough to understand pretty well. It’s not too hard to be productive in programming them, but the usual comment applies: it can take a lifetime to master. And someone somewhere will always be more proficient than you are.

Of course an 8-bit machine will be simpler than an A500! But the A500 isn’t too complicated to get your head around, if you have the time and energy to study and practice. It might even help to get started with a simpler machine first.

I feel that the A500, and the 68k within it, are pretty nicely structured (I had one back in the day, and still have it), and this helps in understanding. There might be some machine with about the same amount of ROM, the same number of transistors, and the same number of lines of code in the OS, but which is unstructured and very difficult to learn. I don’t know of one - and I’m happy to remain ignorant!

I might guess that the 8 bit machines are about as complex as they can be and still be OK to program without a high level language. One attraction of the 68k and the Amiga, and probably most machines of that generation, is that you have high level languages and they are moderately comfortable to use. And so, programmers can be more productive, write bigger and more impressive programs, and then those programs take more effort on the part of the user to understand. (Naturally, I’m fully aware that one can program a Z80 in C, and program a 68k in assembly language, and people did both!)

Here is, I think, the trouble with the late 80s/early 90s era of post-8-bit machines: they tend to have complex ROMs and custom silicon. This is certainly true of the Amiga, the Atari line, the Macintosh, etc. It is true of the Apple IIgs and the IBM PC, AT, and clones as well, though maybe less so, thanks to their reduced ROM complexity.

The earlier 8-bitters often had (as you also mention) painful I/O facilities by modern standards. The Apple II line is not terrible in this regard, as it has a usable high speed serial port and simple composite video, and reliable floppy drives are easy to come by. Other machines vary, but all of them (including the Apple II) have their compromises. As Quinn Dunki said, “[a 1980s computer] was a crazy mad scientist method of generating video based on the fever dream of a hardware engineer who had spent a little too much time watching The Last Starfighter, and maybe not quite enough time reading reference circuits for the parts in her drawers. Every 1980s computer was an experiment in a new way to generate graphics, and everything else (CPU, RAM, etc) was an accessory to that effort.” Understanding the entire system is often rather straightforward once you understand the arcane rules of video generation, but building an operating system for such a machine can be quite difficult. There are exceptions (such as the many 8080/Z80-based CP/M machines that delegated their console I/O to a serial terminal), but certainly this view of the Apple II, C64, Atari 8-bitters, etc. is not without merit.

Honestly, if you want that “retro computing I control the entire machine and it’s all there to serve my needs” experience with much less need for arcane trivia, and you’re willing to give up on the technically retro hardware end of things, a modern Cortex-M development board (which can be purchased from any of the major ARM embedded houses, such as ST, TI, and friends, or via third party boards such as the Blue Pill, Feather, etc.) can give you a LOT of that flavor with a much more satisfying environment. You have RAM and ROM capabilities in the ballpark of late 80s/early 90s machines (say, a meg of Flash and a quarter to half meg of RAM?), a sane but very retro-credentialed architecture in ARM, and easy serial port console access. I have ported several “know the whole machine” operating systems (such as Xinu) to small ARM boards, and have my own from-the-ground-up system in the works, and have greatly enjoyed the process.

I think the first thing to decide is: are you going for the flavor of such an experiment, or the particulars of the old hardware? If the flavor is your concern, check out a modern microcontroller dev board. If you want to wrestle with a TI VDP or bit-banged composite video, get the real thing.

(I do both!)

1 Like

Let me actually say that I think both the 386 and the PDP-11 are very reasonable targets for a from-the-ground-up implementation.

The 386 has some unfortunate pain relating to the segmented addressing modes used by the BIOS, and there is some interface complexity if you want to use a more straightforward addressing mode and still access BIOS resources, but it is manageable (particularly if you simply decide to use, e.g., a serial console!).

The PDP-11 is a beautiful and straightforward architecture that makes it very easy to write standalone bare metal applications with sophisticated I/O capabilities. The trouble with the PDP-11 is that they are very large and the available storage options are difficult to manage in 2020.

1 Like

Speaking of storage, something which has revolutionised retrocomputing is solid state storage. Much more reliable and usually much faster than tape or disc, and also making it easy to import files from the outside world. I recommend getting a suitable gadget, if you can afford to. Unless you are taking a very determinedly retro approach, in which case fair play.

2 Likes

I saw it in a different light here in Canada in the mid 1980s: you had a computer (an Apple II or an S-100 bus machine), a games machine (a C-64), or something that wanted to emulate an IBM PC. Sadly, the PC turned into a game console for the most part. Almost any computer then was easy to understand, as you could get the documentation on everything. Only after Windows did hardware and software knowledge become something you got only if you paid for it.
As for the UK computers, it was a case of “seen on TV”.
Ben.

The 386 got rid of 64K segments; the PDP-11 still has them. Why everyone thought they were a good idea I’ll never know. You could have a huge (for the time) address space, but you could not DIM A(100,100) COMPLEX in Fortran. If you can find the chips, you can get a PCB to put a PDP-11 on the S100 bus.

The other way to understand a whole machine is to build your own.
Ben.

Indeed, and Grant Searle has long provided some designs which form a great basis for various 8-bit micros - that is, if PCBs and soldering are your thing. I think it’s quite important to consider what it is you like to do - we are all different. For some, programming on a machine they’ve soldered is worth the trouble; for others, developing on a modern machine and then deploying to an emulator is what they want to do. And both are absolutely fine.

1 Like

Wow, lots of replies already! :smiley:

That is a great quote, and also a cool article. I hadn’t seen “Veronica” before, so thanks for pointing it out.

This is something I’m thinking about. When I was growing up, this was all stuff my friends had but that I didn’t, either because I couldn’t afford it (an $8 a month allowance didn’t go far, even in 1988, and I had Tolkien books to buy :wink: ) or because I wasn’t allowed to. So there is an element of “that old equipment is cool to me, and now that I can have it, I will!” Having the A500 hooked up to a 42 inch flat screen TV (the only thing in my house with a composite video input) is pretty neat, and not something I ever imagined when I was 12 years old!

That pretty well scratches that itch though; from a utilitarian perspective what I’m mainly interested in is “shrinking the practice space.” Simplifying the target makes the project manageable; if the A500 makes the project more difficult, I’m willing to let go of it and just use it as a nostalgic gaming platform. The truth is, I think I’d be perfectly happy with something like David Murray’s Commander X16. It has just one serious disqualification: it doesn’t exist yet!

The barrier to entry with the ARM dev boards is that there are so many of them it’s hard to know where to start. The nice thing is that they’re not very expensive, so if I get one it won’t mean limiting the project to just that one vector.

Interesting that you mention XINU: earlier this year (in April, I think) I finished my second go-round of OSDI, and got the 2nd edition of it from a used bookseller. I rapidly came to the conclusion that, while POSIX is no doubt a worthy initiative, achieving POSIX compliance is not on my personal to-do list. One reviewer of OSDI suggested the XINU book as an alternate, and better, approach to studying operating systems, so I picked up a copy. It’s pretty neat! A possible long-term idea for my project was to port XINU to the A500.

Building kit computers, just like Woz in his garage! I do not have soldering skills, or tools, unfortunately. This is something I think I would really enjoy, but I also think I would need a lot of practice before I was ready to try something like the PE6502. The kits themselves seem pretty expensive, and I’d have to assemble an electronics toolbox, so I’m going to keep that one on the back burner until work (hopefully!) resumes in January.

I have, however, been working my way through nand2tetris, which is a lot of fun. I’ve just about finished the hardware chapters.

3 Likes

I’m going to suggest a couple of machines that I’m finding the most interesting and the best documented.

The first is the ZX Spectrum. There are books from Melbourne House that contain the entire source code (with comments) for the machine. There is also the excellent “The ZX Spectrum ULA: How to Design a Microcomputer”, a book that covers the most custom piece of the ZX Spectrum.

The second machine is the Atari ST, simply because Hatari (an emulator) and EmuTOS (a clean-room implementation of the OS) are both released under the GPL license. You can look at the source code for both and have a clear understanding of what’s going on in there.

Good luck!

2 Likes

A machine that may be rather easy to understand, and which is also fun, is the “Model T”: the Tandy TRS-80 Model 100, as well as its sisters, the NEC PC-8201A, the Olivetti M10 and the Kyotronic 85 (all 1983). Being portable machines, they are fun and self-contained; the 8-bit Intel 80C85 CPU is much the same as the Z80 (actually, it’s rather the other way round), and you can connect to them simply by a serial cable. There’s an emulator for all major platforms, VirtualT, which also comes with ROM listings for all those machines. There have been broad communities around them, especially the Model 100, with posts still archived. Moreover, the Tandy machine especially is well documented, and all the service manuals are available, including descriptions of the hardware. Last, but not least, the machines all feature a very good Alps keyboard with mechanical switches and are really nice to work with. Battery-backed storage memory is also a help for quick editing sessions.

1 Like

If you want to go the development board route, given that you enjoy vintage UNIX kinds of things, I’d recommend getting one of the boards that can run RetroBSD. RetroBSD is a port of 2.11BSD UNIX to MIPS. And 2.11BSD is reasonably capable but not all that far removed from Unix v6 complexity-wise, at least in the kernel - no MMU-based memory management, for example. All the source code is available on GitHub.

In particular, I’d recommend the DuinoMite Mega from Olimex, as I’ve used it to run RetroBSD. It’s about 30 euro / 35 USD. (I have no affiliation with Olimex.)

Rob

4 Likes

I can see the attraction of knowing everything there is to know about a real computer that people actually used. But there is also value in learning an educational computer designed specifically for that purpose, like the really great NAND2Tetris you mentioned. That is a simulated computer running on your actual computer.

A similar project which can run on an FPGA board is Project Oberon. It is a compiler, an operating system and some applications which all run (in this version of the book) on its own RISC processor and small computer. It is very different from Unix, so probably not what you are looking for.

1 Like

Hi Paganini,

I too was given an unexpected gap year, and also have a huge hole in my computer science knowledge.

My approach was to do a broad study of historical machines, starting with EDSAC (1949) and looking at how each new generation of processor provided more resources and more computational speed.

Study the older machines like the PDP-8 and PDP-11, but don’t get too bogged down in the specifics of more modern devices like the 6502, Z80, and 80x86.

The biggest lesson I learned is that almost any cpu can be directly emulated in software running on a different machine - albeit somewhat slowly.

I started 7 years ago with the popular Nand to Tetris course. This showed me how a simple 16-bit cpu could be made with fewer than 1,600 two-input NAND gates. It also illustrated the successive layers of software required to turn a cpu into a modern computer with a high-level language and operating system.

I then invested in a Gigatron TTL computer kit: fewer than 36 simple TTL ICs, capable of emulating a 6502 cpu and running Microsoft BASIC - at about 1/8th of the speed of a 1MHz 6502.

After that I decided that I was ready to design my own 16-bit cpu. I chose a 3 pronged approach to maximise the educational benefit:

  1. Define the instruction set and architecture and simulate it in C code. A simple cpu with about 30 instructions can be simulated in about 60 lines of C code.

  2. Using the C simulation as a guide - port the design to verilog hardware description language so that it can run as a soft core on a low cost FPGA board.

  3. Purely for nostalgia - implement the cpu as a TTL computer using readily available, low cost ICs.

Step 1, I achieved with a low cost $20 Teensy 4.0 target board - which can simulate my cpu at about 20 million instructions per second.
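
To give a flavour of step 1, here’s a minimal sketch of such a fetch/decode/execute loop in C. The four-instruction accumulator machine below is invented purely for illustration - it is not my actual 16-bit ISA, which has more instructions and a proper memory model.

```c
/* Minimal sketch of "simulate the CPU in C". The four-instruction
   accumulator machine below is invented for illustration; it is not
   the actual 16-bit ISA described above. */
#include <stdint.h>
#include <stdio.h>

enum { LDI, ADD, JNZ, HLT };            /* opcodes */

int main(void) {
    /* each word: high byte = opcode, low byte = operand */
    uint16_t mem[] = {
        (LDI << 8) | 5,                 /* acc = 5                  */
        (ADD << 8) | 0xFF,              /* acc += 255 (i.e. -1)     */
        (JNZ << 8) | 1,                 /* loop back while acc != 0 */
        (HLT << 8) | 0,
    };
    uint16_t pc = 0;
    uint8_t acc = 0;

    for (;;) {                          /* fetch-decode-execute loop */
        uint16_t ins = mem[pc++];
        uint8_t op = ins >> 8, arg = ins & 0xFF;
        switch (op) {
        case LDI: acc = arg;            break;
        case ADD: acc += arg;           break;
        case JNZ: if (acc) pc = arg;    break;
        case HLT: printf("halted, acc=%u\n", acc); return 0;
        }
    }
}
```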

My main advice is: don’t be too ambitious - even a very simple cpu with 8 fundamental instructions (the Gigatron) will teach you all you need to know.

1 Like

I must agree that most early 1980s home machines were a desperate “mad scientist” attempt to create a video signal using about $20 of hardware.

The upshot of this is that a considerable percentage of cpu cycles were spent servicing the video display.

If you unshackle the cpu from this burden, and offload the overhead to specialist video hardware (such as the Gameduino Dazzler), you will be surprised at what you can achieve with a 2MHz 6502 or a 4MHz Z80A.

@monsonite - I just ran across the Gigatron today. What a cool machine! Looks like I’m on the right track with nand2tetris; I’m just about to embark on chapter 5, where you put all the pieces together into a working CPU.

Along the lines of offloading the video overhead: I suppose I have a bad habit (probably a modern disease) of assuming video output and keyboard input are required. But when I think about it, my project is not video games, but operating system stuff, like low-level I/O and memory management. I can just as easily work on that with a terminal connection, and it really will be like the old days at Bell Labs! :smiley:

I was thinking about getting one of these: W65C816SX SBCs. No soldering required, a thoroughly documented architecture, and it seems to hit a kind of sweet spot between retro and modern. It seems like they’ve had a price drop, so that they’re also significantly less expensive than some of the other SBC kits.

@codesmythe - That DuinoMite board is SLICK. That is exactly the kind of thing I’ve been poking around for, and all my googling had not turned it up.

2 Likes

There are various threads on the 6502.org forums about the various WDC offerings: you can pick up some advice there before buying.

I’m also inclined towards textual computing, and it is quite liberating, not worrying at all about pixels or sprites or sounds.

2 Likes

@Paganini,

I’m a bit of a minimalist. I wanted to understand the cpu right down to its lowest level, and the Nand to Tetris course certainly gave me an insight into how little was actually needed to have a functioning cpu.

The less sophistication you have in hardware, the more complexity you need in software.

For example, the PDP-8 minicomputer of 1965 could offer similar performance to a 6502 of 1976, and did it using about 50% fewer transistors. However, the PDP-8 cost $18,000 in 1965 and was the size of a small fridge, compared to the $25 6502, which fitted into a 40-pin DIP package.

The 65C816 is a direct descendant of the 6502 and a popular choice amongst retrocomputing fans, as it will execute 6502 code at much greater speed. It also offers a 16 Mbyte addressing range, 16-bit wide registers, and a wide supply voltage range.

You might want to look at “Drogon’s” Ruby 65C816 board:

https://projects.drogon.net/ruby816-build-and-initial-booting/

You might also consider using it to create a virtual machine. One good example is Steve Wozniak’s “Sweet-16” which was written to make the 8-bit 6502 more useful for some 16-bit operations.

Learning how one cpu can appear to look like a completely different machine is an important technique. Woz’s Sweet-16 explains this approach in a very understandable manner.
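
To make the idea concrete, here’s a minimal sketch in C of the kind of dispatch loop such a virtual machine uses: the host interprets bytecodes that operate on 16-bit “registers” held in memory. The opcodes below are invented for illustration - this is not Woz’s actual Sweet-16 encoding.

```c
/* Sketch of the virtual-machine idea behind Sweet-16: an 8-bit host
   interpreting bytecodes that operate on 16-bit "registers" held in
   memory. Opcodes are invented; not Woz's actual Sweet-16 encoding. */
#include <stdint.h>
#include <stdio.h>

static uint16_t reg[16];                /* the virtual 16-bit registers */

enum { SET = 0x10, ADDR = 0x20, DONE = 0x00 };  /* high nibble = op, low = reg */

static void run(const uint8_t *pc) {
    for (;;) {
        uint8_t op = *pc++;
        uint8_t n = op & 0x0F;
        switch (op & 0xF0) {
        case SET:                       /* SET Rn, imm16 (little-endian) */
            reg[n] = pc[0] | (pc[1] << 8);
            pc += 2;
            break;
        case ADDR:                      /* R0 += Rn */
            reg[0] += reg[n];
            break;
        default:                        /* DONE: drop out of the VM */
            return;
        }
    }
}

int main(void) {
    const uint8_t prog[] = {
        SET | 0, 0x34, 0x12,            /* R0 = 0x1234 */
        SET | 1, 0x01, 0x00,            /* R1 = 0x0001 */
        ADDR | 1,                       /* R0 += R1    */
        DONE,
    };
    run(prog);
    printf("R0 = 0x%04X\n", (unsigned)reg[0]);   /* prints R0 = 0x1235 */
    return 0;
}
```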

3 Likes

It should be remembered that the behaviour of any historical cpu can always be emulated on a modern PC. It’s kind of a “try before you buy”.

If you want the “hardcore” experience of “talking” to a cpu over a serial terminal, you can easily replicate this using an Arduino or a $20 Teensy emulating a vintage CPU.

I found a PDP-8 emulator that runs on Arduino - and had a great time running the Lunar Lander game - which is entirely text based.

I have a soft spot for the PDP-8 - it’s amazing what you can do with 1,500 transistors (and 10,000 diodes) and a tiny but well-thought-out instruction set.

Although it lacked subtraction, OR and XOR, these could be synthesised from the available instructions using short macros. It lacked any real registers, apart from the accumulator, but this deficiency was addressed by clever use of zero page memory, some of which could be auto-incremented.
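
As an example of one of those short macros, here is the classic subtract sequence - complement, increment, then add - mirrored in C, with masking standing in for the 12-bit accumulator:

```c
/* How subtraction is synthesised on a PDP-8, which has TAD (two's
   complement add), CMA (complement accumulator) and IAC (increment
   accumulator) but no SUB. This mirrors the classic macro
   CLA / TAD B / CMA / IAC / TAD A. */
#include <stdint.h>
#include <stdio.h>

static uint16_t sub12(uint16_t a, uint16_t b) {
    uint16_t acc = 0;                   /* CLA: clear accumulator  */
    acc = (acc + b) & 07777;            /* TAD B                   */
    acc = ~acc & 07777;                 /* CMA: one's complement   */
    acc = (acc + 1) & 07777;            /* IAC: acc is now -B      */
    acc = (acc + a) & 07777;            /* TAD A: acc = A - B      */
    return acc;
}

int main(void) {
    printf("%04o\n", (unsigned)sub12(0100, 0020));   /* 64 - 16 = 48, prints 0060 */
    return 0;
}
```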

The PDP-8 is a close cousin of, and definitely an influence on, the “Hack” computer that forms the basis of the Nand to Tetris course.

Even though that era’s teletypes and punched paper tape were a bit clunky and noisy, there is very little a PDP-8 could not do compared to the early microprocessor-based home computers. The only trick they brought to the party was cheap video generation hardware.

Moving on ten years, the 6502 was absolutely groundbreaking in its day, and it led to the advent of low-cost microprocessors - it was about 1/10th of the cost of its competitors.

Chuck Peddle intended it to be built into industrial controllers, traffic signals, petrol pumps and cash registers. He never expected that it would trailblaze the home microcomputer revolution.

Its legacy is that it is still available today, in a wide variety of forms, some 44 years after its introduction.

For $20 you can buy a 600MHz ARM Cortex-M7 on a breadboard-friendly PCB that can emulate any of the historical machines, talk over a serial connection to a terminal, and generate VGA graphics in its spare time.

As far as retro-computing goes, we live in a wonderful world!

2 Likes

I know you said you don’t like emulation, but take a hard look at From Nand to Tetris: https://www.nand2tetris.org

1 Like

Indeed. And I can think of two good reasons for going this route:

First, adding a video display to a machine, as opposed to using a UART chip (or even just bit-banged serial I/O), adds considerable complexity. Learning about how video signals work is great fun and not really much more complex than learning how to put together a CPU and RAM and so on, but it’s a rather different area that requires its own set of knowledge and expertise.
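
For comparison, here is roughly all there is to bit-banged serial transmit, sketched in C. The functions set_tx_pin() and delay_one_bit() are hypothetical stand-ins for whatever GPIO and timing primitives the target hardware provides:

```c
/* Bit-banged serial transmit with 8-N-1 framing: one start bit, eight
   data bits LSB-first, one stop bit, each held for one bit time.
   set_tx_pin() and delay_one_bit() are hypothetical stand-ins for the
   target's GPIO and timing primitives. */
#include <stdint.h>

void set_tx_pin(int level);             /* hypothetical: drive the TX line    */
void delay_one_bit(void);               /* hypothetical: ~104 us at 9600 baud */

void uart_tx_byte(uint8_t byte) {
    set_tx_pin(0);                      /* start bit: line goes low  */
    delay_one_bit();
    for (int i = 0; i < 8; i++) {       /* data bits, LSB first      */
        set_tx_pin((byte >> i) & 1);
        delay_one_bit();
    }
    set_tx_pin(1);                      /* stop bit: line idles high */
    delay_one_bit();
}
```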

Second, this is actually the more “retro” way of using microcomputers. The rise of microcomputers with video systems around 1976-77 was most directly due to economic factors: video terminals generally cost well over $1000, easily doubling the price of a simple computer system. (Even used printing terminals started at $400-$600.) Homebrew “TV typewriters,” using a standard television as a display, were the first attempt to mitigate this expense; the Apple 1 was in fact a TV typewriter circuit that Woz had previously built glued on to a processor and some memory and I/O. This eventually led to integrated video circuitry being a standard feature (even on systems that provided a monitor, such as the PET or TRS-80 Model I; the latter actually used a re-badged black and white TV) because of its advantages in cost; that falling RAM prices made them easily capable of more sophisticated memory-mapped displays was basically just an added bonus.


I’d recommend building, or at least fully understanding, a simple 8-bit single-board computer (SBC); I’ve found that there are many things that go on in hardware design (even at a high level, such as system address decoding) that will really illuminate how machines and their instruction sets work in a way that you never really see from the software side. (I used to see designs with a lot of “wasted” space for I/O in memory maps and think, “What a waste of space”; now I see ones without that and think, “What a waste of logic gates.”)
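
To make the address-decoding point concrete, here is an emulator-style sketch in C of the kind of memory map a simple 8-bit SBC might use; the particular map (32K of RAM, a 256-byte I/O window, ROM at the top) is invented for illustration. Note how coarse the I/O decode is: one comparison selects a whole 256-byte window, which in hardware is a gate or two, whereas decoding each peripheral register exactly would cost many more. That is the “wasted address space versus wasted logic gates” trade-off.

```c
/* Emulator-style sketch of the address decoding a simple 8-bit SBC
   might use. The memory map (32K RAM, a 256-byte I/O window at $8000,
   8K ROM at $E000) is invented for illustration. */
#include <stdint.h>

static uint8_t ram[0x8000];             /* $0000-$7FFF */
static uint8_t rom[0x2000];             /* $E000-$FFFF */

static uint8_t io_read(uint8_t reg) {   /* stub for a peripheral chip */
    (void)reg;
    return 0x00;
}

uint8_t bus_read(uint16_t addr) {
    if (addr < 0x8000)                  /* A15 low: select RAM           */
        return ram[addr];
    if ((addr & 0xFF00) == 0x8000)      /* $80xx: the whole I/O "window" */
        return io_read(addr & 0xFF);
    if (addr >= 0xE000)                 /* A15..A13 high: select ROM     */
        return rom[addr - 0xE000];
    return 0xFF;                        /* unmapped: open bus            */
}
```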

There’s no need for soldering to do this; with reasonable care (and sticking to low clock speeds) you can build such a system on a breadboard. (But use high-quality breadboards to avoid pain!) Ben Eater has an excellent series of videos on doing this; merely watching them will probably be immensely educational. Nor do you need to use that system for all future projects; you can do your initial programming on such a system and upgrade to something more sophisticated (such as a more capable pre-built SBC, or an Amiga, the latter of which already hides a lot more complexity than is hidden by chips like a 6502 and its peripherals) when you find your programs are getting too big or complex for your little 8-bit system.

That said, the From Nand To Tetris approach is certainly quite valid, too, and probably as good if you want to slightly gloss over some of the hardware details (though it also does get more deeply into other hardware details, such as CPU design). It may be as much a matter of preference and previous experience as anything else; I not only really like playing with “real” hardware, but came to this with a strong theoretical background already so that the actual “hands on” experience with hardware was what I was missing.

Yes, this is quite a good idea; it’s portable, self-contained with a decent keyboard and display, has a serial port that makes loading/saving and cross-development easy, and detailed technical documentation is easily available for both the hardware and software. I’m not particularly fond of the 8080 CPU architecture (the 6800 and 6502 are both easier to learn and use, IMHO), but that could be as much my bias as reality. (I may also be biased in my liking for these machines; I owned and very much enjoyed a Model 100 back in the '80s, enough that I have recently bought one again, as well as a NEC PC-8201.)


And now some verging-on-off-topic notes on other comments here. If you want to get into the details of any of this, it may be better to branch off into a new thread for it.

The PDP-11 never had segmentation in the way the 8086 did; it was a 16-bit machine with a 16-bit address space (not at all unreasonable for the time) which later used a few bank-switching-like tricks to expand that a bit. The architecture was never intended to support address spaces larger than 64 KB, as the 8086 was.

For the 8086, it’s clear that one of the primary design criteria was to easily be able to replace the 8080 (or, to a lesser degree, the Z80) in embedded systems. If you had an 8080-based board and software with decent (and fairly typical) design, but with memory use bursting at the seams, it required minimal and fairly easy design changes to create a new board with an 8088 and reassembled software (with very minor tweaks) that solved that problem in most cases by easily letting you split your code, data and stack into three separate segments of up to 64 KB each. I go into more technical detail on this in this Retrocomputing SE answer. The 16-byte segment offsets were really more of a nice hack that theoretically allowed use of a 1 MB address space while still allowing efficient use of memory (which was still expensive at the time) for such upgrades.
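
To put numbers on that, the 8086 physical-address computation is just segment × 16 + offset; a quick C sketch shows how many segment:offset pairs alias the same physical address:

```c
/* The 8086 physical address calculation: segment * 16 + offset,
   giving a 20-bit (1 MB) space. Many segment:offset pairs alias
   the same physical address. */
#include <stdint.h>
#include <stdio.h>

static uint32_t phys(uint16_t seg, uint16_t off) {
    return (((uint32_t)seg << 4) + off) & 0xFFFFF;   /* wrap at 1 MB */
}

int main(void) {
    printf("%05X\n", (unsigned)phys(0x1234, 0x0010));   /* 12350      */
    printf("%05X\n", (unsigned)phys(0x1235, 0x0000));   /* also 12350 */
    return 0;
}
```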

(Further discussion of either of these should probably go to either the Thoughts on address space extensions thread for general discussion or a new thread for discussions of specific schemes.)

Well, I’d say lower cost microprocessors; even at $150 or more, the microprocessors of the day were a significant cost reduction over anything else available. Nor would I say that the 6502 “trailblaze[d] the home microcomputer revolution”: that was clearly already under way. One of the U.S. 1977 Trinity used a Z80 and in some markets, such as Japan, the 6502 had little impact. (The thriving Japanese microcomputer market was almost exclusively Z80- and 6800/6809-based.) Even in the U.K., until the Commodore 64 really took over, the 6502 was used mostly in mid- to high-end microcomputers, with the Z80 used in the low end.

3 Likes