Collapse OS for Z80

This may be interesting:

[Collapse OS] is a z80 kernel and a collection of programs, tools and documentation that allows you to assemble an OS that can:

  • Run on minimal and improvised machines.
  • Interface through improvised means (serial, keyboard, display).
  • Edit text files.
  • Compile assembler source files for a wide range of MCUs and CPUs.
  • Read and write from a wide range of storage devices.
  • Replicate itself.

Additionally, the goal of this project is to be as self-contained as possible. With a copy of this project, a capable and creative person should be able to manage to build and install Collapse OS without external resources (i.e. internet) on a machine of her design, built from scavenged parts with low-tech tools.

https://collapseos.org

HN discussion: Collapse OS | Hacker News

5 Likes

Interesting idea! Planning for post-collapse computing is one way to constrain implementations. (One-page computing being another.) I think I’d vote for a relay-based 16-bit RISC. I can imagine building relays, much more so than vacuum tubes or transistors, let alone integrated circuits. But there’s still a lot of work and ingenuity needed to bootstrap from machine code to assembly to compiler. And there’s always the question of what the role of computing is in a post-collapse world. For myself, the satisfaction of programming and some exploration of mathematics would be enough. I wouldn’t expect to use a computer for communication, and possibly not even for entertainment. Much more interesting to share stories around a fire than to play Tetris!

As a related question: on your desert island, if your chosen luxury is a computer, what would you want to use it for?

1 Like

An interesting idea - but you have to ask a few questions first:

Realistically, how much computing could be done on, say, a 4 MHz Z80 with 64K of memory and a serial terminal interface? Unhindered by the need to generate video, the Z80 might be quite lively compared to the Sinclair and Jupiter ACE offerings of 35+ years ago.

What would you use for mass-storage? I see this as being a major stumbling block.

Would you really need an operating system? Such a machine would likely be single-application only.

Making it self-hosting, with all debug tools available, would be useful - especially if it is your only machine.

This might be a candidate for a 16-bit virtual machine and some variant of Forth.
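As a rough illustration of the shape such a thing might take, here’s a toy sketch in Python - the word names and encoding are invented for the example, and have nothing to do with Collapse OS’s actual design:

```python
# Toy sketch of a 16-bit stack VM with a few Forth-like words.
# Word names and encoding are invented for illustration only.

MASK = 0xFFFF  # all arithmetic wraps to 16 bits

def run(program):
    stack, pc = [], 0
    while pc < len(program):
        op = program[pc]; pc += 1
        if op == "lit":                       # push the next cell as a literal
            stack.append(program[pc] & MASK); pc += 1
        elif op == "+":                       # add the top two cells, mod 2^16
            b, a = stack.pop(), stack.pop()
            stack.append((a + b) & MASK)
        elif op == "dup":                     # duplicate the top of stack
            stack.append(stack[-1])
        elif op == ".":                       # pop and print the top of stack
            print(stack.pop())
    return stack

# Forth-style "2 3 + dup . ." -- prints 5 twice
run(["lit", 2, "lit", 3, "+", "dup", ".", "."])
```

A real implementation would keep the dictionary, stacks and threaded code within the 64K address space, of course - this just shows how little machinery the core interpreter needs.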

Having just acquired an RC2014 “Micro” Z80 minimal dev-board at a recent conference, and it being at least 30 years since I last touched a Z80 - this might be an interesting side-project.

1 Like

Well, move to a 6809 and run a better OS than CP/M. A lot can be done in 64KB, other than a bitmapped display - I like text windows better than bitmapped displays anyway. 32KB to 64KB was all the memory you could afford from 1975 to 1985. As for end-of-the-world computing, the 6502 has been transistorized. Me, I am stockpiling guns for when the UN-DEAD show up.
Self-hosting is always a pain for 8-bit micros, because the software was cross-compiled on BIG MACHINES in Fortran with 128K, so you could control your toaster or washing machine or run a fancy calculator.

1 Like

Reads the web site. The real problem is not consumer but industrial products. The power plants stop running because some EPROM that was substituted for a cheaper brand dies and cannot be replaced because of politics from 5 years ago.
If you really wish to restart technology, it must be with the late 1970s and less cutting-edge technology: 8K DRAMs, a 4-bit slice (a 2901 with only 8 registers), 512x4 bipolar RAM, a simple PAL in a 22-pin package (clear & output #9), and some other brand of BASIC than Microsoft’s.

I was thinking this, too. Most of the simpler media have been out of production for decades, and drives, if available, are probably suffering from some state of decomposition of the plastic and rubber parts involved (belts, capstans, gears, etc.). Not necessarily on an individual basis, but severely enough to be a real issue if you’re going to base your technology on them.

(How would you have this system ready? In an in-case-of-emergency-break-glass box on the wall? But in what format, on what media? Floppy? 8"? Tape in Kansas City format? Probably, for a start, a set of versatile EPROMs would do, and then the self-replication kicks in. But then, you hit the storage media wall again…)

P.S.: Paper tape may be a good starting point for a bootstrapping process. The medium is easily available (any kind of film, etc.) or easy to produce. However, the mechanisms involved have become something of a black art, and we would need some guerilla-style DIY reader-punch.

The big challenge in the history of computing was storage: as we know, Zuse used mechanical means, some early machines used recirculating sound waves in mercury (not gin, although it was considered), and then the magnetic drum offered useful amounts of storage, rotating rather than recirculating. You need pretty accurate fabrication to make any of those work. The Harwell Dekatron machine used gas-discharge counting tubes, and paper tape. I think there’s a fighting chance for an amateur to get a paper tape system working. You can of course make a loop from tape. And keep a library of useful routines, which you can link together.

When random access memory arrived, it was in the form of the Williams-Kilburn tube, and then the magnetic core memory. In all cases you need pretty good signal handling - which means either transistors or valves - both of which are a challenge. Although you can make valves at home…

You are totally right, memory/storage was the big challenge. (And, maybe, power supplies, which were just nascent technology – which is why we have binary, providing as much tolerance as possible. Here, after the collapse, that’s a good thing.) However, in the given scenario, where there are plenty of Z80s to salvage, there are probably plenty of related memory chips to salvage, too. And, once you have figured out the production process, these low-density arrays are probably not that difficult to reproduce. Otherwise, things probably become prohibitively expensive (like coating magnetic drums and aligning and synchronizing precision read-write heads and motors).

P.S.: Regarding home-made vacuum tubes: just consider the number of tubes you’d actually need to produce, and the amount of energy this contraption would consume. Unless you provide your own power plant, this may not be that viable…

Technically, anything that is ‘Turing complete’ could be used for computing - like, say, Magic: The Gathering. :slight_smile:

There may be enough cards floating around, post-apocalypse, to do some computing.

But I don’t think we need to descend to the level of XKCD’s rock computing!

1 Like

And this is, where Virtual Card Read-Punch becomes relevant… :wink:

1 Like

Assume, post-collapse, that we have access to traditional wire-ended components - it would be possible to make up a series of diode-transistor logic (DTL) modules, loosely based on what we call the 7400 TTL series.

With these basic modules, more complex systems could be created, such as counters, adders, multiplexers and registers.
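For illustration, here’s a minimal gate-level sketch in Python of how such modules might compose into an adder - the electrical details (diodes, pull-ups, fan-out) are abstracted away, only the logic is shown:

```python
# Building an adder from NAND modules alone, gate by gate.
# Purely logical model; the DTL electrical details are abstracted away.

def nand(a, b):
    return 1 - (a & b)

def xor(a, b):                 # the classic four-NAND XOR
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def full_adder(a, b, cin):     # returns (sum, carry-out)
    s1 = xor(a, b)
    carry = nand(nand(a, b), nand(s1, cin))   # (a AND b) OR (s1 AND cin)
    return xor(s1, cin), carry

assert full_adder(1, 1, 1) == (1, 1)   # 1+1+1 = binary 11
assert full_adder(1, 0, 0) == (1, 0)   # 1+0+0 = binary 01
```

Chain the carry through n of these and you have an n-bit ripple-carry adder - about nine NAND gates per bit once the duplicated gates are shared, which gives a feel for the board count involved.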

DTL emerged in the early 1960s - when the cost of transistors was still very high. Transistor count was kept to a minimum, with all AND/OR logic implemented in diodes, as seen in the classic PDP-8, which used 1409 transistors and 10,148 diodes.

The video shows a PDP-8e executing a BASIC program to plot a sine curve. IMHO it’s not too much of a slouch, compared to my memories of my early Sinclair ZX81 machine.

Perhaps it might be possible to return to serial memory based on bistable flip-flops. The classic bistable uses 2 transistors, 2 diodes, 2 caps and 8 resistors. With 14 discrete components needed to store a bit, and taking up around 1 square inch of PCB area, it is conceivable that register files of, say, 256 bits could be created in this manner.

With access to many registers, architectures and software could be devised that reduced the need for external storage.
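A quick back-of-envelope check of that estimate, using the figures above:

```python
# Bill-of-materials estimate for a discrete register file,
# using the post's figures: 14 parts and ~1 sq. inch per bit.

parts_per_bit = 2 + 2 + 2 + 8            # transistors + diodes + caps + resistors
bits = 256                               # e.g. sixteen 16-bit registers

print(parts_per_bit * bits, "components")   # 3584 discrete parts
print(bits, "sq. inches of PCB")            # roughly 0.165 square metres
```

Large, but not absurd - well under the part count of the discrete PDP-8 mentioned above.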

2 Likes

The PDP-8 is pretty quick, given that this is BASIC. I guess it’s at least as fast as any classic late-1970s or early-1980s micro. But the PDP-8 was also using pretty quick core memory (1.5 microseconds cycle time).

However, the PDP-8 cards already used DIP-packaged components; compare this image from the Small Computer Handbook, p. 383 (1967/68):

[Image: PDP-8 cards, from the Small Computer Handbook, p. 383]

1 Like

NoLand - agreed, the example I see in the video is a PDP-8e from 1972, which by that time used 7400 series ICs.

The earlier models from 1965 used discrete components on very low component-density modular PCBs, plugged into a backplane, which was machine wire-wrapped.

One module contained a 1-bit slice of the ALU, Accumulator, the Program Counter, the Memory Buffer and Memory Address registers. 12 such boards completed the heart of the logic design.

The early pcbs have been reverse engineered - and transcribed into EagleCAD - so that they could be remanufactured.

Much of the PDP-8 design was concerned with the complexity of accessing the core memory. The cycle time was dictated by the core memory. There were minor improvements in cycle time, from 1.5 microseconds in 1965 to 1.2 microseconds by 1974.

There’s a good site here that follows the restoration of an early PDP-8 - documenting much of the electronics.

https://www.pdp8.net/straight8/functional_restore.shtml

2 Likes

One thing the PDP-8 illustrates rather nicely is that you need fast memory in order to operate with smaller word sizes and smaller instruction encodings. Just a few years before, the memory cycle time of the much more expensive PDP-1 was 5 microseconds, which provided suitable processing speed with 18 bits and 5-bit instructions. The reduction to 12 bits was only viable as memory became about 3 times faster. The same is probably also true for 8-bit computing. Meaning, the lesser your means of production and the longer the resulting cycle times, the more complex your architecture will be – and the higher your part count.
(This is also where some of the early, simpler machines like the LGP-30 shine, providing suitable processing with simple components and a simple architecture. And yes, this means some pretty extensive arrays of diodes – the equivalent of encoding logic in ROMs.)
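To put rough numbers on that argument (Python just doing the arithmetic, with the cycle times quoted above):

```python
# Words per second and bits per second through core memory,
# comparing the two machines discussed above.

pdp1_bits, pdp1_cycle = 18, 5.0e-6     # PDP-1: 18-bit words, 5 us core
pdp8_bits, pdp8_cycle = 12, 1.5e-6     # PDP-8: 12-bit words, 1.5 us core

print(1 / pdp1_cycle)                  # 200,000 words/s
print(1 / pdp8_cycle)                  # ~667,000 words/s, ~3.3x faster
print(pdp1_bits / pdp1_cycle / 1e6)    # 3.6 Mbit/s through memory
print(pdp8_bits / pdp8_cycle / 1e6)    # 8.0 Mbit/s through memory
```

So the PDP-8 needs more memory cycles per useful operation, but the faster core more than makes up for the narrower word.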

1 Like

Marc Verdiell has a nice video on this, showing how complex the matter of addressing core memory is and what feats of electronic engineering are involved:

2 Likes

NoLand - thanks for the explanation about word size and memory access time. I had not realised that there was such a direct correlation - but it explains why the very early machines, e.g. EDSAC, had large word sizes, which seemed a little adventurous for the day.

I guess, if your memory is slow, you want a correlation of one CPU cycle == one memory cycle, and you want to accomplish something viable in this cycle. Take, for example, an 8-bit MPU with instructions taking anything from 3 to 7 cycles, about 4.5 on average, like the 6502, involving stitching up addresses from consecutive reads and writes. And then having to do this multiple times for, say, an addition with suitable precision. You probably don’t want to do this with slow memory.
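To make that concrete, here are the documented 6502 cycle counts for a 16-bit addition with all operands in zero page (the Python is just doing the adding-up; on the 6502 every cycle is a bus access, so the memory cycle time sets the pace directly):

```python
# A 16-bit add on the 6502, all operands in zero page.
# Documented cycle counts per instruction.

seq = [("CLC", 2),                                       # clear carry
       ("LDA lo1", 3), ("ADC lo2", 3), ("STA lo3", 3),   # low bytes
       ("LDA hi1", 3), ("ADC hi2", 3), ("STA hi3", 3)]   # high bytes + carry

cycles = sum(c for _, c in seq)
print(cycles, "cycles")              # 20 cycles
print(cycles, "us at a 1 MHz bus")   # i.e. 20 microseconds per 16-bit add
```

Halve the memory speed and the add takes twice as long - there is no cache or prefetch to hide behind.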

I suppose we should aspire to build an EDSAC before we aspire to an LGP-30. But the earlier machine uses 3000 valves/tubes - the later, simpler machine is surely an easier build. “The LGP-30, standing for Librascope General Purpose and then Librascope General Precision … contained 113 electronic tubes and 1450 diodes”, whereas “the Bendix G-15 computer was introduced in 1956 [and] has 180 vacuum tube packs and 300 germanium diodes.” We know we can make diodes at home, using copper oxide:

Regarding the LGP-30, have a look at this still from General Precision LGP 30 Computer Oldtimer - YouTube – the entire frame on top of the computer is a single array of diodes:

[Image: the LGP-30, with the diode array frame on top]

And here’s a similar array for the read-write logic of the drum (from a later model):

Notably, solid-state diodes only emerged at the end of WWII.

1 Like