Collapse OS for Z80

The PDP-8 is pretty quick, given that this is BASIC. I guess it’s at least as fast as any classic late-1970s or early-1980s micro. But the PDP-8 was also using pretty fast core memory (1.5 microseconds cycle time).

However, the PDP-8 cards already used DIP-packaged components; compare this image from the Small Computer Handbook, p. 383 (1967/68):

[Image: PDP-8 cards, from the Small Computer Handbook, p. 383]


NoLand - agreed, the example I see in the video is a PDP-8e from 1972, which by that time used 7400 series ICs.

The earlier models from 1965 used discrete components on very low component-density, modular PCBs, plugged into a backplane - which was machine wire-wrapped.

One module contained a 1-bit slice of the ALU, the Accumulator, the Program Counter, and the Memory Buffer and Memory Address registers. Twelve such boards completed the heart of the logic design.

The early PCBs have been reverse-engineered - and transcribed into EagleCAD - so that they could be remanufactured.

Much of the PDP-8 design was concerned with the complexity of accessing the core memory. The cycle time was dictated by the core memory, and there were only minor improvements, from 1.5 µs in 1965 to 1.2 µs by 1974.

There’s a good site here that follows the restoration of an early PDP-8 - documenting much of the electronics.

https://www.pdp8.net/straight8/functional_restore.shtml


One thing the PDP-8 illustrates rather nicely is that you need fast memory in order to operate with smaller word sizes and smaller instruction set encodings. Just a few years before, the memory cycle time of the much more expensive PDP-1 was 5 microseconds, which provided suitable processing speed with 18 bits and 5-bit instructions. The reduction to 12 bits was only viable as memory became about 3 times faster (see the rough numbers sketched below). The same is probably also true for 8-bit computing. Meaning, the more limited your means of production and the longer the resulting cycle times, the more complex your architecture will be – and the higher your part count.
(This is also where some of the early, simpler machines like the LGP-30 shine, providing suitable processing with simple components and a simple architecture. And yes, this means some pretty extensive arrays of diodes – the equivalent of encoding logic in ROMs.)
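To put some rough numbers on that word-size vs. cycle-time trade-off, here’s a back-of-the-envelope sketch using only the nominal cycle times quoted above (nothing more precise is implied):

```python
# Back-of-the-envelope comparison of effective core-memory bandwidth,
# using the nominal cycle times quoted above.
machines = {
    "PDP-1 (1959/60)": {"word_bits": 18, "cycle_us": 5.0},
    "PDP-8 (1965)":    {"word_bits": 12, "cycle_us": 1.5},
}

for name, m in machines.items():
    mbits_per_sec = m["word_bits"] / m["cycle_us"]   # bits per microsecond == Mbit/s
    print(f"{name}: {mbits_per_sec:.1f} Mbit/s "
          f"({m['word_bits']} bits every {m['cycle_us']} us)")

# PDP-1: 3.6 Mbit/s, PDP-8: 8.0 Mbit/s -- the 12-bit machine still moves more
# than twice the data per second, because its memory is about 3.3 times faster.
```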


Marc Verdiell has a nice video on this, showing how complex the matter of addressing core memory is and what feats of electronic engineering are involved:


NoLand - thanks for the explanation about word size and memory access time. I had not realised that there was such a direct correlation - but it explains why the very early machines, e.g. EDSAC, had large word sizes, which seemed a little adventurous for the day.

I guess, if your memory is slow, you want a correlation of one CPU cycle == one memory cycle, and you want to accomplish something viable in that cycle. Take, for example, an 8-bit MPU like the 6502, with instructions taking anything from 3 to 7 cycles and about 4.5 on average, involving stitching up addresses from consecutive reads and writes – and then having to do this multiple times for, say, an addition with suitable precision. You probably don’t want to do this with slow memory.
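As a concrete illustration, here’s a rough cycle budget for a single 16-bit addition on a 6502 – a sketch only, assuming the usual CLC/LDA/ADC/STA sequence with zero-page operands and the documented cycle counts for those addressing modes:

```python
# Cycle budget for one 16-bit add on a 6502, operands in zero page:
#   CLC ; LDA lo1 ; ADC lo2 ; STA lo_sum ; LDA hi1 ; ADC hi2 ; STA hi_sum
routine = [
    ("CLC", 2),
    ("LDA zp", 3), ("ADC zp", 3), ("STA zp", 3),   # low byte
    ("LDA zp", 3), ("ADC zp", 3), ("STA zp", 3),   # high byte
]
cycles = sum(c for _, c in routine)

memory_cycle_us = 1.0   # assumed: CPU clock tied 1:1 to a 1 us memory cycle
print(f"{cycles} cycles ~ {cycles * memory_cycle_us:.0f} us per 16-bit add")
# 20 cycles, i.e. 20 bus accesses -- with multi-microsecond memory this
# multiplies out quickly for anything beyond single-byte arithmetic.
```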

I suppose we should aspire to build an EDSAC before we aspire to an LGP-30. But the earlier machine uses 3000 valves/tubes - the later, simpler machine is surely an easier build. “The LGP-30, standing for Librascope General Purpose and then Librascope General Precision … contained 113 electronic tubes and 1450 diodes” whereas “the Bendix G-15 computer was introduced in 1956 [and] has 180 vacuum tube packs and 300 germanium diodes.” We know we can make diodes at home, using copper oxide:

Regarding the LGP-30, have a look at this still from the YouTube video “General Precision LGP 30 Computer Oldtimer” – the entire frame on top of the computer is a single array of diodes:

[Image: the diode array on top of the LGP-30]

And here’s a similar array for the read-write logic of the drum (from a later model):

Notably, solid state diodes only emerged at the end of WWII.


Cat’s-whisker (solid state) diodes have been around forever. The Amateur Scientist column from Scientific American may be useful for ideas. Organic (plastic) transistors may be another option. Negative-resistance devices like NE-2 tubes and zinc oxide devices would also make an interesting computer. Mass storage is easy - just have a good set of machine tools. Mother Earth News is a good read on other topics.

To my knowledge, operable diodes were among the first vacuum tubes, and crystal diodes were first developed for RADAR applications in 1945. Am I missing something?
(The semiconductor diodes used in detector radios varied greatly in quality, which is why there was that pin to find a suitable spot on the material that would do the job. I wouldn’t consider this within the margins required for automatic operation.)

Computer Engineering: A DEC View of Hardware Systems Design by Gordon Bell - the PDF is good reading. The PDP-8 is a modernised version of the PDP-5, a machine designed to be 12 bits wide.
From the same book, the price of semiconductor memory, in cents per bit, is p = 0.3 * 0.72^(year - 1974); core memory was about 1 cent per bit at the same time. The PDP-8/E, PDP-11 and PDP-15 all came out in 1970, with the PDP-15 having the fastest memory and the biggest price, the PDP-11 lower-cost memory, and the PDP-8/E the lowest speed and lowest cost: a 1.2 µs cycle time versus 0.8 µs for the PDP-15.
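For reference, evaluating that price curve for a few years - a quick sketch; the formula and the ~1 cent/bit figure for core are as quoted above from Bell’s book:

```python
# Semiconductor memory price per bit (in cents), per the curve quoted
# above from Bell's "Computer Engineering": p = 0.3 * 0.72^(year - 1974)
def price_cents_per_bit(year):
    return 0.3 * 0.72 ** (year - 1974)

for year in (1974, 1976, 1978, 1980):
    print(f"{year}: {price_cents_per_bit(year):.3f} cents/bit")
# 1974: 0.300, 1976: 0.156, 1978: 0.081, 1980: 0.042
```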
The current CPU I am building has 1.5 µs core memory in a DE1 FPGA development kit and is 18 or 20 bits wide. The design is from late 1974, permitting the use of MSI TTL for the data path and 256x4 PROMs for microcode decoding. The current version is 18 bits, but design ideas change often - thus FPGA development, rather than cards of 74LSxx chips. Byte addressing and other factors led to a 9-bit byte, so that a more complete instruction set could be implemented, including load constant on condition and branch on condition. Push/pop are NOT implemented, but indexing off the SP is permitted. Indexing and immediate are the only two addressing modes. The design is my version of a computer to compete with the PDP-11 and the other small PDPs, as well as the IBM 1130 and S100 systems.


Now compare the 1.5 µs (1965) of the original PDP-8 (1.2 µs by the 1970s) to the 5 µs of the PDP-1 (1959/60). The developments in the production of cores (they shrank considerably) and in interfacing core memory (not that trivial) were enormous, and in terms of speed gains they exceeded anything Moore’s Law would suggest, at least in the early years.

Speaking of bootstrapping technology after a hypothetical collapse, I’m not so sure this could be accomplished easily and at a reasonable price. There are stories of early European machines, when cores were still “huge”, of core memory actually having been knitted by some elderly lady. Still, you had to come up with the cores in the first place and manage the engineering of the interfacing circuitry. (However, as indicated before, this is probably not the problem in this scenario, since, if there are plenty of salvageable Z80s left, there are probably also some of the interfacing RAM packages left, probably on the very same board.)

Otherwise: Core memory on an FPGA? Is this emulated core memory or are there FPGAs with real core memory? The project is interesting, let us know about the progress. (Personally, I always liked the idea of coming up with a slightly updated PDP-1.)

Well, the core memory just provides the architectural model, with the read and write-back cycle of main memory. The static memory on the FPGA development kit is hacked together with the CPU logic so it all fits timing-wise and has a speed that is normal for core memory of that era.
An early instruction format was [B][OP][ACC][INDEX][OFFSET], with B = byte/word, OP = opcode, ACC = JCC/A/X/S, INDEX = PC/zero/X/S, and a shift-AC operation packed in there as well, plus the classic front panel.
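Just to illustrate how such a word could pack into 18 bits, here is a sketch - the field widths (1/4/2/2/9) are only an example, not the real layout:

```python
# Hypothetical packing of an 18-bit word laid out as [B][OP][ACC][INDEX][OFFSET].
# All field widths below are illustrative assumptions, not the actual design.
FIELDS = [("B", 1), ("OP", 4), ("ACC", 2), ("INDEX", 2), ("OFFSET", 9)]  # = 18 bits

def encode(**v):
    word = 0
    for name, width in FIELDS:              # most significant field first
        value = v.get(name, 0)
        assert 0 <= value < (1 << width), f"{name} out of range"
        word = (word << width) | value
    return word

def decode(word):
    fields = {}
    for name, width in reversed(FIELDS):    # peel fields off the low end
        fields[name] = word & ((1 << width) - 1)
        word >>= width
    return fields

w = encode(B=1, OP=0b0011, ACC=2, INDEX=1, OFFSET=0o177)
print(f"{w:06o}", decode(w))                # octal, as befits an 18-bit machine
```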

I hope we will never have to use a post-apocalyptic OS. But I wonder: if all computers are destroyed and someone has to use a Z80 from a toy, how or where should they find or make this OS? And what about a monitor, power supply, etc.? You would also need other things, like food and water.

Another old thread surfacing - but this one is easy: how much computing could be done with a 4 MHz Z80 and a serial terminal? Well, quite a bit WAS done on such systems running the CP/M operating system - with many options for ‘self-hosting’ too: assemblers, BASIC, C, FORTRAN and COBOL compilers (they may not have been as full-featured as on more capable systems, but they were there). Not to mention databases, editors (WordStar!) and so on.

Networking didn’t quite make it before those systems faded from common use - but there were various file-transfer schemes using modems, null modems and so on.

Mass storage - I had a North Star Horizon with a 10 MB fixed drive, as well as floppies, at the university I worked in; we used it for industrial control applications. (But I have to say, BBC Micros were easier to use, program and interface as soon as we got them!)

However, in the “post-collapse” scenario, who knows - maybe my small collection of mechanical calculators would be valuable then… (Shades of “Logopolis” from Doctor Who, with people working as ‘registers’ and computational nodes overseen by a ‘Monitor’. Actually, didn’t Feynman do that with people and mechanical calculators in parallel as part of the Manhattan Project?)

-Gordon


Is it just me or did some discussion of the Z-80 underlie the technical aspects of that script? I remember some sort of breakthrough moment where the Doctor (or somebody) realizes that they can use “block moves” to get past a problem. To me that is straight out of a list of potential advantages of the Z-80 – the “block move” instructions like LDIR.

BTW, while I’m a big fan of the Z-80, I think the block moves provide some code-compactness advantages but aren’t that big a win when it comes to speed. I guess unrolled LDI instructions will beat an 8080, but POP/PUSH are the ones that go fastest, and even then it isn’t clear you’re beating a 6800/6502/6809 in terms of moving memory.
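To put rough numbers on that, here’s a quick comparison at 4 MHz using the standard Z80 T-state counts - a sketch only; real throughput also depends on loop setup, wait states, and whether interrupts can be held off for the POP/PUSH trick:

```python
# Rough Z80 block-move throughput at 4 MHz, from the documented T-state counts.
CLOCK_HZ = 4_000_000

t_states_per_byte = {
    "LDIR":          21,             # 21 T per repeated byte (16 on the last)
    "unrolled LDI":  16,             # 16 T per byte
    "POP/PUSH pair": (10 + 11) / 2,  # two bytes per POP+PUSH = 10.5 T per byte
}

for name, t in t_states_per_byte.items():
    print(f"{name:13s} ~{CLOCK_HZ / t / 1024:4.0f} KB/s")
# LDIR ~186 KB/s, unrolled LDI ~244 KB/s, POP/PUSH ~372 KB/s -- the stack trick
# is the fast one, but it ties up SP and needs interrupts handled with care.
```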

This thread makes me think of The Three-Body Problem, which is sort of “pre-apocalyptic”. They don’t have time or the natural resources to actually design and build computers, so the “emperor” simply emulates one, using his army. Every guy has a couple of flags and a set of rules for what to do with them. Computing is just a big logistics problem. :-)

Obviously, old computers could (and did) do very many useful things. But “useful” is context-dependent. What kind of apocalypse would leave the world in a state where computing was still useful but deny the use of the last 30 years of technology? I mean, the MacBook Pro that I’m writing this post with is 10 years old; I have 20-year-old ThinkPads with WinXP and Ubuntu that I still actually use (running old software, hobby development), and 30-year-old desktops that are kind of crusty but still work. In what sort of apocalypse would I use my RC6502 to save the world and not my ThinkPad X23?

The only one I can think of is some kind of AI uprising (see Battlestar Galactica!) where modern tech is vulnerable, then strictly controlled, so that the underground resistance must reinvent it in secret using a sort of “medieval computing” - blown-glass vacuum tubes, forged wires, etc.

Workers calculate rows of pixels, return their sheets to the front for pixel rendering, and hang their completed worksheets on the wall

Human computing was a human-powered computer.

Collapse Computing

Frugal Computing: Utilizing computational resources as finite and precious, to be utilised only when necessary, and as effectively as possible.
Salvage Computing: Utilizing only already available computational resources, to be limited by that which is already produced.
Collapse Computing: Utilizing what has survived the collapse of industrial production or network infrastructure.

Collapse computing is at the tail end of not being able to make new hardware. A post-collapse society that has eventually lost all of its artificial computing capacity may still want to continue the practice of computer science at a purely theoretical level, as a form of mathematics.

Eventually you can’t assemble a ThinkPad battery, fix the MacBook’s USB ports, or make a new power supply from duct-taped salvaged parts.

Collapse informatics is software engineering that takes advantage of today’s abundance of computing power to prepare for a future in which current infrastructures have collapsed.


Global Thermonuclear War… i.e. the M.A.D. scenario, where it’s not just the radioactive fallout but the EMPs generated that will render a lot of electronic devices useless…

However, the knowledge will still live on in books (oops, let’s keep paper copies and not store them in the cloud!) and in living memories…

Shall we play a game?

(excuse me while I shield my 6502s with tinfoil hats…)
-Gordon

The paper copy for a retro-transistor:
Scientific American, “The Amateur Scientist” - Transistors: how to make thin film, June 1970, p. 141.