The Golden Age of Retrocomputers

Regarding Engelbart, “The Mother of All Demos”, and NLS: This was serious retro hardware!

The computer was an SDS 940 (which originated from the Project GENIE timesharing system at Berkeley), featuring paged copy-on-write memory. In its most basic form, I/O was handled by TTY, but the visual I/O seen in the demo was another story. The “Special Devices Channel” projected the output on multiple calligraphic CRTs, which were timeshared between users. The images produced by these CRTs were picked up by commercial video cameras, amplified, and then sent via a closed-circuit TV network to user-facing raster-scan CRTs (basically monochrome TVs without a tuner). The system featured two display generators, each driving up to 8 CRTs. The controller logic for this was built from vacuum tubes and reportedly required “one and a half people to keep those things running all the time” (Engelbart). On the plus side, sessions could easily be (screen-)shared, as could external sources, like video streams.

Here’s a block diagram of the Special Devices Channel:

(Image, as well as technical info on NLS hardware, drawn from “Bootstrapping” by Thierry Bardini.)

It’s an interesting solution, combining deliberate design choices with unorthodox uses of technology dictated by budget constraints.


This is a rather important, though mostly overlooked, application of Moore’s Law, which also works in reverse: if you throw enough money at it, you can buy yourself a window into the future (of what may become generally affordable technology).

I disagree. Memory speed and density are two important factors that cannot change much per generation. New designs take advantage of the cost coming down, not of higher speed or density. Made-up pricing: 8080 IC $150 each, 8086 $200, 286 $250, 386 $350, 486 $450, 586 (bug discount) $500.

The NLS demo used an Eidophor projector for the large-screen video presentation in the conference hall.

The Eidophor is a fascinating retro-technology in its own right - involving static charged oil films as a primitive light-valve and “windscreen wipers” to refresh the oil film.


From Wikipedia’s Motorola 68000 article:

IBM considered the 68000 for the IBM PC but chose the Intel 8088 because the 68000 was not ready.

Walden C. Rhines wrote that thus “Motorola, with its superior technology, lost the single most important design contest of the last 50 years”.

I suspect that IBM management wasn’t really interested in making a PC, and chose the cheapest, worst-performing 16-bit processor, with a hobbled 8-bit bus, to keep costs down.

I’m sure some of the management thought that if they only paid lip service to the PC project, it would die a quiet death.

Many words have been written on IBM’s choice of CPU for the PC! My own summary would be that Intel were an existing supplier, the 8 bit bus option was attractive, Intel would second-source, and the chip was ready. As the PC was a famously rapid and independent project, picking a CPU which wasn’t ready, or for which supplies might be disrupted, would be a risk too far.

Edit: the idea that superior technology doesn’t always win the race is an important one which keeps recurring.


Agreed - the 68000 did not really sample until February 1980, with volume not available until November 1980.

However, it is one of those crossroads in computing history - where the temptation is to ask “what if?”.


While this is mostly true, you can still invest in hardened support logic in order to use a promising technology that isn’t ready for production yet. E.g., the Alto used RAM chips when these were still notoriously failure-prone, and introduced a fairly extensive low-level error-correction layer in hardware to cope with this. Regarding density, mind that the Alto was the “interim Dynabook” – an excessive form factor may allow for some liberty.

I think the crucial observation about the interim Dynabook was to throw money at the problem: this was never to be a commercial proposition with a business case behind it. A research machine, not a product. And so the usual constraints of economics don’t apply.

The BBC Micro was in some small way similar: the design used bleeding-edge DRAM chips, available only from one supplier, and not cheap. The expectation was to make mostly the 16k model, not the 32k model, and to hit volumes of 10,000. When, in the event, the BBC Micro was much more popular than that, and the 32k model very much the only one worth making, the price and availability of those bleeding edge chips presumably relaxed a bit.

Which is to say, in the presence of continual exponential improvement, it pays to try to design for the situation you’ll have at the time of sale, not at the time of design. If only you can hold your nerve, and predict the future well enough.

(The IBM PC also misunderestimated the expected sales volumes dramatically: as a consequence, the product was priced higher than it would have been, and so ended up more profitable than it would have been.)

In the early days it was very much over-engineered - after all it was designed and built by a mainframe manufacturer.

I remember the thickness of the steel in those early chassis and the hard-drive suspended on rubber bungee mounts.

My employer started buying them in about 1986 for about £3,000 each, when you could buy a house for £60,000.

Specifically, according to the 68000 design team, the 68008 was not ready, and IBM wanted the cost reduction of an 8-bit external data bus.

This interview with some members of the 68000 team covers that (as well as some other fascinating bits of history, including discussions of working with Sun and Apollo on their workstations, the relation (or lack thereof) between the 68000 and 6809, etc.):


This would have made some sense, since the 68000 was big-endian, like most of the bigger IBM hardware. The 68000 may have been just ready for market introduction, but reportedly Motorola couldn’t provide the 5,000 pre-production samples required for IBM’s internal evaluation process. (At least, this is what I’ve read. I’m not so sure about that quite excessive number of samples; it may be off by an order of magnitude or two.)

Regarding management not being so sure about the PC: mind that IBM struggled with a PC design throughout the 1970s (even before the trinity of 1977). There were several concepts, like “Yellow Bird”; a very promising prototype, “Aquarius”, based on bubble-memory modules, which even made it to the pre-production prototype stage (including a complete marketing concept); and out-sourced design studies (e.g., there’s one by Eliot Noyes Associates based on unknown hardware). After the dismissal of “Aquarius” (apparently owing to fading confidence in bubble memory), upper management apparently just gave up. At some point, just before Project “Chess”, which became the IBM PC, IBM even considered buying Atari and basing its PC on the Atari 800. (At least, there’s a design study for this.)

Some of this (including images) can be found in “Delete.” by Paul Atkinson.

IBM “Yellow Bird” mockup (Tom Hardy, 1976; image: “Delete.”):

Envisioning home computing with “Yellow Bird” in 1976 (image: “Delete.”)

IBM Aquarius (Tom Hardy, 1977; image: “Delete.”):

IBM “Atari PC” design study (Tom Hardy, 1979; image: “Delete.”):

Here’s a sketch for the Noyes Associates project:

Source and further information: “IBM’s Home Computer, 1977” by Dan Formosa

“The prototypes created for IBM were a bit more interesting than their 1981 PC – this 1977 project was envisioned in three versions: beige, deep red, and teak. Teak? Yep – a real wood cabinet.”

It seems, contrary to common belief, IBM didn’t “miss out on the home computer”, they just tried too hard…


Just to note, CHM also published a transcript of this oral history.


It was sort of available at the end of 1980 in that you could get 4MHz versions (like the Lisa used) but had to wait a while longer before the 8MHz chips could be had.

IBM did launch a 68000 based computer about 10 months after the PC. But it was 2 to 3 times more expensive than the PC.


That’s a wild machine @jecel! (A lab machine)


Regarding IBM’s journey to the PC: It’s quite interesting to see IBM shift from home computing to the office (first noticeable in the Noyes Associates study and the “IBM Atari”) in the prototypes and design studies above. However, the “PC”, the IBM 5150, wasn’t the disruptive product it is often considered nowadays. At 4.77 MHz and with an 8-bit motherboard, it was hardly faster than the home machines of the time. (With built-in BASIC in ROM and a cassette port, it also wasn’t that dissimilar from the home computer concept, minus the RF output.) The idea was more about merging the various text-processing systems and the smart terminal, while crucial software, like databases, would still live on the mainframe, tying the PC into the IBM ecosystem. From this perspective, the 5150 didn’t extend much beyond the earlier IBM 5120, but was more cost-effective and came in a smaller form factor. It wasn’t until the i386 that PCs really became what is considered a PC today: a machine capable of shifting business applications from a shared mainframe to a standalone machine. And IBM apparently wasn’t too happy with this. (From this perspective, the original PC was just a short episode before we got “real PCs” in the form of the i386 machines.)
It was really for the business customers, who were longing for standardization in the jungle that procurement had become and were looking to IBM for it, that the PC became this “iconic breakthrough”. (And, in the end, it was probably gaming that won the day for the PC.)

The 68000 workstation linked by @jecel is a different beast. As a lab computer, it is perfectly able to stand on its own feet. However, from a business perspective, there’s quite a difference between a lab machine (where standalone operation is a requirement and cost isn’t that crucial a factor) and business computing (where you want to tie everything into your ecosystem and scale is a factor – also, “there’s a card for that”, the 1980s’ equivalent of the App Store).

Edit: This was probably also where the Lisa and the Mac failed as “serious” business machines: as front-ends to mainframe applications, they didn’t have much to offer in terms of the GUI, apart from slightly worse integration, since it was still a text-based mainframe application behind it. What remained, and what they were really good at, was the production of individual documents of all sorts, but they still failed on the integration aspect. Which may have been just a bit too much of a personal computer. (Apparently, Jobs learned a lot from this; compare NeXT, which was really huge on integration.)

A fascinating oral history, and insight into Motorola’s internal operations - involving several of the key members of the 68000 “delivery team”.

How from out of the downturn of 1975, the Motorola microprocessor group delivered a new architecture, and brought it into volume production.

This involved radical internal changes of culture within Motorola and the building of at least one new MOS factory at the cost of some $800 million.

The engineers witnessed the change from learning about punched cards on mainframes at college, to creating an environment where engineers had powerful workstations on every desk. Workstations that primarily used Motorola 68K family devices.

The video interviewees cover approximately the two decades at Motorola between 1975 and 1995, of which 1980 to 1990 was the decade of the 68000.

The interesting twist to this oral history is that the chairman of this meeting was ex-Intel, and there was a lot of camaraderie between the once rival organisations. They agreed amongst themselves that Intel and Motorola just did things differently, two completely different approaches to solving the same basic problem.

At almost 3 hours long - it’s an excellent bit of viewing for a spare evening.


Related to the original question, I think, a crowdfunding campaign for a documentary:
(About looking at the past to understand the present: many classic photos in the enclosed video. And the line “let’s make sure the computer doesn’t end up like the television” - which makes complete sense to me, but I can imagine it might be dated.)

via Mastodon

“Message Not Understood” is by far the most common error message in Smalltalk.
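For context: in Smalltalk, that error comes from the doesNotUnderstand: hook, which fires whenever an object receives a message it has no method for – and which any class can override. A rough analog can be sketched in Python via `__getattr__` (the `Proxy` class here is a made-up name, purely for illustration):

```python
class Proxy:
    """A rough Python analog of Smalltalk's doesNotUnderstand: hook.

    In Smalltalk, sending a message that no method matches triggers
    doesNotUnderstand:; in Python, a failed attribute lookup falls
    through to __getattr__, which can play a similar role.
    """

    def __getattr__(self, selector):
        # Called only when normal lookup fails, i.e. when the
        # "message" is not understood; return a handler instead
        # of raising AttributeError.
        def handler(*args, **kwargs):
            return f"Message Not Understood: {selector}"
        return handler


proxy = Proxy()
print(proxy.frobnicate())  # prints "Message Not Understood: frobnicate"
```

The Smalltalk version is more powerful, of course: doesNotUnderstand: receives a reified Message object (selector plus arguments) and can forward or resend it wholesale, which is how many Smalltalk proxies and remote-object schemes work.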


There is a great video of a Lilith Emulator here and some photos

On GitHub there’s an emulator for SIMH running on the Interdata 32 emulator, but it needs UNIX V7 installed.
