The Golden Age of Retrocomputers

This topic is intended to encourage some discussion about what we feel might be considered the “ideal retrocomputer”.

Clearly there is no simple answer, and everyone will have their own personal thoughts.

An earlier discussion about memory size, graphics capabilities and operating system requirements suggested that the route we have followed since the introduction of the IBM PC in 1981 has been far from ideal; with a few twists in history, we might have had an entirely different journey and destination over the last 40 years.

I think my historical journey must start with Doug Engelbart’s “Mother of All Demos” from December 1968. The full-length video is available on YouTube.

Engelbart’s work at the Stanford Research Institute had a strong influence on the work done at Xerox PARC, and on the Alto machine they created in order to implement Engelbart’s concepts in reasonably sized (workstation) hardware.

The Xerox Alto is well documented on Wikipedia, and is considered by some to be the first personal computer. Applications included engineering tools - including Mead and Conway’s integrated circuit editor - and document editing tools.

The Alto was originally programmed in BCPL, but it became a strong influence on experimental new software such as Smalltalk-76, as well as on Niklaus Wirth’s related work on the Swiss-built Lilith workstation in the late 1970s and, later, Project Oberon.

All of the elements of advanced GUI-based workstations were available by 1980 - roughly ten years after Engelbart’s NLS demo.

It is documented that Steve Jobs visited Xerox PARC with some Apple engineers sometime in 1979, and their work on the Lisa and Macintosh was reputed to be heavily influenced by what they saw there.

Contrary to popular belief, work on the Lisa and Macintosh had been underway for several months before their visit.

The Macintosh of 1984 could be argued to be a cost-engineered descendant of the Xerox Alto. The introduction of the 68000 in 1979 and cheaper RAM made it possible to gain an order of magnitude in performance over 8-bit designs, and with it the ability to support a GUI.

The basic model Alto, which had cost $32,000 in 1979, had by 1984 been reduced to a $2495 Macintosh, but it faced stiff competition from the IBM PC and the numerous PC clones that were available. Initial Mac sales were hindered by an acute dearth of software titles compared to the by-then well-established PC.

That concludes my Golden Age of Retrocomputers - the decade from 1969 to 1979, which coincides nicely with the first 10 years of Xerox PARC.

4 Likes

A second effort at a worthy response: perhaps the question is “at what time in the past did the future of computing look most promising?”

(My first effort was merely a riff on what retrocomputing means to me)

And yes, I think PARC’s research, and Xanadu, and Englebart, are all in there. The first rash of Doctor Dobb’s Journal from The People’s Computer Company was all about the liberating and empowering aspects of computers - access to computers, not ownership of them. Education, rather than entertainment. Computers as a tool, to improve people’s lives, not as a harness, not as a distraction. And there’s Logo, and the Media Lab.

I don’t have personal experience of any of that, but they seem like optimistic times, times of possibilities.

2 Likes

Agreed, Ed.

My decade might perhaps be referred to as the “Research Decade”, in which the roots of much of what we have now were developed within the lab environment.

Overlapping these somewhat idealistic, esoteric, and expensive 16-bit research projects were the ten years that marked the “People’s 8-bit Revolution”.

It began with the advent of the Altair 8800 in 1974, followed by the IMSAI, the Apple 1, and all the subsequent affordable 8-bit machines, up until about 1984 when the 8-bit market spectacularly crashed.

That market was to be replaced by the 68000-based Macintosh, and then the Atari 1040ST and Commodore Amiga machines appeared, marking the end of the 8-bit era.

As for the IBM PCs: although introduced in 1981, they were notoriously expensive. I saw my first - a DEC Rainbow - in my final year at college.

It probably wasn’t until 1991 that clones became affordable enough to have one on every company desk.

Eventually, in 1995, I could afford a home PC - a second-hand refurbished laptop with an 80186; it cost me about £300.

Computers were expensive back then - and most home users tried to get at least 5 or 10 years life out of them. My 2010 Dell laptop is still going strong.

EDIT - I refer to the 68000 as a mover and shaker of the 16-bit GUI era, but as most people know it is a 16/32-bit design, with a 16-bit ALU. Later family members became fully 32-bit.

I had another thought… a Golden Age is usually the age somewhat before one’s own adulthood. Looking at what we see, and seeing the flaws, we tend to hark back to something we know a bit less about, about which we can paint a favourable picture.

1 Like

A personal thought: that time might be just before the point at which an entire computer design could no longer be more or less fully understood by a single person.

So the home computers of the 70s and early 80s might fall into this category. Note that while the later ones, like the BBC Micro, Spectrum, etc., had large (for the time) ULAs, their block functionality was still easy to understand, and you could comfortably have a working grasp of the CPU architecture, most instruction mnemonics, and the popular software (i.e. BASIC and assembler), as well as a good idea of what the custom silicon could do for you. The early Amigas and Atari STs might fall into this line of thought too.

The early Apple Macs may fall into this category too, but when software started to get more complex and hardware started to match, it became harder and harder for one person to carry it all.

Early IBM PCs? Possibly, but then plug-in cards and more, bigger, and more varied software quickly spiraled them out of the ‘ken’ of a single person (I reckon).

So if I were to put a date on it for me, it might be 1982. The BBC Micro was out, and S100 systems were still easy to understand. The PDP-11 was old and well understood (although software had outgrown it by then, and Unix was no longer a one- or two-man system either…), and I was 20 years old by the time 1982 was out.

My own modern retrocomputer is based on the 65816 (so a 6502++) and is programmed in BASIC, assembler, and BCPL (which I used extensively in the early-to-mid 80s).

Cheers,

-Gordon

1 Like

Note that the first value is “cost” (as the Alto was never commercially available) while the second is “price”. The cost of the 128KB Mac was a fraction of its price, which included the cost of the famous Super Bowl commercial and of building a new automated factory.

A few years later, an issue of Macworld magazine estimated the actual cost of a Mac SE, compared it with its price, and used Apple’s annual report for investors to figure out where the difference went.

Alan Kay talks about this - see for example How to Invent the Future. The aim with the machine that became the Alto was to build a machine with capabilities from the future. You can get 10-15 years ahead if you spend enough money. And then you can discover or invent the ways of using the machines that will in due course become ordinary. (Perhaps see also this discussion on HN.)

Regarding Engelbart, “The Mother of All Demos”, and NLS: This was serious retro hardware!

The computer was an SDS 940 (which originated from the Project GENIE timesharing system at Berkeley), featuring paged copy-on-write memory. In its more basic version, the I/O was implemented by TTY, but the visual I/O seen in the demo was another story. The “Special Devices Channel” projected the output onto multiple calligraphic CRTs, which were timeshared between users. The images produced by these CRTs were picked up by commercial video cameras, amplified, and then sent to user-facing raster-scan CRTs (basic monochrome TVs without a tuner) via a closed-circuit TV network. The system featured two display generators, each driving up to 8 CRTs. The controller logic for this was built from vacuum tubes and reportedly required “one and a half people to keep those things running all the time” (Engelbart). On the plus side, sessions could easily be (screen-)shared, as could external sources, like video streams.

Here’s a block diagram of the Special Devices Channel:

(Image, as well as technical info on NLS hardware, drawn from “Bootstrapping” by Thierry Bardini.)

It’s an interesting solution, incorporating both design choices and unorthodox use of technology dictated by budget constraints.

1 Like

This is a rather important, though mostly overlooked, application of Moore’s Law, which also works in reverse: if you throw enough money at it, you can buy yourself a window into the future (of what may become generally affordable technology).
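Kay’s “10-15 years ahead” can be put in rough numbers. A minimal sketch, assuming a Moore’s-Law-style doubling of capability per dollar every two years (the doubling period is my assumption, not a figure from this thread):

```python
import math

DOUBLING_PERIOD_YEARS = 2.0  # assumed doubling period of capability per dollar


def budget_multiplier(years_ahead: float) -> float:
    """How much more you must spend today to get the capability that
    will be ordinarily affordable `years_ahead` years from now."""
    return 2.0 ** (years_ahead / DOUBLING_PERIOD_YEARS)


def years_ahead(multiplier: float) -> float:
    """The window into the future that a given budget multiplier buys."""
    return DOUBLING_PERIOD_YEARS * math.log2(multiplier)


# Under this assumption, being 10 years ahead means spending about
# 32x what an ordinary machine costs; 15 years ahead, about 180x.
print(budget_multiplier(10))   # 32.0
print(round(years_ahead(32)))  # 10
```

The exponent is the point: each extra “year of future” multiplies the budget, which is why only a research lab with no business case, like PARC, could afford to live there.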

I disagree. Memory speed and density are two important factors that cannot change much per generation. New trends take advantage of the fact that cost will come down, not the speed or density. Fake pricing: 8080 $150 each, 8086 $200, 286 $250, 386 $350, 486 $450, 586 (BUG discount) $500.

ben.

The NLS demo used an Eidophor projector for the large-screen video presentation in the conference hall.

The Eidophor is a fascinating retro-technology in its own right, involving statically charged oil films as a primitive light valve and “windscreen wipers” to refresh the oil film.

2 Likes

From Wikipedia https://en.wikipedia.org/wiki/Motorola_68000

IBM considered the 68000 for the IBM PC but chose the Intel 8088 because the 68000 was not ready.

Walden C. Rhines wrote that thus “Motorola, with its superior technology, lost the single most important design contest of the last 50 years”.

I suspect that IBM management were not really interested in making a PC, and chose the cheapest, worst-performing 16-bit processor, hobbled with an 8-bit bus, to keep costs down.

I’m sure some of the management thought that if they only paid lip service to the PC project, it would die a quiet death.

Many words have been written on IBM’s choice of CPU for the PC! My own summary would be that Intel were an existing supplier, the 8-bit bus option was attractive, Intel would second-source, and the chip was ready. As the PC was a famously rapid and independent project, picking a CPU which wasn’t ready, or whose supplies might be disrupted, would have been a risk too far.

Edit: the idea that superior technology doesn’t always win the race is an important one which keeps recurring.

1 Like

Agreed - the 68000 did not really sample until February 1980, with volume not available until November 1980.

However, it is one of those crossroads in computing history - where the temptation is to ask “what if?”.

1 Like

While this is mostly true, you can still invest in hardening the support logic in order to use a promising technology that isn’t ready for production yet. E.g., the Alto used RAM chips when these were still notoriously unreliable, and introduced a quite extensive low-level error-correction layer in hardware to cope with this. Regarding density, mind that the Alto was the “interim Dynabook” – an excessive form factor may allow for some liberty.

I think the crucial observation about the interim Dynabook was to throw money at the problem: this was never to be a commercial proposition with a business case behind it. A research machine, not a product. And so the usual constraints of economics don’t apply.

The BBC Micro was in some small way similar: the design used bleeding-edge DRAM chips, available only from one supplier, and not cheap. The expectation was to make mostly the 16k model, not the 32k model, and to hit volumes of 10,000. When, in the event, the BBC Micro was much more popular than that, and the 32k model very much the only one worth making, the price and availability of those bleeding edge chips presumably relaxed a bit.

Which is to say, in the presence of continual exponential improvement, it pays to try to design for the situation you’ll have at the time of sale, not at the time of design. If only you can hold your nerve, and predict the future well enough.
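The “design for the situation at time of sale” idea above can be sketched numerically. Assuming, purely for illustration, that a component’s price halves on a fixed schedule (the halving period and prices here are hypothetical, not figures from the thread):

```python
def cost_at_launch(cost_at_design: float,
                   lead_time_years: float,
                   halving_period_years: float = 2.0) -> float:
    """Expected unit cost once the product ships, if the component's
    price keeps halving on schedule during the design lead time."""
    return cost_at_design * 0.5 ** (lead_time_years / halving_period_years)


# A part costing $100 at design time, with an 18-month lead time and a
# two-year price-halving period, should cost about $59 at launch:
print(round(cost_at_launch(100.0, 1.5), 2))  # 59.46
```

The gamble, of course, is that the decline continues on schedule and that your supplier survives long enough to deliver - which is exactly the nerve-holding the post describes.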

(The IBM PC project also dramatically underestimated the expected sales volumes: as a consequence, the product was priced higher than it would otherwise have been, and so ended up more profitable than it would have been.)

In the early days it was very much over-engineered - after all, it was designed and built by a mainframe manufacturer.

I remember the thickness of the steel in those early chassis, and the hard drive suspended on rubber bungee mounts.

My employer started buying them in about 1986, for about £3000 each, at a time when you could buy a house for £60,000.

Specifically, according to the 68000 design team, the 68008 was not ready, and IBM wanted the cost reduction of an 8-bit external data bus.

This interview with some members of the 68000 team covers that (as well as some other fascinating bits of history, including discussions of working with Sun and Apollo on their workstations, the relation (or lack thereof) between the 68000 and 6809, etc.):

1 Like

This would have made some sense, since the 68000 was big-endian, like most of the bigger IBM hardware. The 68000 may have been just ready for market introduction, but reportedly Motorola couldn’t provide the 5,000 pre-production samples required for IBM’s internal evaluation process. (At least, this is what I’ve read. I’m not so sure about that quite excessive number of samples; it may be off by an order of magnitude or two.)

Regarding management not being so sure about the PC: mind that IBM struggled throughout the 1970s with PC designs (even before the trinity of 1977). There were several concepts, like “Yellow Bird”, and a very promising prototype, “Aquarius”, based on bubble-memory modules, which even made it to the pre-production prototype stage (including a complete marketing concept), as well as out-sourced design studies (e.g., there’s one by Eliot Noyes Associates based on unknown hardware). After the dismissal of “Aquarius” (apparently owing to fading confidence in bubble memory), upper management apparently just gave up. At some point, just before Project “Chess”, which became the IBM PC, IBM even considered buying Atari and basing their PC on the Atari 800. (At least, there’s a design study for this.)

Some of this (including images) can be found in “Delete.” by Paul Atkinson.

IBM “Yellow Bird” mockup (Tom Hardy, 1976; image: “Delete.”):

Envisioning home computing with “Yellow Bird” in 1976 (image: “Delete.”)

IBM Aquarius (Tom Hardy, 1977; image: “Delete.”):

IBM “Atari PC” design study (Tom Hardy, 1979; image: “Delete.”):

Here’s a sketch for the Noyes Associates project:


Source and further information: “IBM’s Home Computer, 1977” by Dan Formosa.

“The prototypes created for IBM were a bit more interesting than their 1981 PC – this 1977 project was envisioned in three versions: beige, deep red, and teak. Teak? Yep – a real wood cabinet.”

It seems, contrary to common belief, IBM didn’t “miss out on the home computer”, they just tried too hard…

3 Likes

Just to note, CHM also published a transcript of this oral history.

1 Like