The Golden Age of Retrocomputers

It was sort of available at the end of 1980, in that you could get 4 MHz versions (like the Lisa used) but had to wait a while longer before the 8 MHz chips could be had.

IBM did launch a 68000-based computer about 10 months after the PC. But it was 2 to 3 times more expensive than the PC.

1 Like

That’s a wild machine @jecel! (A lab machine)

2 Likes

Regarding IBM’s journey to the PC: It’s quite interesting to see IBM shift from home computing to the office (as first noticeable in the Noyes Associates study and the “IBM Atari”) in the prototypes and design studies above. However, the “PC”, the IBM 5150, wasn’t the disruptive product it is often considered nowadays. At 4.77 MHz and with an 8-bit motherboard, it was hardly faster than the home machines of the time. (With built-in BASIC in ROM and a cassette port, it also wasn’t that dissimilar from the home computer concept, minus the RF output.) The idea was more about merging the various text processing systems and the smart terminal, while crucial software, like databases, would still live on the mainframe, tying the PC into the IBM ecosystem. From this perspective, the 5150 didn’t extend much over the previous IBM 5120, but it was more cost-effective and came in a smaller form factor. It wasn’t until the i386 that PCs really became what is considered a PC today, a machine capable of shifting business applications from a shared mainframe to a standalone machine. And IBM apparently wasn’t too happy with this. (From this perspective, the original PC was just a short episode before we got “real PCs” in the form of the i386 machines.)
It was really thanks to the business customers, who were longing for standardization in the jungle that was acquisition and were looking to IBM for it, that the PC became this “iconic breakthrough”. (And, in the end, it was probably gaming that won the day for the PC.)

The 68000 workstation linked by @jecel is a different beast. As a lab computer it is perfectly able to “stand on its own feet”. However, from a business perspective, there’s quite a difference between a lab machine (where standalone operation is a requirement and cost isn’t that crucial a factor) and business computing (where you want to tie everything into your ecosystem and scale is a factor – also, “there’s a card for that”, the 1980s’ equivalent of the App Store).

Edit: This is probably also where the Lisa and the Mac failed as “serious” business machines: as front-ends to mainframe applications, they didn’t have much to offer in terms of the GUI, apart from slightly lesser integration, since it was still the same text-based mainframe application. What remained, what they were really good at, was the production of individual documents of all sorts, but they still failed on the integration aspect. Which may have been just a bit too much of a personal computer. (Apparently, Jobs learned a lot from this; compare the NeXT, which was really huge on integration.)

A fascinating oral history, and insight into Motorola’s internal operations - involving several of the key members of the 68000 “delivery team”.

How, out of the downturn of 1975, the Motorola microprocessor group delivered a new architecture and brought it into volume production.

This involved radical internal changes of culture within Motorola and the building of at least one new MOS factory at the cost of some $800 million.

The engineers witnessed the change from learning about punched cards on mainframes at college to creating an environment where engineers had powerful workstations on every desk - workstations that primarily used Motorola 68K family devices.

The video interviewees cover approximately the two decades at Motorola between 1975 and 1995, of which 1980 to 1990 was the decade of the 68000.

The interesting twist to this oral history is that the chairman of this meeting was ex-Intel, and there was a lot of camaraderie between the once-rival organisations. They agreed amongst themselves that Intel and Motorola just did things differently: two completely different approaches to solving the same basic problem.

At almost 3 hours long, it’s an excellent bit of viewing for a spare evening.

3 Likes

Related to the original question, I think, a crowdfunding campaign for a documentary:
https://www.kickstarter.com/projects/messagenotunderstood/message-not-understood/
(About looking at the past to understand the present: many classic photos in the enclosed video. And the line “let’s make sure the computer doesn’t end up like the television” - which makes complete sense to me, but I can imagine it might be dated.)

via mastodon

“Message Not Understood” is by far the most common error message in Smalltalk.

4 Likes

There is a great video of a Lilith emulator here, and some photos.

On GitHub there’s an emulator for SIMH, running on the Interdata 32 emulator, but it needs UNIX V7 installed.

2 Likes

Indeed, the “Atari PC” is real. When we old Atarians first heard about it, we wondered if it was just some fantastic story to get attention. It only came to light in the retro Atari community a few years ago. Though, what I heard about it then was that Atari would have been an OEM manufacturer for IBM, not purchased by them.

A very small group inside Atari was involved in creating the prototype for IBM, and only they even knew about it. When former Atari employees were asked about this a few years ago, almost none of them knew about the proposal. In any case, it fell through. The proposal happened before Atari released the 400/800 in 1979.

Speaking of such stories, Tandy Trower, a former Atari employee, said in 2015 that he worked with Microsoft in about 1978 to get its 6502 Basic, with a specific feature set Atari wanted for their computers, working inside of 8K of ROM, but they couldn’t get it to fit. So, Atari got their Basic from Shepardson Microsystems. The language design they used was inspired by Data General’s Business Basic.

In any case, Atari was so impressed with Microsoft that they offered to buy the company. Bill Gates turned them down. Smart decision, lol! :smile:

Antic interview 77: Tandy Trower, Atari Product Manager

2 Likes

Indeed, there are only a few kinds of error messages in Smalltalk. I think this is an indicator of its flexibility.

I eventually learned that this message can be trapped by a method/handler: doesNotUnderstand: (often referred to as “DNU”).

I found a use for it in an experiment I tried, where I used a proxy class for contacting websites, using CGI. What was fascinating was that it was super easy to emulate CGI from within Smalltalk. The CGI form maps well onto Smalltalk’s messaging syntax. So, I contemplated that rather than using an API to contact websites, I could just treat websites like they were objects in the Smalltalk system.

The proxy class contained a few defined methods, but most of what I passed to it were messages it knew nothing about. So, they’d get dumped to the DNU handler, which would receive the message as an object. The handler could then unpack the message and do with it what it wanted.
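A minimal sketch of that idea in Squeak-style Smalltalk (not the original code; the class name WebProxy, the baseUrl variable, and the exact selector-to-query mapping are assumptions made for illustration):

```smalltalk
"Hypothetical proxy class: any keyword message WebProxy doesn't define
 falls into doesNotUnderstand:, where the selector and arguments get unpacked."
Object subclass: #WebProxy
	instanceVariableNames: 'baseUrl'
	classVariableNames: ''
	poolDictionaries: ''
	category: 'DNU-Example'

"Instance-side method on WebProxy, assuming keyword messages. Sending
   proxy lookupName: 'Smith' city: 'Boston'
 answers something like '<baseUrl>?lookupName=Smith&city=Boston'."
doesNotUnderstand: aMessage
	| pairs query |
	pairs := (aMessage selector asString findTokens: ':')
		with: aMessage arguments
		collect: [:key :arg | key, '=', arg asString].
	query := pairs inject: '' into: [:acc :pair |
		acc isEmpty ifTrue: [pair] ifFalse: [acc, '&', pair]].
	^ baseUrl, '?', query	"a real proxy would hand this URL to an HTTP client"
```

The nice part is that the proxy never has to declare the CGI parameters anywhere: whatever keyword message you send becomes the query string, which is what makes the website feel like just another object in the Smalltalk system.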

DNU reflects how some of the earlier versions of Smalltalk worked normally: the way they implemented methods was that each object parsed an input stream. With Smalltalk-76 and later, they decided on a keyword syntax for message passing and made DNU a special case for custom message parsing.

1 Like

Don’t skip the original KiddiComp/Dynabook concept! The Alto was meant to be an initial step toward a mobile slate device, but the idea of an office PC and the new direction of Smalltalk pulled everyone’s attention away from that goal.
Dynabook

1 Like

True, the technology needed to create a Dynabook was available by the late '70s, and Alan Kay tried pitching the idea to Xerox, but they weren’t interested in developing it.

The closest he got at Xerox was a project he started on in the mid-70s, called NoteTaker.

It ran Smalltalk-78 on 3 Intel processors, IIRC, which Alan called “barely sufficient.” An exciting thing they tried was using the NoteTaker on a commercial flight, and it worked out. I think they marked the occasion by saying it was the first object-oriented system used at 30,000 ft.

If the NoteTaker looks an awful lot like an Osborne 1, that’s because the Osborne’s case design was inspired by the NoteTaker. Nothing else about it was, though.

There’s a brief moment in The Xerox Alto: A Personal Retrospective (from 2001) where Chuck Thacker said that what Alan Kay was after with the Dynabook was “this,” holding up a tablet. He mentioned that he had a version of Squeak (Smalltalk) on it.

It was a big improvement on the Compaq (three 8086s, as you said, instead of one 8088) and was running a GUI. Instead of releasing this in 1979, Xerox did the Star and a desktop CP/M machine years later.

About Squeak on a tablet: though it was ported to iOS very early on, Apple has never allowed it to be available (you can use it to write an app and release that, as long as you prove there is no way for the users to get to the underlying Smalltalk).

2 Likes

Bob Belleville at PARC proposed a small-scale version of the Star, using Intel or Motorola processors, but at the time he proposed it, it would’ve been more expensive than the Star. That changed a couple of years later, when it became cheaper. It also would’ve required rewriting everything they were doing for the Star and removing some features. So, they didn’t go forward with it.

BTW, I think, since Chuck Thacker was working for Microsoft Research, the tablet he held up was a Tablet PC. I never used one of those (I worked with Telxons in the 1990s, sometimes running Windows), but I imagine it wasn’t locked down. So, running Squeak on it without restrictions on code would’ve worked when he talked about that.

Re. Apple and Squeak

I used to track this. Last I heard (many years ago), they allowed Squeak on the iPad, but the only way to share code was by uploading projects to a website, so that others could run the code through a web-emulated version of Squeak (i.e. they were not allowed to download code to the iPad).

What I remember is Apple worried that Squeak could be used as a vector for security exploits, since you can access OS functions and the file system from within Squeak.

Like so many commercial Linux distributions, I imagine apps run as root on iOS, so all apps have to be treated like “potential criminals.” As Alan Kay has said for years, the problem should be solved by designing a secure OS architecture, not by looking at programming with such suspicion. It’s a good point.

I’ve seen this with PCs in schools. School districts lock them down and make it difficult for programming students to download new languages to them (because the language isn’t “certified” by the district). When I was in school, the 8-bit computers weren’t like this at all. We could run what we wanted on them, without fear of screwing up the system. A good part of that was that the OS was in ROM, so it could not be altered by software. Another was that we didn’t have hard disks on the school computers, and schools could block writes to their floppies by simply covering the “write notch” with tape that was difficult to remove (though this could be circumvented with scissors, I never saw that happen). Administration was a lot simpler with this setup, and the risk was a lot lower.

I’m not saying go back to that exact setup. I use it as an illustration that it is possible to secure systems without interfering with the ability of users to do programming, if the powers that be want it. Though, it is a matter of how the system is designed, and currently available system designs may not support current needs. So, new ones would have to be researched.

1 Like

Thacker was the creator of the Tablet PC at Microsoft, so you are right that he would show that, and there was no problem running Squeak on it.

Very early on, Kay’s group did a browser plugin version of Squeak that could be used securely without having to be explicitly installed on school machines. It couldn’t touch files except for those in one special directory. Unfortunately, they found that schools wouldn’t allow plugins in general to be downloaded, only Flash and Java. So this effort was wasted and they still couldn’t reach the students.

If a school has a lab with a bunch of Raspberry Pi machines and each student has their own SD card(s), you have roughly the same functionality as in the floppy-only days (and the Alto with removable pizza disks before that). The students can only harm themselves by what they do locally or over the network, and the next student to use the machine gets a clean slate.

2 Likes

I think this is one of the big advantages of the Pi, and perhaps a slightly unexpected one. There is no state on the machine other than the SD card, and those cards are cheap and can be one per project per person, and can readily be re-imaged. (Indeed, one learns the not-quite-obvious lesson that what matters is not which computer one uses, but what data it can access.)

As for the other point, I think it’s essentially one of sandboxing, or security, and the simplest safe approach is to forbid whatever cannot be understood, or modelled, or policed. Hence Apple’s rules, and school rules. The one-sdcard rule is simple and hopefully simple to explain too.

1 Like

Micro SD cards, though, are easily lost and very fragile. I have a drawer full of cracked ones that the magic smoke leaked out of.

(I do perhaps see more Raspberry Pis than most, though. I’m a former Approved Reseller and now build products that include Raspberry Pis)

Yes indeed, they are very small. Also, I have fewer than a dozen in use but struggle a bit to know which is which.

I could try to see this through the retrocomputing lens, and say that floppies too (and cassette tapes) will not always read back correctly - and we could see it through a teaching lens too, about the importance of backups!

I imagine that these Micro SD cards are sufficiently reliable for everyone to continue to use them, even if at scale we see many failures.

Agree. We had to be careful with our floppies. Don’t put them on anything magnetic (so, not on speakers, not on TVs (back when they used tubes)). Don’t touch them through the window. You had to know how to hold them, at least before we got to 3-1/2" disks, which had more secure cases, and so were more comfortable to handle.

A lesson I learned when I was a teen was don’t leave floppies in the car on a hot day. The case for a 5-1/4" floppy I had in the car got warped, and the disk became unreadable. My mom came up with a way to fix this by sandwiching the disk between some heavy books for around a month. That managed to straighten it out enough that it became readable again.

I think there would’ve been a way to take the disk out of its case and put it in a fresh one, but the only way I could’ve done that was to buy a new disk (each costing about $5), sacrifice the disk in it so I could use its case, and hopefully do a good enough tape job that it would fit in a drive.

1 Like

I remember when Alan was complaining about the design of the web browser, saying it wasn’t even a flat tire but a broken wheel; he said it should’ve been designed like an operating system. I thought that was the right idea, but I didn’t quite understand what he was getting at, until I thought about the contrast between using JS in a web browser vs. using a plug-in. I realized that plug-ins were potential exploit vectors, because they were just dynamically loaded libraries, which had access to the underlying OS. So, even if one plug-in developer had done their due diligence to make their plug-in secure, you couldn’t count on the next plug-in developer to do the same. So, you had uneven security in the browser environment (which explains why schools wouldn’t allow plug-ins).

Whereas JS is more securable, because it’s a standard part of the browser, and its vulnerabilities have been getting sewn up in the browser design. In one sense, that’s what he was talking about: create a more generalized, standard security infrastructure, so you don’t have to just use HTML/CSS/JS to get this protection and trust (though, now, you can get it using WASM, as well). Also, I’m sure, he would’ve liked to have seen better communication interfaces between internet objects in that environment than just trying to get plug-ins and JS entities to talk to each other.

Also, JS has its design problems that make it a headache for developers, but this was because Brendan Eich was forced to come up with it in about a week!

There’s an interesting story behind JS. I thought, once I learned Lisp, that JS should’ve been designed as a Lisp dialect, because its semantics would fit well with the design aesthetic of HTML tags, etc. Well, it turns out that’s what Eich originally wanted to do. He was brought in to implement a Scheme interpreter in the Netscape browser, but Netscape changed its mind once they saw they could do a deal with Sun to bring in Java applets. Netscape and Sun didn’t want the scripting language to contrast too much with Java’s language design. So, Eich redesigned it to have a Java-like aesthetic, while keeping some of Scheme’s semantics, also adding prototypes, which he said were inspired by Self.

3 Likes

The original web browsers handled HTML exclusively. If you clicked on a link that pointed to any other kind of file, the browser would ask the operating system to start up the corresponding application and forward the file to it. If that crashed your machine, it was the OS’s and/or the application’s fault.

Mosaic changed that by handling some graphic files itself so it could show them inline with the nicely formatted text. Obviously you wouldn’t want to recompile the browser for every new kind of file you wanted to show, so it followed a path similar to the evolution of MS-DOS and other operating systems (the creation of loadable drivers).

The lack of awareness that in practice they were creating a whole operating system made the evolution of web browsers more awkward and limited than it should have been. Something like Greenspun’s tenth rule of programming.

About the Self inspiration, it is a pity that two closely related but different models ended up both being called “prototypes” - Self’s and Henry Lieberman’s. It might have been better if Self had used the term “exemplars” instead. Both NewtonScript and Io claim inspiration from Self but actually follow Lieberman. JavaScript does seem more Selfish to me.

3 Likes