The 68000's progress - ten computers

Nice article here, first of two:

Note that the article interleaves the computers which used the actual 68000 with the subsequent introductions of its successors, starting with the 68010. With very nice die shots.

It hadn’t struck me until just now that Apollo was, among other things, the god of the Sun. Nonetheless, before the rise of Sun, we had the rise of Apollo.

Apollo would become the biggest seller of engineering workstations in the mid-1980s.

There are some great quotes throughout the article:

“Steve, here’s the deal. If you buy a million chips you will not pay more than 15 million dollars, today.”

Apple introduced the Macintosh 128k powered by an 8MHz 68000 but only after Steve Jobs had negotiated the price down from more than $100 to $15 per chip.


Interestingly, Hackaday had an article today about a $5 128K Mac built on a Raspberry Pi Pico.


I thought the Apple Lisa was a better machine, but we got the cheap-to-make knock-off. See Dr. Dobb’s, Jan. 1985: “Fatten Your Mac”, by Thomas Lafleur and Susan Raab, with step-by-step instructions to increase the RAM in a Macintosh to 512K.

I do wish the TRS-80 Model 16 was mentioned. A pretty early entrant (Feb. 1982), it was for a few years the best-selling Unix (well, Xenix, but OK) machine.


Definitely. From what I read about this, the drawback was that the Lisa was designed around the early 68000 emulator (and its limited clock speed), and that the timing was intrinsically connected to the display circuitry – and thus it was difficult to change anything in the basic architecture.

That’s interesting. The same issue affects many do-it-yourselfers today: whether to divide the pixel clock down to make the system clock. The hardware design is more complicated if they are independent. And the frequencies for different pre-HDMI video resolutions are all unrelated, so you can’t easily evolve either one if you tie them together.
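
A quick illustration of that coupling (the frequencies here are just examples, ordinary C, not from any particular machine): derive candidate CPU clocks from two classic dot clocks by small integer dividers.

```c
#include <stdio.h>

/* Illustrative numbers only: what CPU clocks fall out of two classic
   dot clocks when divided by small integers? */
int main(void) {
    double dot[] = {25.175e6, 40.0e6};   /* VGA 640x480, SVGA 800x600 */
    for (int i = 0; i < 2; i++) {
        printf("dot clock %.3f MHz:", dot[i] / 1e6);
        for (int div = 2; div <= 5; div++)
            printf("  /%d = %.4f MHz", div, dot[i] / div / 1e6);
        printf("\n");
    }
    return 0;
}
```

A divider that lands on a tidy CPU clock for one mode (40 MHz / 5 = 8 MHz) lands nowhere useful for the other (25.175 MHz has no small divisor anywhere close), so a CPU clock tied to the pixel clock gets dragged around whenever the video mode changes.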


In the '80s, it was pretty much a contest between the Intel PC and other computers that used Motorola. Intel won out. In the early days, this felt dumb, because of Intel’s segmented memory model on the 8086, whereas the 68000 series used a flat memory model. A lot of us were wondering why IBM went with this, rather than something that was easier to program.
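
For anyone who never wrestled with it, a minimal sketch of why real-mode 8086 addressing felt awkward next to the 68000’s flat addresses (the values here are arbitrary): the physical address is segment × 16 + offset, so countless segment:offset pairs alias the same byte, and no single pointer spans more than 64KB.

```c
#include <stdio.h>
#include <stdint.h>

/* 8086 real mode: 20-bit physical address = (segment << 4) + offset */
static uint32_t phys(uint16_t seg, uint16_t off) {
    return ((uint32_t)seg << 4) + off;
}

int main(void) {
    /* Two different segment:offset pairs hit the same physical byte. */
    printf("1234:0005 -> %05X\n", (unsigned)phys(0x1234, 0x0005));
    printf("1000:2345 -> %05X\n", (unsigned)phys(0x1000, 0x2345));
    return 0;
}
```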

Even when PCs had upgraded to the 386 and 486, which could use a flat memory model, I remember reading in the mid-90s that most DOS PC software was still written for the 8086, to maximize backward-compatibility, and that this had a performance penalty.

The CS lab at my school, in the late '80s/early '90s, had a bunch of Motorola-based workstations from HP, ranging from the 68010 to the 68030, including some HP-Apollo workstations. The Apollos we used booted into a desktop UI based on X and Motif. The color resolution on them was a sight to behold.

I got the chance to program in 68000 assembly on Unix for a computer architecture course, on some of these workstations, and it was nice. I’ve felt fortunate to have been trained on that, as opposed to trying to learn assembly on Intel, even though learning Intel assembly would’ve been more practical.

Even though I’ve since learned the Motorola architecture was not very efficient in terms of performance, the assembly instruction set felt very well designed and straightforward.

As an example, I listened to a video game programmer who’d had experience with both the MOS 6502 and the M68000 say that, in terms of instructions per second, the 68000 wasn’t much faster than the 6502: while the 68000’s clock was anywhere from 4-8x faster, it took something like 6-8 clock cycles to execute each instruction, whereas the 6502 took anywhere from 2-4 cycles per instruction.
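
Taking the midpoints of those quoted ranges, a back-of-envelope sketch (not a benchmark; real instruction mixes vary a lot):

```c
#include <stdio.h>

/* Back-of-envelope, using the midpoints of the cycle counts quoted above:
   68000 @ 8 MHz, ~7 cycles/instruction; 6502 @ 1.79 MHz, ~3 cycles. */
int main(void) {
    double m68k  = 8.00e6 / 7.0;
    double m6502 = 1.79e6 / 3.0;
    printf("68000: ~%.2f MIPS, 6502: ~%.2f MIPS, ratio ~%.1fx\n",
           m68k / 1e6, m6502 / 1e6, m68k / m6502);
    return 0;
}
```

That comes out to roughly 1.1 vs 0.6 MIPS: about 2x, despite a 4.5x clock advantage, which fits the “not much faster” impression.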

I got the chance to see this visually, comparing the screen update rate of GOS, a homebrew multitasking OS with a GUI written for the Atari 8-bit, which used the Atari’s high-res monochrome mode (320x192) on a 1.79 MHz 6502, as it redrew windows, against the rate of an 8 MHz Atari ST running the GEM interface in low-res (320x200, 16 colors) doing the same thing. They looked about the same.

When I asked myself why someone like Jay Miner would want to leave the 6502 so badly for the M68000, the answer seemed to be it wasn’t for better performance, but rather, it was easier to program, and could access larger memory sizes more easily. As for the performance issue, he seemed to be a fan of using custom-designed coprocessors, anyway.

Even more so, since the M68000 was big-endian, like IBM’s big machines, and resembled the System/360 & 370 in more than one way.
Allegedly, this was because Motorola was just at the initial stage of production when the decision was made, and couldn’t fulfill IBM’s minimum sample requirements to prove production capability. Consequently, the “Chess” team didn’t have much of an alternative to the i8086…

On the other hand, IBM had just designed the IBM System/23 Datamaster (released 1981), which was based on the 8-bit 8085, and had gathered some familiarity with both the Intel architecture and Microsoft BASIC because of this. So there may have been some genuine drive towards Intel originating from this.

Bill Gates seems to be confirming both versions at once, in an interview found in the March 1997 issue of PC Magazine:

For IBM it was extremely different because this was a project where they let a supplier — a partner, whatever you call us — shape the definition of the machine and provide fundamental elements of the machine. When they first came to us, their concept was to do an 8-bit computer. And the project was more notable because they were going to do it so quickly and use an outside company … The novel thing was: Could you work with outsiders, which in this case was mostly ourselves but also Intel, and do it quickly? And the key engineer on the project, Lou Eggebrecht, was fast-moving. Once we convinced IBM to go 16-bit (and we looked at 68000 which unfortunately wasn’t debugged at the time so decided to go 8086), he cranked out that motherboard in about 40 days.


Hmm. I’m a little curious when they were working on the IBM PC design, because (and maybe this article is inaccurate) Wikipedia is saying that the 68000 was released in 1979. The PC was released in 1981. Seems like there would’ve been enough time(?)

Gates is right that they were thinking about an 8-bit computer. Though, maybe they were thinking of using Intel for that, as well, since you mentioned the System/23.

A lot of us old Atarians didn’t hear about this until about 5 years ago, but it’s been confirmed that IBM briefly considered using what became the Atari 800, based on the 6502, as its “PC.” I guess this was in 1978 (since the 800 was released by Atari in '79). A small team at Atari was secretly working on it with IBM. Most of the people who had worked at Atari knew nothing about it until this revelation in 2019.

Incidentally, Atari wanted Microsoft Basic to be the standard language on their computers. They tried really hard, working with Microsoft to make that happen, but they couldn’t pull it off. The challenge was that they needed it to fit in an 8K ROM while also including commands that could access certain features of the machine. If you look at the story of how Commodore put MS-Basic in their early computers (the PET, the Vic-20, and the C-64), with the same constraints, you can see why it wasn’t going to work for Atari. Either they would’ve had to make the same compromises as Commodore did (mostly using a generic version of MS-Basic, with few Atari-specific features), or it wasn’t going to work at all.

Atari ended up selling their Atari-specific version of MS-Basic, which included the full language, and all of the Atari-specific features, as a separate product for the 800, coming on disk (later, in a cartridge-disk combo), which was largely ignored by the user base.

It can easily be that a product is “introduced” at one time, and not yet fully debugged until a later time. (I’m not even sure what “introduced” might exactly mean - “available in quantity” would be one thing, “available in engineering samples” would be another.) My own vague understanding is that IBM were not happy with a single-source chip - it was normal at that time to have second sources, hence the cross-licensing which often happened.


As @EdS already mentioned, introduction/release dates are often fuzzy. E.g., the product is announced and demonstrated, but there are still some months until production ramps up and the product becomes generally available and/or available in numbers.
I can’t recall the source for the story, but it was something that should be quite reliable. The gist of it was that it was (can’t remember exactly at the moment, again) either about 500 or 5000 test samples required before Motorola could even be considered as a possible source. (Probably what Gates was referring to as “debugging”.) So it may be possible that the chip was already in production, but Motorola wasn’t able to fulfill the request in a short period of time. (We may assume there would have been quite a demand from various parties in the early phase.)

Regarding Atari Basic: Yes, Atari had licensed MS BASIC, but failed to cut it down to fit into 8K of ROM. So, in 1978, they looked for an external contractor for finalizing the ROM and the bid was won by Shepardson Microsystems, who took inspiration from Data General’s Business Basic. (Which may explain some of the oddities as compared to standard MS BASIC dialects.)

And, regarding the “Atari PC”: There’s an actual design prototype by Tom Hardy (IBM, 1979), which can be seen in “Delete.” by Paul Atkinson:

(As corroborated by various sources, it seems that IBM initially really wanted to go with 8-bits.)


Drifting slightly off-topic here, but the desire to fit BASIC into 8K seems very strong - however, I’m somewhat glad Acorn didn’t see that as a restriction on the BBC Micro, although it was a few years later. Leaving “just” 32KB of RAM for application and video did seem limiting at the time, but it wasn’t really a big issue overall.

Imagine BBC Basic on an IBM PC in 1980 …

-Gordon

The scenario of IBM choosing to go with an 8-bit processor is probably one of the most interesting and amusing ones in all of alternate computer history. The cassette port on the 5150 and the odd choice to include a cassette-only BASIC ROM seem to lend this some credibility. This would have been an entirely different landscape: CP/M would have been an obvious choice for the OS, until Acorn’s US version of the BBC Micro took over, with networking and a GUI (which was apparently in development, at the West Coast)… Would there have ever been a need for ARM? :slight_smile: But this is really another thread.

But, for the purpose of this thread, there’s also the question of what this may have meant for 16-bit processors. Would their march into personal computing have been significantly delayed, as this domain remained a dedicated 8-bit market thanks to IBM’s blessing? What would have been their niche without the economy of scale they enjoyed in our timeline? (I guess it may have been 16-bit micros versus DEC?)

Note that the Apple Lisa, which was launched a year and a half after the IBM PC, was limited to a 5MHz 68000 because Motorola had not quite ramped up production yet. The Mac and all machines launched the following year or later were able to use 8MHz 68000s.

About speed: each 68000 memory access took at least four clock cycles, compared to 1 for the 6502 or 6800, but it did transfer 2 bytes instead of just 1. That gives an 8MHz 68000 the same memory bandwidth as a 4MHz 6502. Since Acorn had the latter as an expansion board for their BBC Micro, they were not impressed by the 68000.
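
Putting that in numbers (a minimal sketch of the arithmetic above):

```c
#include <stdio.h>

/* Peak memory bandwidth = clock / clocks-per-access * bytes-per-access */
int main(void) {
    double m68k  = 8.0e6 / 4.0 * 2.0;  /* 68000: 4 clocks, 16-bit bus */
    double m6502 = 4.0e6 / 1.0 * 1.0;  /* 6502:  1 clock,  8-bit bus  */
    printf("68000 @ 8 MHz: %.1f MB/s\n", m68k / 1e6);
    printf("6502  @ 4 MHz: %.1f MB/s\n", m6502 / 1e6);
    return 0;
}
```

Both come out to 4.0 MB/s of peak bus bandwidth.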


Re. the Basic from Shepardson

I definitely think it explains the differences from MS-Basic. I don’t know the features of DG Basic, though.

I read that Shepardson took out string variables entirely and instead cast strings as byte buffers, what at the time were called packed arrays, since they were one byte per element. I remember that if you wanted to have a string array, you had to simulate it arithmetically.

You could string statements after IF-THEN, and they would all be executed. MS-Basic had that, too, but I’ve read recently that it wasn’t a feature of the original Dartmouth Basic, where you could only branch to a line number (as in, 100 IF A=1 THEN 310).

The floating-point library was inspired by the routines in Cromemco Basic. They went with a Binary Coded Decimal (BCD) approach, and all numeric variables and arrays in Atari Basic were floating-point, so every numeric variable, constant, and array element was 6 bytes long. I think line numbers were different; my guess is they were 16-bit integers (though it only allowed you to go up to 15-bit quantities, probably because doing unsigned 16-bit arithmetic on the 6502 would’ve been challenging; maybe there wouldn’t have been enough program memory, anyway).
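
For the curious, here’s how I understand that 6-byte layout (hedging: this is a sketch of the commonly documented format, not taken from the ROM source): byte 0 carries a sign bit plus a base-100 exponent in excess-64, and bytes 1-5 carry ten packed BCD digits of mantissa.

```c
#include <stdio.h>
#include <stdint.h>
#include <math.h>

/* Sketch of decoding an Atari BASIC 6-byte BCD float (assumed layout):
   b[0] = sign bit | excess-64 base-100 exponent, b[1..5] = packed BCD. */
static double decode_atari_float(const uint8_t b[6]) {
    int sign = (b[0] & 0x80) ? -1 : 1;
    int exp  = (int)(b[0] & 0x7F) - 64;
    double mant = 0.0;
    for (int i = 1; i <= 5; i++)   /* each byte is one base-100 "digit" */
        mant = mant * 100.0 + ((b[i] >> 4) * 10 + (b[i] & 0x0F));
    /* the first base-100 digit sits before the decimal point */
    return sign * mant * pow(100.0, exp - 4);
}

int main(void) {
    const uint8_t one[6]  = {0x40, 0x01, 0, 0, 0, 0};  /* expect 1.0 */
    const uint8_t half[6] = {0x3F, 0x50, 0, 0, 0, 0};  /* expect 0.5 */
    printf("%g %g\n", decode_atari_float(one), decode_atari_float(half));
    return 0;
}
```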

Atari Basic wasn’t the most pleasant to work with, because of the lack of strings and string arrays, but ultimately, I think they made reasonable compromises that helped make the Atari’s features accessible. I liked the approach they took better than what Commodore did with the C-64. I remember being flabbergasted that Commodore’s Basic had no commands for accessing graphics or sound, since those were two of its most desirable features; that one had to poke registers to get to them. That didn’t diminish the 64’s capabilities in those areas. They were good, but I just thought gosh, they made them difficult to get to.

I think the best version of MS-Basic from those days was on the Apple II and the IBM PC. Though, I have yet to try Atari’s version.

M$ Basic had 2 versions, 6-digit or 9-digit floating point. To me, knowing who had which is more important than the other bells and whistles a BASIC might have. The CoCo II I had at the time had Disk BASIC, so I never needed to upgrade when I got some floppies.
I tried OS-9, but the CoCo was too underpowered for that.
For a short time I had a CoCo 3, but I never could get it more memory.
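
Not the Microsoft binary format itself, just a plain-C illustration of what that 6- vs 9-digit distinction means when the same value is printed:

```c
#include <stdio.h>

int main(void) {
    double third = 1.0 / 3.0;
    printf("%.6g\n", third);   /* 0.333333    (6 significant digits) */
    printf("%.9g\n", third);   /* 0.333333333 (9 significant digits) */
    return 0;
}
```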

The PC had one advantage over the other 8-bitters: good video (for the time) and dual floppies in one simple box. A computer and printer all fit on a desk.
I miss that.

To keep this vaguely 68K-related: Microsoft BASIC for Mac also came with a decimal (BCD float) version. I’m unsure how many digits of precision it used, but there might be a run-in-the-browser version on archive.org.