A bit of progress on our Beeb816 project

First, I really do not want to derail this nice thread, which is certainly not about Apple, but I think there are more general aspects to this as well, namely aspects of thermal design.

So there is this narrative about Jobs maliciously crippling the IIGS to protect his own baby, the Mac. However, we may also consider the situation Apple was in at the time: Apple had just gone from being one of the fastest-growing companies to the brink of bankruptcy, thanks to the thermal disaster that was the Apple III with its warping board and the subsequent recalls. At that time, Apple had a quite extensive line-up: various flavors of the Apple ][, the high-end business package that was the Lisa (which was actually performing better than the marketing forecasts), the Mac, the ill-fated Apple III (intended as Apple’s future in the education market and the low-end business segment), and the new machine (at that time reported as a possible, somewhat cubic-shaped Apple IV) in the making. With the financial strain put on the company by the Apple III disaster, Apple had to radically reduce the line-up and streamline it somewhat. (Mind that we’re not only speaking about cost of production, but about marketing as well.)
Keeping the aging Apple ][, which was still making for the better part of Apple’s revenues, was something of a no-brainer, especially when the company was in need of cash flow. The Lisa was to go (with as little ceremony as possible; compare the landfill dump), the Mac was to stay, to revolutionize the computer market and now to cover what remained of the business market as well, the Apple III would see some minimal commitment (I guess, for brand credibility) but was otherwise to be forgotten as soon as possible, and the new machine seems to have been delayed. In this light, it may also have been about not risking another machine running too hot, which may well have been the end of the company. Also, there wasn’t really much room for the IIGS in this new, restructured line-up, besides eventually transitioning the Apple ][ community to the Mac. Playing it safe may not have been an entirely weird idea.

Much of this also applies to existing 8-bit platforms. E.g., the C64 was already running hot at 1MHz. Transitioning any of this into the teens of MHz wouldn’t have left much resemblance to the existing architecture, while other, 68000-based high-speed platforms were already in the making. Which may have also been the major strategic flaw of the 65816: existing systems weren’t to be upgraded into anything with much of a future (if it was possible at all, considering what else was on those motherboards), while, in the short run, these machines could still pose a major threat to any new platforms that were maturing in development or were already out on the market.

It really should have been the Mac that was dropped in favor of the Apple IIGS, which incidentally was designed in a way that made aftermarket cooling-fan mods easy (and popular). The Mac would have sunk the company if Apple ][ sales hadn’t remained profitable enough to keep Apple going. The Mac was a horribly expensive mistake, and everything the Mac could do the IIGS could have done much better.

Now, Commodore also had a hot-head causing a lot of overheating problems, but they ejected him, so it became possible for their computers to have cooling fans. It really wasn’t rocket science. Engineers knew how cooling fans solved cooling problems.

Note that the dominant 68000 systems stuck with 8MHz or slower for their most popular models into the 1990s. A 65816 running at 4MHz would have been just fine during that time. Had the 65816 been running in any popular hardware, it would have been straightforward to upgrade with higher-speed accelerator boards. The Amiga 2000 shows a possible paradigm: include a CPU accelerator slot that allows expansion with a faster CPU along with its own faster RAM. That allows combining cheaper, slower expansion RAM on the main bus with faster, more expensive RAM on the accelerator card.

I had an Amiga 2000 with a 28MHz 68030 accelerator card. It worked well, although I did wish that the 68000 weren’t limited to just sitting there unused.

At the risk of bringing this back on topic, this accelerator card approach was, IMHO, superior to the second processor approach. It might have been slightly less efficient, with the original CPU going to waste, but it meant that almost all software would work without modification.

Also sorta back on topic, the Amiga 500 also had numerous accelerator boards that plugged into the 68000 socket. So, for the most part, an Amiga 500 could be expanded into a less expensive accelerated system than an Amiga 2000, despite not being designed for it.

Regardless, all of the 68000 systems ultimately failed. The Macintosh finally limped into some semblance of profitability in the 1990s, only to get dropped in favor of PowerPC. The Apple IIGS could have been generating healthy profits all that time, only to get dropped in favor of PowerPC.

From a company/business perspective, I’m not so sure. Mind that these would probably have been Apple ][ customers and/or customers that would have (eventually) bought the Mac anyway. In terms of an additional market segment that Apple couldn’t reach otherwise, it may not have been that significant. Why go through the hassle of developing and marketing multiple lines, when you could do about as much with a single line at a fraction of the expense? What could the IIGS do for Apple in terms of growth, beyond holding a position they already had with the Apple ][? (Meaning, it was addressing the same market segment and was still very much a hobbyist/home/education machine. While that segment would have loved it, it didn’t lend itself that much to business deployment to compete with IBM’s PC & clones. – Mind that this was already the 386 era, with former mainframe applications transitioning to the PC. – Had Acorn successfully entered the US market, the IIGS could have been Apple’s answer to the BBC machines, if a bit late. But, as it was…)

It’s true that the early 68000 machines were not as impressive as they could have been (the Mac had an 8-bit board and the Lisa was built around the pre-production emulator of the chip, which couldn’t go beyond 5 MHz). However, I guess, analysis may have shown that there was more potential in this platform, esp. given the rather obvious effects of Moore’s law. While the 65816 was rooted in an 8-bit past, the 68K was basically a mainframe architecture on a chip, with all the prospects that came with that.
While an alternate history could have worked out quite differently, I can somewhat understand what happened in this one. :slight_smile:

That said, it’s great to see (with this project as an example) what can be done with the 65816 at its full potential!

The Apple IIGS could have done EVERYTHING the Macintosh did, but better and at a lower price point, and it would also have had a huge software library, and people’s investment in Apple ][ peripherals would have made it an even less expensive computer (remember that at the time, floppy drives were somewhat expensive, and could even cost more than the computer).

Of course, the 1984 ad campaign for the Macintosh should not have gone to waste, and the humiliation of dropping the Macintosh after just a couple years should have been avoided. Instead, the Apple IIGS could have been named the Macintosh GS, so Apple could pretend it was just upgrading the line rather than admit the truth that it was abandoning an inferior product.

And given all the amazing things the “Macintosh GS” could do that the original Mac couldn’t, the fact that it wasn’t object file compatible could have been spun as simply the cost of all the added features. It had higher resolution and color graphics (including a rather clever way to make high resolution mode very colorful - the palettes of even and odd x coordinates were different). It offered backwards compatibility with existing software and hardware, along with a vastly more compelling software library. It was also simply more powerful in raw terms. It was more expandable. It had an INCREDIBLE sound chip. It was a better product all around.

As for being rooted in an 8-bit past … I don’t know if you noticed, but the computing world got taken over by IBM PC clones, which were rooted in an 8-bit past. Backwards compatibility was a huge factor - and one which Intel would learn the hard way when trying to jump to 64 bit without the backwards compatibility that AMD would offer.

Mind that in 1986, the Mac had already been on the market for two years and had eventually found its niche thanks to square pixels and what was to become desktop publishing. The PC world, on the other hand, had entered new territory with the i386, way beyond what IBM had envisioned for it. (As I see it, the 386s weren’t really PCs at all, but a new thing, and the PC in its original concept was a rather short-lived phenomenon rather than being disruptive. The disruption really came with the i386, and this was now a new game. And, in the common opinion of the time, a business computer had neither color nor sound… Even the Mac, which notoriously lacked color but did have high-resolution sound, was considered more of a toy than a “real” computer suitable for business.)

However, a variant of the story that is closer to the topic may also be interesting: had the BBC Micro been introduced successfully in the US, it could have easily stood its ground against the IBM 5150 and even surpassed it in most categories but memory. Could the 65816 have helped it keep up with the new generation of 32-bit competitors? Are these boards, as shown here, “i386 killers”?

I think we need to note that there are several realities, or viewpoints: the technical one, the marketing one, and the managerial one. Acorn and Commodore both struggled, in different ways, with their management weaknesses. Apple too, I think, but they did ultimately survive.

As for technology, the 65816 in some ways bears comparison with the 68008 - it runs on an 8 bit memory bus. That’s good for economy - hence Sinclair’s QL - but not so good for performance. As noted, Acorn went straight from 8 bit busses to 32 bit busses, in search of performance. Whereas Commodore (Amiga) and Atari went to 16 bit busses - somewhat faster and not quite so expensive. Also, using commodity CPUs. And from a marketing perspective, “16 bit” might have been a useful label, for all that it’s actually quite a subtle and complex question. Hence “ST” I suppose: sixteen/thirty two.

Different people, once they’ve programmed the '816, come away with different opinions. Some find it acceptable, some find it awkward. Acorn’s Communicator shows that it’s possible to make an OS for a 512k machine without too much obvious 64k bank granularity. I don’t know about the GS in this respect. But I think that’s key: the 816’s banks and the x86’s segments are ideally made as invisible as possible. The 68k, of course, offers a simple flat memory model, which I found rather attractive.
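To make the “invisible banks” point concrete, here’s a minimal sketch in C of the software discipline involved (the type and helper names are invented for illustration, not taken from the Communicator or the GS): keep every pointer as a flat 24-bit value, and derive bank and offset only at the single point of use.

```c
#include <stdint.h>
#include <stdio.h>

typedef uint32_t flat24;   /* a flat 24-bit address, $000000..$FFFFFF */

static inline uint8_t  bank_of(flat24 a)   { return (uint8_t)(a >> 16); }
static inline uint16_t offset_of(flat24 a) { return (uint16_t)(a & 0xFFFFu); }

int main(void) {
    /* Walk a pointer across a bank boundary: plain addition on the flat
       value rolls the bank byte over implicitly, so callers never deal
       with the 64K granularity themselves. */
    flat24 p = 0x01FFFE;   /* two bytes below the top of bank $01 */
    for (int i = 0; i < 4; i++, p++)
        printf("flat $%06X -> bank $%02X offset $%04X\n",
               (unsigned)p, bank_of(p), offset_of(p));
    return 0;
}
```

Keeping that translation in one place matters on the real chip anyway: the program counter, for instance, wraps within its bank rather than carrying into the next one.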

If Acorn hadn’t tackled the US at all, they wouldn’t have run out of money and been acquired by Olivetti, perhaps. But if they had conquered the US education market, that would have been a huge win. Could they have taken any market share in the business world? Doubtful, I think, although the ABC machines were rather impressive, including the ACW as a scientific workstation. Again, we see that the second processor was Acorn’s preferred way to move forward.

Regarding the BBC Micro in the US market: I think networking may have been a huge selling point. Also, that it was readily available from BASIC. Since software was still much of a custom affair, this would have been quite an advantage. You could have made software of the next generation out of the box… The BBC label could have lent some official credibility to the machine, as well.
Reportedly, they had been developing a GUI for the US, so that may have been interesting as well. (Maybe this was just GEM?)

I’ll admit I am sorely unfamiliar with the BBC Micro, so I really don’t know how much of a chance it would have had in the USA if it hadn’t been for the various issues that delayed it.

What I can say is that here in the USA, PBS aired The Computer Programme, and this made me (and presumably a lot of others) want the computer showcased in it (the BBC Micro). So I think this would have generated some interest. However, even by this time the Commodore 64 had a dominating lead and Jack’s price war was decimating the competition all over the place. I don’t think the relatively expensive BBC Micro would have stood a chance anyway. The Computer Programme might have gotten some feet walking into stores, but then they’d see the Commodore 64 and buy that.


Having been working on the '816 over the past year or two, despite 40+ years of 6502 experience, I’ve not found it comfortable to work with, and those 64K banks have limitations such that, had it been me “back in the day” (that day being 1984), I’d have jumped ship to another CPU without a second glance. My own way to remove that bank granularity was to write a VM for it, to make the banked addressing transparent to the higher-level programs I am interested in. The crux of that is now the 8-bit data bus… Being able to halve the cycles needed to move data between registers and RAM would be a big win for me.
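For readers who haven’t seen the approach: here’s a toy sketch in C of the general idea (not Gordon’s actual VM; the opcodes and addresses are invented). The interpreter accepts flat 24-bit addresses in its load/store ops, so programs running on the VM never see a bank boundary.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical opcodes, invented for the sketch */
enum { OP_LOAD, OP_ADD, OP_STORE, OP_HALT };

#define SPACE (1u << 24)          /* the '816's full 16MB address space */
static uint8_t *mem;

/* Only these two helpers see byte-at-a-time access. On the real chip
   each 16-bit word costs two 8-bit bus cycles - exactly the cost a
   16-bit data bus would halve. */
static uint16_t rd16(uint32_t a) {
    return (uint16_t)(mem[a & (SPACE - 1)] | (mem[(a + 1) & (SPACE - 1)] << 8));
}
static void wr16(uint32_t a, uint16_t v) {
    mem[a & (SPACE - 1)]       = (uint8_t)v;
    mem[(a + 1) & (SPACE - 1)] = (uint8_t)(v >> 8);
}

int main(void) {
    mem = calloc(SPACE, 1);

    /* acc = [$01FFFF]; acc += [$020002]; [$030000] = acc
       - the first load straddles the bank $01/$02 boundary,
         and the VM program neither knows nor cares. */
    struct { int op; uint32_t addr; } prog[] = {
        { OP_LOAD, 0x01FFFF }, { OP_ADD, 0x020002 },
        { OP_STORE, 0x030000 }, { OP_HALT, 0 },
    };
    wr16(0x01FFFF, 1234);
    wr16(0x020002, 4321);

    uint16_t acc = 0;
    for (int pc = 0; prog[pc].op != OP_HALT; pc++) {
        switch (prog[pc].op) {
        case OP_LOAD:  acc = rd16(prog[pc].addr);                   break;
        case OP_ADD:   acc = (uint16_t)(acc + rd16(prog[pc].addr)); break;
        case OP_STORE: wr16(prog[pc].addr, acc);                    break;
        }
    }
    printf("[$030000] = %u\n", rd16(0x030000));   /* prints 5555 */
    free(mem);
    return 0;
}
```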

However, both the Communicator and the //gs have shown what is possible with good software (and presumably a team writing it, rather than just me as a hobby).

The //gs seems to have required a degree of software compatibility with the //e, and the Communicator did have a good base of software, or at least software ideas and specifications, which may have made their task easier.

I’m not sure I’ll be doing another '816 project though - like others in the past, there are other CPUs tempting me. My advantage is that I’ve managed to make the leap to a relatively machine-independent high-level language - a bit like Unix did even earlier, so maybe there is something in that too…

Cheers,

-Gordon


The 65816 was more efficient and higher-performance than the 68008, though. Despite the seemingly low 2.8MHz clock speed, the Apple IIGS actually had performance reasonably competitive with the 7-8MHz Amiga and Atari ST competition. You can see this in various high-profile software ports where the IIGS versions run smoothly. The IIGS’s incredible sound chip even allowed for enhanced versions, such as Another World/Out of this World. I remember watching a longplay on YouTube, and when the enhanced music kicked in I was shocked.

I don’t think anyone in the computer market cared about “bit wars” at the time. What mattered was that the Apple IIGS looked just as impressive in stores as the ST and Amiga … oh, if you saw one at all. But the sad fact is that Apple didn’t push the IIGS because they didn’t want to harm the struggling Mac, while both Commodore and Atari were pushing their 68000 machines as hard as they could.

Still, Apple was really lucky that retailers HATED Jack, having been burned by him too many times in the past. The Macintosh looked really bad next to the Atari ST with its monochrome monitor. The Mac had a small screen that was only 512 pixels wide. This prevented the Mac from making much headway in word processing and desktop publishing - already at the time a good 80-column display was seen as essential. The ST, with its 640x400 resolution, could do both a good 80-column display and Mac-style WYSIWYG. The Mac couldn’t, and it cost three times as much as the ST (yes, including the monitor). The Apple IIGS could do a good 80-column display, and in retrospect should have included higher vertical resolution like the ST or maybe the Amiga. (The Amiga had interlaced modes, which are flickery, but the existing Apple Monitor /// had long-persistence phosphor that eliminated interlace flicker.)

Oh, another noteworthy competition between the 65816 and the 68000 would be the SNES vs the Mega Drive/Genesis. Okay, the main CPU wasn’t as important on those, but nevertheless the SNES basically won the performance contest, despite a nominally slower CPU clock speed.


Something possibly worth noting in the context of Acorn and the Beeb: Acorn, or perhaps specifically Sophie, were very keen on using interrupts in a certain way, and demanded fast interrupt response. The 68k scores rather badly on that metric, at least for the worst case. The 816 is presumably about as good as the 6502, although the ISR will need to store (and restore) whichever registers it needs to use, which will take a few cycles. The ARM, perhaps unsurprisingly, goes so far as to have dedicated registers for the ISR.
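For a sense of scale, here’s a rough back-of-envelope comparison; the cycle counts are as I remember them from the data sheets, so treat them as approximate and correct me if I’m off:

```c
#include <stdio.h>

/* Rough worst-case IRQ latency, modelled as:
   latency ~ longest uninterruptible instruction + interrupt entry.
   All cycle counts approximate, from memory of the data sheets. */
int main(void) {
    int c6502 = 7 + 7;      /* longest 6502 instruction ~7, IRQ sequence 7 */
    int c68k  = 158 + 44;   /* 68000 DIVS worst case ~158, exception entry ~44 */
    printf("6502  @ 2MHz: %3d cycles ~ %5.2f us\n", c6502, c6502 / 2.0);
    printf("68000 @ 8MHz: %3d cycles ~ %5.2f us\n", c68k,  c68k  / 8.0);
    return 0;
}
```

If those figures are about right, even with four times the clock rate the 68000’s worst case is several times longer in wall-clock terms, which is the metric Acorn cared about.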

It could be that the SNES makes good use of interrupt response. AFAICT the SNES doesn’t make use of the 6502 mode.


My experience with the 65816 is all from data sheets - I’ve never spent any real time programming it, so please correct me if I am mistaken. However, isn’t this more a package limitation than an architectural limitation? The '816 instruction set has a wide variety of two-byte memory instructions; given a 16-bit data bus, it could do these in a single fetch for aligned accesses, could it not? That wouldn’t be dissimilar to the 8088 → 80286 upgrade path in the PC world. It already had multiplexed pins in the 40-pin package; perhaps it could have been taken to 16 bits even there. (I’m not intimately familiar with its multiplexing, though.)

I think it would be quite difficult to give the '816 a 16-bit bus - it would be a rather different implementation on the inside. I think the '816 was possible, by WDC’s very limited team, in large part because it works rather like a 6502, but more so.

Every now and again on 6502.org someone wonders about this sort of thing, but no-one has yet done it, even though there are some 20 known reimplementations. It is tempting, even for the 6502 instruction set, but I think it’s intrinsically rather difficult. It needs a decoupled load/store unit, I think, or at least a decoupled memory interface. Those sorts of things are typically seen in later, more complex CPUs.


I agree, and I think the typical lumping of “x86” oversimplifies that history; certainly the 80386 as a PC platform got a huge leg up through backward compatibility, but it’s difficult to overstate the sea change in the personal computer market of a true 32-bit architecture with a built-in paging MMU. 386 protected mode isn’t even in the same ballpark as the 8088/86/186/286 that came before. Like AMD64, it’s almost a new architecture with an incidental backward compatibility mode.

This is something that came pretty late to competing architectures; the 68k didn’t have a built-in MMU until the 68030 a couple of years later, and the external MMU units for the 020 and earlier all had significant compromises. I think the relative simplicity and garbage-ness (to use a technical term that @EdS may object to) of MS-DOS also played a role here; it was “easy” to start using that MMU, even for individual DOS applications, on the 386, and to switch in and out of protected-versus-not modes to access BIOS and DOS services. Video games and application software alike did it, and boom, they were freed from overlays and segments and all of the cruddiness of a 16 bit architecture that somehow in the meantime sprouted double-digit megabytes of RAM.

I’ll admit I really don’t know whether interrupt response was significant on SNES or Mega Drive. But generally most software on those consoles relied more upon the graphics chips to pull their weight than the CPU. The CPU would only really get a workout on certain games, like some computer game ports. Another World/Out of this World was a particularly interesting example of CPU heavy software rendering, if you look into the details of how its console ports worked.


About MMUs and similar, aren’t we seeing here a sort of recapitulation of earlier CPU developments, as transistor budgets (and budgets) got bigger? The ARM1 lacked a cache because there was no time and no budget: the ARM3 added it (IIRC). The 6800 has no multiplier, but the 6809 adds it. It’s not so much about what can be done, as what’s economic at the time.


Yeah, the dragon that I see lurking there is that programmers would feel free to treat it as a “better 6502”, which means rampant unaligned accesses and the like, entailing a memory interface for a 16-bit bus that can handle decomposing and reassembling 16-bit accesses across alignment boundaries, etc. However, several other architectures of the time handled that relatively gracefully, with only the unavoidable attendant performance penalties; granted, they did it with more transistors than the '816!
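Here’s a minimal C model of what that decompose-and-reassemble logic amounts to, assuming a hypothetical 16-bit external bus (this reflects no real '816 implementation): aligned 16-bit reads take one bus cycle, unaligned ones take two plus some byte shuffling.

```c
#include <stdint.h>
#include <stdio.h>

/* A 64KB bank as 32K 16-bit words; word w holds bytes 2w (low half)
   and 2w+1 (high half), matching the 6502 family's little-endianness. */
static uint16_t bus[1 << 15];
static unsigned bus_cycles;          /* bus transactions consumed */

static uint16_t bus_read_word(uint32_t word_addr) {
    bus_cycles++;
    return bus[word_addr & 0x7FFF];
}

static uint16_t read16(uint32_t byte_addr) {
    if ((byte_addr & 1) == 0)                    /* aligned: one cycle */
        return bus_read_word(byte_addr >> 1);
    /* unaligned: take the high byte of one word and the low byte of
       the next, then reassemble - two cycles */
    uint16_t lo = (uint16_t)(bus_read_word(byte_addr >> 1) >> 8);
    uint16_t hi = (uint16_t)(bus_read_word((byte_addr >> 1) + 1) & 0xFF);
    return (uint16_t)(lo | (hi << 8));
}

int main(void) {
    bus[0x0200 >> 1] = 0x2211;       /* bytes $11 $22 at $0200/$0201 */
    bus[0x0202 >> 1] = 0x0033;       /* byte $33 at $0202 */

    bus_cycles = 0;
    uint16_t a = read16(0x0200);
    printf("aligned   read16($0200) = $%04X, %u bus cycle(s)\n", a, bus_cycles);

    bus_cycles = 0;
    uint16_t u = read16(0x0201);
    printf("unaligned read16($0201) = $%04X, %u bus cycle(s)\n", u, bus_cycles);
    return 0;
}
```

Writes need the same treatment, plus byte-lane enables so an unaligned store doesn’t clobber its neighbors, which is where the extra transistors go.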

I agree that it seems likely that the size of WDC’s '816 team played into this; it is notable that even 40 years later there is essentially only one implementation of the 65816 from WDC, only ever available in two packages. Is that the chicken or the egg, though?

For better or worse, WDC seem fixated on the 8 bit bus, probably because that leads to low system costs. Their 32 bit idea was also to use an 8 bit bus. But as their major market is the toy market, low cost is probably right.

@EdS I guess 68K interrupt response is above my pay grade. :slight_smile: Meaning, I never did much on the 68K platform besides using it, so this was rather new to me. – Thanks for providing yet another opportunity to learn something.

The 80386 was a success because of its backward compatibility, period. Yes, it introduced vastly superior and vastly more elegant capabilities, but it wouldn’t have gotten anywhere without its backwards compatibility. Not just software, but also hardware.

The 68000’s elegant design was already attractive back when IBM was designing the IBM PC. But it lacked compatibility with existing I/O support chips. IBM wasn’t so worried about software backwards compatibility, but the lack of hardware compatibility caused them to pass over the 68000. And as a result, x86 won and 68K would end up an also-ran.

If Intel had tried to introduce a new 32 bit CPU without any backwards compatibility? It would have been a humiliating dud. By the time they released the 80386, 68K was well established and had a suitable hardware ecosystem of support chips. This hypothetical Intel alternative without backwards compatibility would be trying to go up against that from scratch … a very hard sell, considering IBM was happy to continue with the 80286 (which they had a license to produce), and considering workstation makers were doing well with 68K.

No, this hypothetical Intel 32 bit CPU would have been a failure, and Intel (or possibly others) would have to develop a backwards compatible 32 bit CPU to kick start the 32 bit PC clone revolution.