Thoughts on Apple's IIGS

The IIGS was, essentially, a '816 Macintosh that happened to have hardware compatibility with the legacy Apple ][ series.

Software-wise, it had a bit of a split personality, juggling the IIGS desktop software and the evolutionary ProDOS environment. (Which is a gross generalization, as ProDOS was part and parcel of the desktop environment.)

But the IIGS software was a second stab at what the Mac had already become, but a little better thought out with lessons learned from the Mac. Not quite a second generation Macintosh.

@Singletona's point is the belief that the IIGS was hamstrung so as not to compete with the Mac. The '816 was clocked at only 2.8 MHz. In theory it could have been clocked faster (I believe there were accelerator cards to that effect).

In that regard, at the time, the IIGS could have been a real competitor to the Mac as a faster COLOR(!!) Macintosh. I’ll let others extol the virtues and clock timing of a '816 vs a 68K.

Whether there was much future for it is difficult to say, since the '816 effectively peaked at that point. Was the Apple market enough to drive development of the chip any further, in contrast to what Motorola was doing at the time with its broader application base for the 68K family?

The IIGS was some complicated hardware with its legacy capabilities. A “pure” '816, with a better clock rate (the spec goes to 14 MHz, but many have pushed it to 20 MHz) and a color display (the IIGS did not have sprites et al.), would be an interesting machine.

3 Likes

I think I mentioned Pete Foley’s Turbo 6502 project, described in the middle of this page with a drawing at the bottom. Apple didn’t want that, nor a better Apple IIGS. It might be fun to do one now in an FPGA.

2 Likes

Stories abound regarding the IIgs. One, possibly apocryphal, is that Steve Jobs insisted that it not have a faster clock, else it would outperform the early Macintoshes. Woz wanted an “engineer’s” machine with slots, and he saw the IIgs as a natural progression from the II. (The Apple /// was designed by marketing.)

Part of the (cost) issue, from my readings, was also trying to be compatible with the II, software- and graphics-wise, so it would slow down to 1 MHz to run in II compatibility mode and run faster when accessing RAM outside the bottom 64K. I think it’s back to the basic engineer vs. businessman/marketing scenario myself. Would Apple have been able to hold its ground against the rapidly establishing IBM PC with the IIgs? I doubt it…

They were expensive and rare in the UK (still are), although a friend of mine had one. I wish I’d saved for one, however that was then… (and he now works for Apple!)

One of my ideas for my Ruby project was to have a bit of a nod towards the IIgs - sadly, health, life and other things have been against me in recent months and a stupid idea I had earlier this year with the 6502 version sent me down another track… “I wonder how hard it would be to port BBC Basic…” so now the 816 version has a ProDOS like filing system with a BBC Micro like operating system and will have “slots” like no other…

However, I did see the GUI on my friend’s IIgs and thought it was great, if a little slow, but the alternatives at the time? I only really saw the Atari ST and GEM (I think). RISC OS (Arthur) on the ARM didn’t come until a year or two later…

-Gordon

1 Like

I’m not sure how much of a future it could have had professionally, but given that at least rough functional prototypes of a ‘GS+’/‘Mark Twain’ model of the IIGS had been created, Apple did seem to look into it at least at a conceptual level. Again, I don’t know the market realities, but the me now, thinking about the me of the early nineties, probably would have loved it as the stepping stone between the IIe’s I’d learned on in elementary school and bigger things. Think about it if you will: a higher clock speed, RAM slots instead of having to solder in individual chips (wasn’t it limited to 24 or maybe even 16 MB of RAM due to addressing limits? Either way, that’s a heck of a lot more than the… one megabyte GS/OS needed), an integrated 1.44 MB floppy drive, and an integrated hard drive (not a BIG one by modern standards).

Realistically, if they had come out with a GS+, I could see it marketed towards either legacy-support situations with an eye towards ‘hey, maybe you really want to use some of this nice nifty Mac stuff too, so we’re giving you a midpoint,’ or perhaps as a ‘junior’ machine that you could sell to families so they’d get into the Apple ecosystem and perhaps be more inclined towards the Mac in the workplace/school. The big limiting factor, to me, looking at the situation with a modern eye, is the lack of compatibility with VGA-standard monitors out of the box, but I would like to think that could have been addressed if the platform had hung around for any length of time (wasn’t there a graphics card made for the GS that did that?)

Eventually the GS would have been supplanted by the Mac anyway, but it feels like Apple missed a golden opportunity to use the GS as a ‘hook ’em early’ approach while selling the Mac as a more business/professional-oriented approach. Then again, maybe there wasn’t nearly the overlap I’m hoping there was, and I’m just wallowing in ‘might have been.’

Something I found on Vimeo and put up on YouTube, because it’s important for this to get more exposure on as many platforms as possible.

I’ve never experienced the real hardware. However, in emulation, putting it up to something the accelerator cards could do and giving it sixteen megs of RAM made the thing feel pretty fun in my experimentations. I’d like to think something as simple as a VGA card with some form of graphics acceleration might have been that ‘last piece’ when it came to usability (assuming CD-ROM support existed), with networking/Ethernet being added in, making it a platform that would have eventually had to bow out, but would have been a decidedly usable look at what things would be like. I do have to wonder what web browsing would have been like if it were given Ethernet/modem card support. What were browsers for the Mac like in the early 90s?

That shows off some of what GS/OS could do and could have expanded into with the right widgets helping with organization and usability.

And lastly… something from KansasFest that looks relevant: NewGS

2 Likes

The Apple IIGS could have been great! But Steve Jobs had such different ideas for what a computer should be like.

Still … could you imagine if the original Macintosh had been combined with the Apple IIc (both from 1984), for a backwards compatible MacApple?

With a slotless keyboard wedge design, the mainboard and floppy wouldn’t have to live inside a CRT box. The computer could be attached to either a high quality monochrome monitor or a backwards compatible color monitor (or TV) … sort of like the Atari ST.

As for the speed limitations of the 65816 only going up to 20 MHz? Well, the 68K line was going to lose the speed war with x86 anyway, so is this really a big loss? In fact, it could be a blessing in disguise. By the 1990s, the slow speed of the 65816 MacApples compared to PC clones could prompt Apple to jump directly to x86-based hardware with software emulation of the 65816 for backwards compatibility. No mucking around with PowerPC or 68K. Just going straight from 6502-based to x86, maintaining backwards compatibility all the way!

So why is x86 a good thing to be compatible with?

Because Macs would eventually go to x86 architecture anyway because the price/performance would eventually win out. By maintaining full binary compatibility, this alternate Mac history would save Mac users a number of headaches and it would make Macs generally more competitive with PC clones. Apple’s hardware development would have been streamlined, and there wouldn’t be this odd-ball Apple IIGS leaving users in the lurch.

Honestly, the benefits of this sort of thing could have been even greater for Commodore or Atari. However, the clusterfudge of Tramiel and Amiga and Atari and Commodore makes any such elegant alternate timeline implausible, I think. I mean … a hypothetical 14MHz C128 could have been cool, but I don’t think Commodore would have gone for an outside CPU like that. And the cost cutting Tramiel wouldn’t put all those custom Atari 8 bit chips into a next gen computer.

I have always felt that IBM and Apple were both overpriced, regardless of the CPU trend of the day. The Apple II and the PC both had clones, so I don’t buy the argument that the x86 or the 68000 was the better CPU; both had faults, but they were the only real options over the 6502, 6809, and Z80. Apple and IBM, I think, are here today because they were felt to be better than just a game computer/console.

I thought half the fun of having a machine with slots was in being able to modify it without as much worry?

1 Like

IMHO, comparing regular manufacturers and clones is a bit unfair, since you’re also paying for development and for a sustainable infrastructure to be in place when you may need repairs or upgrades, etc. So you’re not only paying for the specific model or architecture, but also for all those architectures which never made it past the prototype stage, all those operating systems that eventually came to nothing but may have contributed to some iteration of the one we know, etc., and, last but not least, the kind of bureaucracy and infrastructure that comes with this. While most of this should be compensated for by the scale of the business, some will still become part of the price. That said, there were (and probably are) some comfortable margins… (which, with an eye on Apple in the 1990s, may still not have been that comfortable)

Edit: In other words, we pay them for not having to buy their failures. However, at times it may be hard to decide which one was the failure and which one the sensible product. :wink:
(There’s a certain risk, we pay them for continuing their failures and us not getting the sensible product.)

1 Like

Certainly Steve Wozniak’s dream machine would have slots. But Steve Jobs’s dream machine would be sealed in a watertight fanless box.

Historically, the slotless Apple IIc was indeed developed and it was popular. It was a particularly good fit for schools. It was less effort to set up and deal with as well as a bit more compact.

The Mac’s form factor was also a good fit for school computer labs, but something backwards compatible with the Apple ][ would have been even better. As it was, schools had to choose between the less powerful but compatible Apple IIc or the more powerful but incompatible Mac.

And yes, the Mac was overpriced. This was a purposeful marketing decision to project the perception of a premium product. The petite Mac mainboard with its slow RAM was a marvel of cost reduction. It’s hilarious that the Amiga and Atari ST were twice the Mac’s speed at 1/2 or 1/3 the price (both including 1st party monitors, for a fair comparison).

The Mac could have sold for less than $1000. The only reason it didn’t was that Apple didn’t want the Mac to be perceived as a product for commoners. In order to be perceived as a serious alternative to the IBM PC, it had to demand a serious-feeling price.

2 Likes

Actually, the Mac was marketed as “for the rest of us” (hoi polloi) – and the architecture suffered heavily from cost reduction. (Mind that the Lisa, meant to be introduced at about the same time as the IBM PC and conceived as a business machine with integrated software, was first envisioned to be sold at about US$500, according to the Marketing Requirements Document (MRD). Obviously, creeping specs and development costs had a word in this. However, there’s not much indication that Apple was going for high margins from the get-go.) To be fair, the ST and the Amiga came a bit later, which changed a lot, technology-wise. Moreover, the Apple II was still the major (and only) profitable product when the Mac was introduced – and this was even more so after the Apple III warping-mainboard disaster. Apple had every incentive to keep the aging Apple II alive as a standalone product for the sole cause of surviving…

(Regarding Apple and financials, I always found it rather interesting that Apple was on the one hand the most successful new business after Xerox, while, on the other hand, there seems not to have been a single major manufacturer who wasn’t offered the Apple II business. Compare accounts by Tramiel, Bushnell, and others. It is almost as if Apple’s leadership hadn’t much trust in their product…)

1 Like

The 68K architecture falling behind the x86 wasn’t a fait accompli at the time decisions were being made. The power of the x86 market funneled a lot of capital into Intel to build out their designs and processes. Motorola was outspent and perhaps mismanaged, which hurt the 68K architecture.

The PPC decision was not a wrong decision for Apple either. The chips were powerful and efficient. Both the 68K and PPC architectures had value over the Intel designs. What doomed the PPC on the Mac was the fact that IBM and Motorola weren’t inclined to do the work in the mobile space. Apple wasn’t a big enough customer to drive development their way. Meanwhile, Intel was driving headlong into the mobile space. A space Apple has certainly conquered.

The Mac 68K to PPC switch was an extraordinary achievement in systems design. It’s amazing that they pulled it off as well and seamlessly as they did.

The issue with the '816 was similar to what happened with the PPC. It’s not that Apple was particularly unhappy with the 68K line, but Motorola went with the PPC partnership, killing development. Was there any future in the '816? Did WDC want to become Apple’s CPU unit? Those are non-technical decisions. The '816 at 20 MHz was quite fast: using the same “MHz doesn’t matter” math, CPU performance isn’t based solely on clock rate but on work done per clock, and the '816 was pretty efficient that way. More so, certainly, than the 68K at the time. But there was no '816+. No 651632 or whatever the next generation would have been. The '816 has been stalled for 40 years outside of hobbyist explorations.
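That “work done per clock” argument can be sketched with some back-of-the-envelope arithmetic. The cycles-per-instruction averages below are illustrative assumptions for the sake of the comparison, not measured benchmarks:

```python
# Rough throughput comparison: clock rate alone doesn't tell the story.
# The cycles-per-instruction averages here are illustrative guesses,
# not measurements of any real workload.

def approx_mips(clock_mhz: float, avg_cycles_per_instr: float) -> float:
    """Approximate millions of instructions retired per second."""
    return clock_mhz / avg_cycles_per_instr

# Assumed averages: ~4 cycles/instruction for the '816,
# ~10 for the 68000 (its minimum is 4 cycles; many instructions take more).
w65816 = approx_mips(20.0, 4.0)   # hypothetical 20 MHz '816
m68000 = approx_mips(16.0, 10.0)  # 16 MHz 68000

print(f"'816  @ 20 MHz: ~{w65816:.1f} MIPS")
print(f"68000 @ 16 MHz: ~{m68000:.1f} MIPS")
```

Under these (debatable) assumptions, the 20 MHz '816 comes out well ahead despite the 68000’s richer instruction set; pick different averages and the conclusion shifts, which is exactly why “MHz doesn’t matter” arguments only go so far.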

Apple is currently moving to ARM. It will not surprise me in the next few years to see an ARM-based Mac laptop using an Apple-designed ARM CPU and the other Apple logic chips. This will be trickier, having to cohabit with the x86 Mac systems (I do not see them killing off x86 any time soon).

But Microsoft is already pulling this off with some success, and Apple’s ecosystem would likely do a better job with it. If they can convince Adobe to port Photoshop to ARM, then the deal is done. I have no doubt there are efforts in house to do this already.

This is all in the quest for better and better power management, which is Apple’s primary driver today. Computers are “fast enough” now; the work is on weight, heat, durability, and battery life.

2 Likes

Yes, the 68K and PPC decisions made sense at the time. In the hypothetical AU I’m pondering, the benefits of going with 65816 would have been a fortuitous accident from clinging onto Apple ][ backwards compatibility.

After all, Apple didn’t just go all in on the Mac. 1984 also saw the release of the Apple IIc. And of course, the Apple IIGS came out in 1986. Given these products, I don’t think it would have been ridiculous for Steve Jobs to have gone for a backwards compatible Mac … something like an Apple IIc with a 65816 instead of the historical 65C02. Basically … like an Apple IIc but with a built in CPU accelerator.

I even recall close to everybody being clueless as to why IBM had chosen what became the x86 platform instead. (Now we know: Motorola didn’t have production samples ready in numbers for IBM’s internal certification.) Going for the 68000 seemed to be just the wise thing to do.

It might be interesting to note that Acorn faced a similar problem, wanting to produce a bigger/better/faster computer than the 6502 would support, and explored the '816 and 68k, but concluded that it was worth trying to make their own CPU. By no coincidence at all, the result, ARM, has something of a 6502 feel about it, and could do a reasonable job of emulating a 6502 too. So Acorn’s Archimedes computers, powered by ARM, could be shipped with two 6502 emulators as well as a new implementation of BBC Basic, which between the three tactics covered backward compatibility pretty well, at least for software. (Two emulators: a high performance one for OS-legal applications such as would have run well on a 6502 Second Processor, and a medium performance one for software which was less disciplined.)
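To give a flavor of what such an emulator does (Acorn’s were of course hand-tuned ARM code, not Python), here is a minimal fetch-decode-execute sketch handling just three 6502 opcodes; status flags, addressing modes, and the remaining opcodes are omitted:

```python
# Minimal sketch of a 6502 fetch-decode-execute loop.
# Handles only LDA #imm, ADC #imm (flags ignored), and BRK.

memory = bytearray(65536)  # flat 64K address space

class CPU6502:
    def __init__(self):
        self.a = 0    # accumulator
        self.pc = 0   # program counter

    def step(self):
        """Execute one instruction; return False on BRK."""
        opcode = memory[self.pc]
        self.pc += 1
        if opcode == 0xA9:            # LDA #imm
            self.a = memory[self.pc]
            self.pc += 1
        elif opcode == 0x69:          # ADC #imm (no carry/flags in sketch)
            self.a = (self.a + memory[self.pc]) & 0xFF
            self.pc += 1
        elif opcode == 0x00:          # BRK: treated as "halt" here
            return False
        else:
            raise NotImplementedError(hex(opcode))
        return True

# Tiny program: LDA #$10, ADC #$05, BRK
memory[0:5] = bytes([0xA9, 0x10, 0x69, 0x05, 0x00])
cpu = CPU6502()
while cpu.step():
    pass
print(hex(cpu.a))  # → 0x15
```

The dispatch loop is simple and branch-heavy, the kind of code the ARM’s plentiful registers and fast byte loads handle cheaply, which fits with it doing “a reasonable job of emulating a 6502.”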

Acorn’s bold move came from two things: reading the RISC research which showed the way to simpler machines, and visiting WDC and seeing how very small and unskilled the team there was. Which is another reason, I would think, for being very careful about designing in the '816: WDC was never a top tier engineering outfit.

Backward compatibility is a real bugbear: without it, you are liable to leave your customer base behind, but with it, all your engineering moves are constrained.

2 Likes

I guess there are still some around who remember how Adobe snatched the layout business from Quark. On the other hand, the Photoshop code base may well be at the point the XPress code base was at then, inviting any quick movers, and the entire market may be reshuffled soon. Interesting times to come…
(Maybe retiring the entire 32-bit code base was a first move by Apple to facilitate an effective emulation layer, preparing the transition towards ARM? Dealing with the same word length in both the emulation layer and the host hardware definitely makes things easier by several marks. On the other hand, the new Mac Pro, which is about a league above what one may have expected, is an incentive to stick with the current architecture: customers who just invested in this kind of hardware at this price level won’t be amused to see their investments invalidated soon.)

That’s why it’s not going to be a wholesale switch like 68K->PPC and PPC->x86 were. If they go to ARM outside of their iOS devices, it’s going to be in their smaller laptops, like the MacBook Air class.

Not the MacBook Pro notebooks or the desktop iMacs, certainly not the Mac Pros, nor the Mac mini.

The drive is power and heat, problems that the desktops don’t have. Most MacBook Pro laptops are desktop replacements.

But the lighter notebooks certainly are open to adopting this if Apple can demonstrate true advantages in power and battery life. The iPad Pros are simply not quite the macOS desktop replacement Apple tries to make them out to be.

Not sure if this would be a valid strategy. ARM is already very interesting because of throughput and may become even more interesting with active cooling. However, it’s still significantly behind in x86 emulation mode. Meaning: deploying it on an already low-range portable niche device, which will not be receiving any special software (because of the lack of incentives), may be a risky business. Compare Microsoft’s experiments with a low-range secondary architecture. (I guess these devices are retro already.)
I guess it must be all about a mid-range platform with a potential for active cooling and a deployment broad enough to incentivize native application development. Meaning, it should be about the iMac (or a new modular device in about the same segment). Once there’s a sufficient number of native applications, you can go all in with portable devices and high-range products. While the ARM architecture is very attractive in the mobile segment, I doubt that it can drive from there into the desktop / high-performance segment. In the end, it’s all about software.
(Another, rather bold strategy may be just transitioning to ARM on the MacBook Pro platform and forcing native development. But this may turn out rather nasty. At the same time, this could leave Apple in quite the same spot it had found itself in with the IIGS… The same applies to any hypothetical new, modular mid-range device, which could be just another IIGS.)

The device will receive all of the software that comes with the Mac already (which is quite a lot). Is it all “best in class”? Not necessarily. Is it “MS Office”? No.

Which is why it’s important for Apple to have the support of top-drawer vendors like Adobe porting a flagship product to the platform. Notably something like Photoshop, simply because if anything has a lot of potential low-level optimizations (i.e. machine code), it will be Photoshop. But Adobe has ported from 68K to PPC to x86 in the past already. They’ve suffered the pangs of porting software, been there and done that.

Much of the other software SHOULD be a simple recompile, since most software does not rely on machine-code optimizations. Apple also aggressively promotes portable coding practices to the dev community (whether they’re followed or not is a different story).

Also, as was mentioned, consider their move in their recent OS release to supporting only 64-bit code. This could well be more of the camel’s nose under the tent, preparing the industry for something like ARM. Apple is also quite aggressive about not being a slave to backward compatibility. Whereas MS goes through great efforts to ensure legacy software continues to function, Apple does not, and consistently breaks things from the past, pressuring developers to “keep up” with Apple.

macOS has been capable of supporting applications with multiple binaries since forever. The system supports compiling code for different architectures and bundling them all into a single application for distribution (thus all of the resources are shared, while the code is platform-specific).

So, in the world of the lightweight notebook, something that fits in the realm of a modern Chromebook, with a few local applications and a LOT of web applications, they’re well positioned to pull this off. And most modern applications may well port readily to the new platform (especially when a bunch of modern apps are just web apps with bundled runtimes). The fact that it can emulate x86 poorly would not necessarily be a factor in this space.

Which is also why ARM won’t be penetrating the desktop market as much: VM technology is so prevalent in the industry, and having a machine that runs emulation rather than a VM isn’t really tenable in the professional space.