Mural (and poster) showing mini, personal, and home computers

The S-100 is a type of bus that we call a “backplane”: the slots are on a board that has little or no electronics, and all boards have equal status. Minicomputers used this style, and similar microcomputer buses were Intel’s Multibus, Motorola’s VERSAbus and VME (VERSAmodule Eurocard), the STD Bus, the STE Bus and the SS-50.

Apple’s expansion style is what we call a “motherboard”. This can be mechanically very similar to a backplane (see extremely expanded KIM-1 machines for an example), but the form factor in the Apple II was pretty unique and is what IBM copied for its PC.

Apple used a decoded bus: circuits on the motherboard convert the processor’s address into separate “card select” signals which are different (but on the same pin) for each slot. The Apple II actually has two such signals: one for selecting registers on the card and the other for selecting a small (256 byte) ROM holding the card’s firmware. This allows you to plug in two identical cards and they won’t interfere with each other. The downside is that the address a card uses depends on which slot you put it in. In the era before OS drivers it was up to each application to scan all slots to figure out what to do, but lazy programmers preferred to just write “you have to put the printer card into slot 3” in the manual instead.
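As a concrete illustration of that slot-dependent addressing, here is a quick sketch using the well-known Apple II slot address ranges (the decode itself is done by motherboard logic, of course, not by software):

```python
# Apple II per-slot decoding: each slot gets its own 16-location register
# window and its own 256-byte firmware ROM window, derived from the slot number.
def slot_addresses(slot):            # slots 1..7
    devsel = 0xC080 + 0x10 * slot    # "device select": 16 register locations
    iosel  = 0xC000 + 0x100 * slot   # "I/O select": 256-byte card ROM at $Cs00
    return devsel, iosel

for slot in (1, 2, 3):
    dev, rom = slot_addresses(slot)
    print(f"slot {slot}: registers ${dev:04X}-${dev + 0x0F:04X}, ROM ${rom:04X}-${rom + 0xFF:04X}")
```

The same card answers at different addresses depending on the slot, which is exactly why software had to scan the slots (or the manual had to dictate one).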

The PC bus did not have any per-slot signals. That meant each card had to receive all addresses and compare them against some DIP switches to see whether it was being selected or not. In addition, there were several different interrupt lines and several different DMA lines (the Apple didn’t have DMA, but see below), and you had to have electronics or jumpers to select the correct one. All this added quite a bit of cost, and it was up to the user to configure each board correctly so that it not only didn’t conflict with any of the other installed ones but also used the resources that the poorly programmed applications expected.
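In effect, every ISA card carried its own comparator. A toy sketch of what that on-card decode amounts to (the base address and range here are purely illustrative):

```python
# What an ISA card's address-decode logic effectively does: compare every
# bus address against a base set by DIP switches, since there is no
# per-slot select signal to rely on.
DIP_SWITCH_BASE = 0x300   # hypothetical I/O base chosen by the card's switches
CARD_IO_SIZE    = 0x10    # hypothetical number of I/O ports the card occupies

def card_selected(bus_address):
    # every card sees every address; only the one whose switches match responds
    return DIP_SWITCH_BASE <= bus_address < DIP_SWITCH_BASE + CARD_IO_SIZE

print(card_selected(0x308))  # True  -> this card responds
print(card_selected(0x2F8))  # False -> somebody else's range
```

Two cards whose switches happen to be set to the same base will both respond, which is precisely the kind of conflict the user had to avoid by hand.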

One advantage of the PC solution is that there is no limit on the number of slots a computer can have while the Apple scheme could only have 8 slots.

The Apple bus did have the option for an expansion board to take over the whole machine (used by Microsoft’s SoftCard, for example), which did allow DMA at the cost of needing really fancy expansion boards compared to the PC’s built-in DMA controller. IBM did add this feature to the PC-AT (so it is part of ISA and the subject of one of its patents), but in a very clumsy way that didn’t work very well in my experience.

Towards the end of ISA’s life the DIP switches and jumpers were replaced by electronics backported from PCI, allowing Windows 95 to offer “plug and play”. This finally brought usability to a level comparable to Apple’s original bus and NuBus (which also included pre-decoded “card select” signals), without the corresponding limitations.


This is certainly true, but it could have helped bring the message about the “drop-in replacement” home. Especially as this was a new concept, it could have helped market the product, I guess. (The book has a rather interesting section on marketing computers.) It may have actually been useful for reviewing data in a familiar format. (In this case, I’d personally rather opt for 13 rows, with a top row for the character representation, though.)

On the other hand, low screens for terminals were apparently somewhat popular, as they weren’t that “totalitarian”: they didn’t occupy the entire view in front of the operator. Moreover, the originally intended task, data entry similar to a keypunch, didn’t require much screen height, but 80 columns would have been important, indeed. (From this perspective, 12 rows were probably generous.)

BTW, does anyone know if the DP 2200 featured a special box character that may have been useful for representing a punched hole?
Update: I can’t see any such character in the list provided in the manuals, and neither of the manuals available at Bitsavers (which were published when version 2 was already available) makes any reference to the reasons for the specific dimensions of the screen. Notably, the DP 2200 as produced was an ASCII machine, which doesn’t fit that original purpose well. (Meaning that if there had been any such background for the display dimensions, it wouldn’t have been notable in any way for these manuals.)

Hey all, for part of the introduction, I’m thinking of having boxes like this:

image

It won’t be perfect and there is a lot of overlap, but I was wondering if these are approximately representative of the “flow” of why computers were built? It doesn’t have to be super precise.

I know the 40s were WWII - I’m not sure if “field computers” were yet used for ballistics calculations. Certainly there was the encryption aspect, or should it just be “communication”? But in general, military applications motivated a lot of the budgeting. I’d imagine large factories made use of computers - though perhaps not for automation, if only just for the accounting (time, labor, payment).

I’m not sure when banking first used computers - well, sure, you could argue it was since the 1400s (re: Florence). But I was thinking the first credit card was in the 1950s? And that pretty much requires computers to be involved, to handle the transaction accounting. And I recall that for airports/commercial travel, computers were critical for the logistics (ATC ground stations). Insurance companies were also big users (anyone with tens of thousands of accounts to manage).

Then the last part is what this video tries to talk about. I forget which IBM executive said something like “what would anyone use a computer for?” (or “the world only needs 5 computers”, something along those lines). Well, we didn’t know until we built them - and the end of the video would fill in the residential box with probably “communication, digital arts, recreation”.

Thoughts?

edit: revised…
image

2000+ Internet to replace TV, complete with Ads.

The famous quote is “I think there is a world market for maybe five computers”, attributed to IBM’s president Thomas Watson Sr. in 1943. This doesn’t make any sense - nobody would have used the word “computer” to talk about machines back in 1943. And as Gordon Bell pointed out, if there had been such a prediction it would have been correct for the next ten years, and so it couldn’t be used as an example of a smart guy being wrong.

Speaking of Bell, his law of “computer classes” makes a diagram like the one you want very complicated. There is not one computer history, but several different ones happening in parallel with different timelines. So minicomputers in the 1970s were going through the same things that had happened to mainframes in the 1960s and which would define microcomputers from the mid 1980s to the mid 1990s. This history would repeat once again for PDAs/smartphones from the mid 1990s to the 2010s.

Absolutely! And with a huge difference in the needs and intentions of “headless” systems (rack mounted systems that “do stuff” of a different variety than what became the desktop-computer). I could still use an Altair 8800 to manage opening the gate on my chicken coop (with the only smarts being to automatically adjust for daylight times throughout the seasons).

I think that Gordon Bell point makes a great quote, because in fact I don’t think he was wrong at all - in today’s world of cloud computing, we’ve almost circled back to that idea: five huge “data centers” (figuratively, maybe not literally) in the world hosting the majority of our data (those whole buildings being mega-computers). Tablets and today’s computers are largely “very smart terminals” - hence the idea of not even installing software locally (Sony is even streaming interactive action games these days). TinkerCAD, Google Earth, and Microsoft’s WorldWide Telescope are all great examples of that - getting great utility while hardly installing anything locally (including no need to install updates).

And that altogether may be a more economical way to do computing - when I need to process hundreds of astronomy images, the more cores I can get the better, so there are moments when I could use 1000 cores. But otherwise, why should I locally power up an 88-core DL360 to send an e-mail? Of course, the difference is that the wired world didn’t exist back then - and while wireless has gotten much better, it still doesn’t beat high-speed direct interconnects.

Anyhow, it’s why in this poster we tried to focus on what led up to the “home computer” (and tried to avoid more of the industrial and business computers, and the game consoles - a whole slew of activity going on in parallel). My daughter had this reaction to the KENBAK: “that’s not a computer!” Well, it depends on context. And I told her I’d put together a replica KENBAK just to show we could also use it to control our chicken coop door on a computed schedule :slight_smile:

Just get up at 6 am and let out the chickens. Retro computing has frozen chicken at $25 a kg.
(Well, it is $25 a kg regardless :( ). I question computing services: will my data or the service be around next week, next year, next decade? Will my old app from 5 or 10 years ago still work after all the new changes? Can I even get a new app?

I really question the internet model of service: use more bandwidth by having ads, then have more ads to pay for the bandwidth.

Take #6.

  • Initial attempt at some commentary, testing it out.
  • Will be “keeping it simple” in this short version (plan to have the more elaborate technical details on a site)
  • Added a little for the intro

Audio is only on the first half. Mix of “normal” and “increased speed”.

Still unlisted - not yet final.

Domesticating the Computer (take 6, partial voice trial, 1080 @ 30fps) - YouTube

This is even better. I would suggest not using “increased speed” since the viewers can select the option themselves.

While I can see why you called the MCM/70 a programmable calculator, as its display is only one line, I would call it a personal computer instead. I have a TRS-80 PC-4, for example, which is a Basic computer in a calculator form factor with a single-line display, but it can do anything larger computers can (not counting memory limits).

When describing the Apple II it seemed like you said it has a “color vector display”. If that is the case then it isn’t right - it had a TV compatible display.

True, I noticed that about MCM/70. I think I meant to say “calculator-looking device.” But you’re right, I’ll try to fit in “personal portable computer.”

The Apple logo was changed from its original to the colorized apple - since the system did offer a color graphics mode? You’re right it shouldn’t be characterized as “vector”. Was it a form of semigraphics only?

The increased speed was kind of an experiment; I’ll try to normalize it all in the end. It turns out my DaVinci isn’t as easy to adjust playback speeds in - all the tutorials show an “adjust speed” option, but mine isn’t showing that option. It has an “adjust duration” option that is slightly different - so it can be done, but it requires shifting the entire rest of the sequence to make room (so it’s more tedious).

Thanks!

edit: Take #8

  • Finished last half of commentary (still DRAFT)
  • Normalized the speed

If you spot any gross technical error in the narration, I’d appreciate a heads up.

On the Origin of Home Computers (DRAFT, 1080 @ 30fps) - YouTube

Any thoughts on considering the Atari 2600 as a competitor to the Apple II from the perspective of 1977? Woz always wanted the Apple II to be a game system - and both were released within months of each other in 1977. A $1200 4K Apple II vs a $200 Atari 2600 seems like no contest - people are gonna buy the Atari. This changed by the 1980s, but I propose that the Atari 2600 may have eaten into early Apple II sales and hindered their success in the very early days, even though inside they were quite different systems.

I wasn’t really there at the time, but I don’t see this at all - Woz certainly wanted there to be games on the Apple II, but I don’t think that makes it a game machine. There was a great hunger to have computers, to be able to write and run programs, which I think the Apple II aimed to satisfy. A console such as the 2600 is no comparison - it’s not there to be programmed or to run programs, it’s there to entertain.

Edit: I think this is a really important historical and cultural point, and right in line for any educational treatment. A younger person will equate computers with games, a much younger person will equate computers with entertainment, or distraction. But in the 70s, small computers were about programs, about liberation and communication. Big computers were clearly the future, and were changing the world in big ways, but were locked in air-conditioned temples. Having a computer in the home, or having a computer to oneself, being able to choose what software to buy, being able to learn how to write one’s own software, these were revolutionary aspects.


NOTE: I’ve revised the prior link to Take #8. The audio speed is now normalized and I think the content is mostly set. I do still plan to add an audio track. Might possibly get this ready before next weekend.

Hmm… But from that Christmas of 1977 - there was basically no Apple software. Atari 2600s were in Kmarts and Sears (and, I assume, a few prepared launch-title ROMs). So I think I meant competition in terms of “consumer dollars” - which I’d say went to the Atari 2600. But that problem probably wasn’t unique to Apple. And true, I’m thinking of the Christmas sales cycle then, and the perspective of parents only thinking of buying a microcomputer for their kids (again, a lack of software in those first couple of years to motivate grandma to buy a computer for herself).

The statement was:

“The Apple Two offered a high-rez graphic mode, including color, and ended up competing against the Atari 2600, a competitor that both founders of Apple had previously worked for.”

I’ll think about how to revise this - I mostly just wanted to tie in the Apple/Atari relationship (which socially may have been a partnership of sorts). Though the competition may have been from Atari’s perspective - where, without Apple, Atari may have entered the programmable microcomputer industry sooner than it did (i.e. Atari did later release microcomputer products in 1979).

Perhaps:

“The Apple Two offered a high-res graphics mode, including color - a feature also found in the Atari 2600, released the same year by a company that both founders of Apple had previously worked for.”

It might be interesting to review any consumer-facing advertising from late 1977, to see how Apple II was placed, if indeed it appears at all. Was it a professional product, a hobbyist product, a business purchase, or a consumer purchase?

I’d be very surprised if the 2600 and the II were two things to choose between - perhaps not least because of the price difference.

It’s true that the Apple II had (limited) colour. And true that both are 6502 machines, and that there’s some linkage in the history, but I’m not sure how I’d draw the connection myself.

In 1977 the Atari VCS (later 2600) was an expensive toy for children, while the Apple II was for adult hobbyists, professionals and small companies. There was no overlap that I can see.

The original Apple II had a high resolution mode with 4 colors (2 plus black and white) which was expanded to 6 colors in the Apple II+. It had a low resolution mode with large blocks that could be any of 16 colors (though the two grays looked exactly alike).

There is a direct link between Woz and Jobs at Atari and the Apple II low resolution mode: it allowed Woz to implement in Basic the game Breakout, which he had done fully in hardware at Atari. The paddle inputs were another feature added just to make that one game possible.

Early Apple ads talked about using the computer at home to balance your checkbook. The then-president, Mike Scott, sat down and tried to do just that but found it completely impractical. The audio tape used as storage was simply not up to reading previous entries and adding new ones. And Integer Basic meant you had to do all your math in cents instead of dollars, which would limit you to a little over $300 with 16 bits.
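A quick back-of-the-envelope check of that limit (assuming the signed 16-bit integers that Integer Basic used):

```python
# Signed 16-bit integers top out at 32767; tracking money in cents
# therefore caps a balance at $327.67.
max_cents = 2**15 - 1       # 32767
print(max_cents / 100)      # 327.67 -> "a little over $300"
```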

He was afraid Apple might be sued for false advertising and made fixing this the top priority. Woz focused on replacing the tapes with floppy disks, and since he wouldn’t have time to also work on adding floating point to Basic, Scott simply bought “Applesoft” Basic from Microsoft. The new Basic and extra colors were introduced in the Apple II+.

Agreed, there is a mix of early ads as you mentioned - some suggesting “balance your checkbook” and some showing line-graph graphics, while the one early example below focuses more on the “program your own game” aspect:


http://www.macmothership.com/gallery/MiscAds2/Simplicity2.gif

I recall Woz’s Integer BASIC was technically faster and was deemed sufficient since that’s all you needed to write basic games (placing things on whole integer pixels). And I recall that issue where Woz had to make a choice between finishing the disk drive controller work or finishing the floating point work - he couldn’t finish both in time (in terms of sales-season cycles).

(Woz had co-authored a paper in ’76 about floating point, so he understood technically what needed to be done - but given those two choices, yeah, they really needed a disk drive solution, and it was the right choice to swallow their pride and just buy the floating point solution even though that meant talking to Bill.)

A peer once told me that in the early development of the F-16 (or rather, the algorithm simulation aspects like navigation modeling - not so much the aircraft itself), GD had chosen Commodore PETs over the Apple II directly because the Commodore had built-in floating point (and the PETs were so much cheaper).

I didn’t know the Mike Scott story of actually trying it himself; that’s neat and admirable (acknowledging that your product isn’t really doing what it claims).

Would the Apple low-res graphics blobs be what other platforms called semigraphics?

note: Apple’s floating point initially came on a tape (Applesoft, as mentioned), not in ROM (it did come in ROM later, probably with the Apple II+; things changed rapidly between 1978 and 1980).

Well, and no one hacked the Atari 2600 (with its 13-bit address bus) into a computer or even any kind of terminal. But there has always been kind of a thin line between game consoles and home computers (I recall PS3s being used to boot Linux with a USB keyboard plugged in - I also remember adding a keyboard to a Sega Dreamcast and surfing the web on it in the late ’90s).

Apple and Microsoft could both get away with the trick of making floating point 5 bytes rather than 4, since they used an 8-bit CPU. Everybody else got stuck with 4- or 8-byte floating point. Thank you, IBM and the IBM 360.
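For anyone curious what those 5 bytes buy you, here is a small sketch of the packed Microsoft 6502 format as I understand it (one exponent byte biased by 128, a sign bit, and 31 stored mantissa bits plus an implicit leading 1, for a 32-bit mantissa):

```python
# Decode a 5-byte Microsoft-style float (Applesoft/Commodore BASIC packed form).
def decode_mbf5(b):
    exp, m1, m2, m3, m4 = b
    if exp == 0:                       # an exponent byte of 0 means the value is 0
        return 0.0
    sign = -1.0 if (m1 & 0x80) else 1.0
    # restore the implicit leading mantissa bit that the stored sign bit replaced
    mantissa = (0x80 | (m1 & 0x7F)) << 24 | m2 << 16 | m3 << 8 | m4
    return sign * (mantissa / 2**32) * 2.0 ** (exp - 128)

print(decode_mbf5([0x81, 0x00, 0x00, 0x00, 0x00]))  # 1.0
print(decode_mbf5([0x84, 0x20, 0x00, 0x00, 0x00]))  # 10.0
```

The extra mantissa byte gives roughly nine decimal digits of precision instead of the seven or so you get from a 4-byte format.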

Notice that a 48-bit word size for floating point was a typical standard before the IBM 360. Real users of floating point were expected to buy the higher-priced models and use 64 bits. As a side note, I am debugging an 8/16/48-bit CPU design. The front panel is not displaying the LED flasher program correctly.
Ben.

FWIW: The first version of Woz’s BASIC was called Game BASIC on the Apple 1. It only became Integer BASIC when ported to the Apple II.

-Gordon
