OS/2 Museum article on the IBM PC

http://www.os2museum.com/wp/the-ibm-pc-41-years-ago/

The topic of the design of the PC has come up recently on this forum, here and in the discussions that followed - in particular around the DataMaster, and the other CPUs that may have been in competition with the 8088.


One of the photos in the post features a raw ISA card, which brings back memories of my final-year undergraduate university project. It was to build a video capture card, using a video-signal-capture IC plus supporting logic ICs, that would work in an IBM PC compatible. I think the capture window was only around 100 x 100 pixels, monochrome only, but I sure learned a lot about IBM PC hardware interfacing and wire wrapping from the project. :-)

Ben Hardwidge travels back to August 1981, when IBM released its Personal Computer 5150 and the PC was born.

A big ape had only just started lobbing barrels at a pixelated Mario in Donkey Kong arcade machines, Duran Duran’s very first album had just rolled off the vinyl presses and Roger Federer was just four days old. Back then, the UK was even capable of winning Eurovision, with Bucks Fizz. It’s August 1981, and IBM has just released the foundation for the PCs we know and love today: the PC 5150.


Interesting insights on their decisions! Given the similarities in hardware, I wonder why the DataMaster was so much more expensive than the IBM PC.

Dual 8" floppies come to mind. 16K dram was 10x? the cost. Notice how the
PC went from cassette to 160K floppies, then 360K then 720K then HD . New I/O with evey model.
$100 for a microsoft mouse.

Indeed, very interesting to see some, but not all, of the DataMaster decisions being carried forward. Within IBM, I think a product design would always be accompanied by a forecast of volume, a price, and of course costings; they had standardised ways of deciding whether a product looked viable. So the DataMaster would have been designed with its price point in mind - dual floppies are very convenient, but also a fairly significant cost.

It always struck me as a bit of an oddity that the PC needed to use some of its few slots for the floppy controller and the display adapter - but of course, it’s a good way to risk-reduce and to offer flexible options.

Hmm … maybe much of the cost difference is illusory, since the IBM PC base model theoretically didn’t include a floppy drive. In practice, the floppy was absolutely necessary. There was, I think, only one obscure piece of software released on cassette tape - some sort of RAM test program. The cassette port went essentially unused.

I’m mildly interested in the difference between BASIC and Advanced BASIC. Of course, I could look it up!

Interesting too that the DataMaster team were struggling with their BASIC, and with the memory banking needed for their 8-bit offering. I’d be interested to hear about 8-bit BASICs which did succeed in transcending the 64K limitation. (Acorn’s BAS128 was a 16K BASIC which placed the BASIC program and data into 4 x 16K RAM banks, so you got a full 64K to play with, but that doesn’t quite fit the bill.)
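
To make the bank arithmetic concrete, here’s a rough C model of a BAS128-style scheme (purely illustrative, names invented - the real thing is 6502 code paging sideways RAM): a flat 64K workspace address splits into a 2-bit bank number and a 14-bit offset.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical model of a BAS128-style scheme (names invented):
 * a flat 64K BASIC workspace spread across four 16K RAM banks. */

#define BANK_SIZE 0x4000u               /* 16K per bank */
static uint8_t banks[4][BANK_SIZE];     /* 4 x 16K = 64K total */

/* Split a 16-bit "logical" workspace address into bank + offset. */
static uint8_t *workspace(uint16_t addr)
{
    unsigned bank   = addr / BANK_SIZE; /* top 2 bits pick the bank */
    unsigned offset = addr % BANK_SIZE; /* low 14 bits index into it */
    return &banks[bank][offset];
}

int main(void)
{
    *workspace(0x0000) = 0x42;          /* lands in bank 0 */
    *workspace(0xFFFF) = 0x99;          /* lands in bank 3 */
    printf("last byte: 0x%02X\n", *workspace(0xFFFF));
    return 0;
}
```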

I’m not sure what you mean by BASIC and Advanced BASIC, but I do remember typing in a program on an IBM PC booted into its ROM BASIC, and being shocked when I couldn’t save the program to disk. You needed to have loaded the disk version of BASIC, from disk, in order to save to disk.

As for versions of BASIC that broke the 64K limit … well, the Commodore 128 did it by keeping the program in one 64K bank and the variables in another. As I understand it, the CBM models using the 6509 CPU (the CBM-II series) could split things even further, storing arrays, strings and other variable types in different banks.
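
As a sketch of that split (illustrative C only, nothing like the real BASIC 7.0 internals): the interpreter keeps its program-text pointer and its variable pointer aimed at different 64K banks, so neither eats into the other’s space.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative only: a C128-style split with the tokenized program
 * in one 64K bank and variables in another. The real machine does
 * this with its MMU; here the "banks" are just two arrays.       */

static uint8_t program_bank[0x10000];   /* bank 0: program text */
static uint8_t variable_bank[0x10000];  /* bank 1: variables/strings */

typedef struct {
    uint16_t txtptr;    /* position in program text (bank 0) */
    uint16_t varptr;    /* next free variable byte (bank 1)  */
} interp_state;

/* Fetching program bytes and storing variables touch different
 * banks, so neither eats into the other's 64K of space.       */
static uint8_t fetch_program_byte(interp_state *s)
{
    return program_bank[s->txtptr++];
}

static void store_variable_byte(interp_state *s, uint8_t v)
{
    variable_bank[s->varptr++] = v;
}

int main(void)
{
    interp_state s = {0, 0};
    program_bank[0] = 0x99;             /* pretend token */
    store_variable_byte(&s, 42);
    printf("token 0x%02X, var %d\n",
           fetch_program_byte(&s), (int)variable_bank[0]);
    return 0;
}
```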

The way Commodore did memory banks had a certain elegance, but it made things stupidly difficult for developers. I think the Atari 8-bits handled memory expansion more simply: a single 16K block could be swapped, rather than the whole 64K address space. That seems to me much easier to develop for - your code stays in one place, and you can use some of the leftover RAM as a buffer to copy data to and from.
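
Something like this, in illustrative C (names invented; on a real 130XE the bank select is a write to the PORTB register): everything outside the 16K window stays put, and you copy data in and out through it.

```c
#include <stdint.h>
#include <string.h>

/* Sketch of a 16K banking window, 130XE style (names invented):
 * extended RAM is only ever visible through one 16K hole in the
 * address space, so the rest of the 64K stays put.            */

#define WINDOW_SIZE 0x4000u                  /* 16K window */
static uint8_t extended_ram[4][WINDOW_SIZE]; /* the hidden banks */
static uint8_t *window = extended_ram[0];    /* currently mapped bank */

static void select_bank(unsigned n)          /* real HW: write PORTB */
{
    window = extended_ram[n & 3];
}

/* Copy a buffer into extended RAM through the window. Main RAM
 * outside the window is never disturbed, which is what makes the
 * scheme easy to program against.                              */
static void stash(unsigned bank, const uint8_t *src, uint16_t len)
{
    select_bank(bank);
    memcpy(window, src, len);
}

int main(void)
{
    const uint8_t msg[] = "copied through the window";
    stash(2, msg, sizeof msg);
    select_bank(2);
    return window[0] == 'c' ? 0 : 1;
}
```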

Who knows? If the DataMaster had had a memory banking system like that (only switching a 16K block), then maybe they wouldn’t have had those difficulties, and the IBM PC might have just stuck with that design.

This is particularly painful on the PC 5150, which only had 5 slots to begin with. It led to a proliferation of multi-function boards like the AST Six Pack to save slots. The 5160, in addition to requiring another card for the hard disk controller, increased the slot count to 8.

IBM was sufficiently aware of this problem that they sold the 5161 expansion unit, which was a nearly identical cabinet to the PC itself containing 8 more slots, and came with a pair of boards and a cable to carry the bus signals to the new chassis. (Very minicomputer of them!)

The Apple II users among us were very familiar with this particular pain. :slight_smile:


So, apparently some of them were indeed worried that if they had gone with a full 16-bit CPU, the project would have been canceled for being too powerful.

[…] the PC team picked the 8-bit version because using a full 16-bit processor might have caused IBM’s Management Committee to cancel the project for fear of hurting sales of its more powerful products. Bill Sydnes, who headed hardware engineering for the project, has said similar things in a few interviews.

And don’t USB-C docks for laptops seem like a throwback to this? Again, they’re being used because the manufacturers aren’t including the needed ports/components anymore.