History Problem with a Friend

I’ve been looking up other documentaries online and found out that TCP/IP came out long before MS-DOS. I distinctly remember hearing that Bill Gates thought the internet was just a fad and didn’t get into it until much later in the game. The real trick is to get him to see what I’ve found. Some people don’t like seeing their own notions get squashed. :wink:

Personally, I had always thought the Apple I was the first home computer. I wouldn’t consider the Altair a home computer, since it was more limited than the Apple I in what it could do, and in addition you had to wire it up yourself.

TCP/IP was well rooted in the big machines of ARPANET, a defense project dating from the Cold War period. For home and small networks there were serial cables and ARCNET. In the ’80s and ’90s, the main networking software was Novell NetWare, using the IPX/SPX protocol, which shares some concepts with TCP/IP. There was also Token Ring, but only big business installed that. (There were other networks too, but I’m focusing on PC-based ones.)

DOS had no concept of networking… even MS-DOS 7 has no concept of networks, and the whole design of DOS did not lend itself to sharing files. File locking was only introduced in DOS 3.3.
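As an aside, the locking DOS eventually got was byte-range (“region”) locking, exposed through INT 21h function 5Ch once SHARE was loaded. Here’s a minimal sketch of the same style of lock using Python’s Windows-only msvcrt module, which exposes the same semantics; the file name is made up, and this is an illustration, not DOS code:

```python
# A sketch of DOS-style byte-range locking, using Python's Windows-only
# msvcrt module. "ledger.dat" is a made-up example file and is assumed
# to already exist; this shows the semantics, it is not DOS code.
import msvcrt

with open("ledger.dat", "r+b") as f:
    f.seek(0)                                 # the region starts at the current position
    msvcrt.locking(f.fileno(), msvcrt.LK_NBLCK, 128)   # non-blocking lock on 128 bytes
    try:
        f.write(b"safe update")               # other sharers of this region are excluded
        f.flush()
    finally:
        f.seek(0)                             # reposition to the locked region
        msvcrt.locking(f.fileno(), msvcrt.LK_UNLCK, 128)  # release it
```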

All these systems were local (LAN) networks; for WANs there was X.25.

ARPANET was for military purposes and was only handed over in 1990. After that it spread among universities and then to the public. Winsock for Windows came out in 1992; before that there was the DOS KA9Q software stack, written by amateur radio enthusiasts in 1985.

Networking was eventually added to Microsoft’s products, but it’s a stretch to say Microsoft had a role beyond providing the operating system. If MS had not existed, the next company would have done it (perhaps IBM).

The virtue of TCP/IP was really in its ability to incorporate and route other, already existing protocols. (E.g., AppleTalk could be encapsulated in TCP/IP and, if the ISPs on both sides supported it, you could tunnel your AppleTalk connection and reach a Mac on the other side of the globe.) TCP/IP is really more of a wrapper than anything else, and it facilitated the use of much older protocols, like telnet/TTY.
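To make the “wrapper” point concrete, here’s a toy sketch of tunnelling: an older protocol’s packet rides unmodified as the payload of a UDP/IP datagram. The address, port, and payload are all made up for the example (real AppleTalk-over-IP schemes such as IPTalk defined their own encapsulation rules); it only illustrates the general idea.

```python
# Toy illustration of "TCP/IP as a wrapper": tunnel a legacy protocol's
# packet inside a UDP/IP datagram. Everything here is invented for the
# example; real encapsulation schemes had their own defined formats.
import socket

legacy_frame = b"\x02\x01pretend-this-is-a-DDP-packet"  # stand-in for an AppleTalk frame

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# The IP and UDP headers become the outer envelope; the old protocol's
# bytes travel through the internet untouched as the payload.
sock.sendto(legacy_frame, ("192.0.2.1", 20000))  # RFC 5737 doc address, made-up port
sock.close()
```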

As a historical footnote, TCP/IP as it evolved from ARPAnet (three independent implementations of TCP/IP had been contracted by ARPA in 1974, one to Cerf at Stanford, one to BBN, one to University College London, and all were operational by the end of 1975) was already at something of a dead end in the mid 1980s, when OSI networking had become the official standard. What came to TCP/IP’s rescue was CSnet (the Computer Science Network), intended as a kind of equal access to ARPAnet for academic institutions, funded by the NSF (the US National Science Foundation), an effort started around 1979/1980. In 1985, the NSF launched five supercomputing centers, networked via TCP/IP, while there wasn’t any real-world implementation of OSI to be seen. Apparently, it was really NSFnet and its spread across educational institutions that gave the decisive impulse for the internet we know today.

(Edit: Dates are based on the account provided in The Dream Machine by M. Mitchell Waldrop, which I personally consider to be still a good source on the subject – and a great book in general.)


The Sphere I came out in November 1975, while the Apple I was launched on April 11, 1976. The Sphere had quality-control problems that doomed it to obscurity, even though it sold twice as many units as the Apple I.

It was as costly as the Apple, so I would consider it more a professional computer than one for the home.


My idea of the US market at that time is that professionals - doctors, dentists, engineers - might well have bought machines for their home use which would be considered unaffordable by families with more ordinary incomes, and certainly not machines for children or teenagers. That’s still a pretty big market, and indeed often these early adopters subsidise the development of the mass market machines. So, home computer as a computer at home is one thing, home computer as a teenager’s entertainment/educational device is another. As volumes go up, costs come down, prices come down, and more people are reached. Eventually a supplier can even afford television advertising with endorsements by recognisable people.

Only with the advance of cheap dynamic memory in the early ’80s did you get the mass-market products. Simple CPUs like the 6502 and the 8085, plus a video display chip, kept the logic simple. The IBM PC was the only thing aimed at the professional market for a single user, so it sold.

I’m not so sure about professional markets only: the IBM 5150 was something of a hybrid, too, with a number of features we would identify as typical for a home computer: BASIC in ROM, a clock rate derived from the NTSC carrier frequency, a cassette interface for mass storage… Also, at 4.77 MHz with an 8-bit bus, it wasn’t much faster than a typical Z80 home computer (for some applications even slower, due to additional latency). 16K (the base configuration) or 64K of RAM was also within home computer specs.
In order to make this a professional machine, you still had to add a number of (rather expensive) cards. And, if you did so, those built-in standard features were perfectly useless.
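For the record, the NTSC tie-in mentioned above is simple arithmetic: the motherboard crystal is four times the NTSC colour subcarrier, and the CPU clock is that crystal divided by three. A quick check of the numbers:

```python
# The 5150's "home computer" clocking, spelled out.
ntsc_subcarrier = 3.579545            # MHz, the NTSC colour subcarrier
crystal = 4 * ntsc_subcarrier         # 14.31818 MHz motherboard crystal
cpu_clock = crystal / 3               # ~4.77 MHz, the PC's CPU clock
print(f"{crystal:.5f} MHz crystal -> {cpu_clock:.2f} MHz CPU clock")
```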

(Mind that this was not the PC as we know it, which came only with the 386 generation. The original PC in a corporate environment was meant more as an intelligent terminal to interface with applications still running on a mainframe. Notably, its limited local capabilities were apt to preserve IBM’s established corporate IT solutions. On the other hand, its success was something of a self-fulfilling prophecy, since many were planning to introduce office computers but had waited for IBM to enter the market. Even if it underdelivered on expectations, it was going to become the standard anyway. Accordingly, it was the proclaimed “industry standard” even before any of the specs were known. In hindsight, the original PC was more of a transitional product – and not as disruptive as often said.)


Edit: Contrary to the popular narrative, IBM hadn’t simply missed the emerging home computer market; it recognized it quite early on and was ready to lead, but struggled through the 1970s with various concepts and prototypes. (Some, like the IBM Aquarius prototype, look rather amazing, with interesting specs such as applications distributed on bubble-memory cartridges.) Apparently, by the end of the 1970s they had just given up, and there were even talks of buying Atari’s computer division with its new 8-bit lineup. As we know, this came to nothing, followed by the outsourced PC architecture. Looking at the specs of the 5150, we may guess that IBM hadn’t completely given up on the home computer idea, but then decided to market the machine to businesses only (where expectations were great, probably also recognizing that selling this as a home computer as well wasn’t exactly going to further the marketing proposition) and to split off the home segment to dedicated machines, as seen in the PCjr / Peanut.


Thanks for your expansive and interesting comment @NoLand!

My feeling is that the badge “IBM” on the front, and the slick advertising, had a lot to do with the initial success of their PC offering. And of course the clones, when they came, expanded the market for software and hardware still further.

There might be an alternate reality in which CP/M and perhaps MP/M held sway for longer, or maybe Flex, but there had to be a route to using more than 64k RAM. We know now that what counts is not hardware but software, and killer applications in particular. VisiCalc, dBase, Lotus were hugely important. I might mention Turbo Pascal too, although I don’t know what implementation languages were dominant for the most successful applications. It’s true, I think, that 68k machines were widely used in Germany for business purposes: the PC platform was not overwhelmingly dominant everywhere instantly.


Here’s an image from an alternate reality, where the Atari 800 became the IBM PC – an actual design prototype by IBM – illustrating quite well how blurry the lines actually were.

(Image source: IBM Archive, reproduced in: Atkinson, Paul. Delete. A Design History of Computer Vapourware. Bloomsbury Academic, 2013.)

Edit: However hilarious this may seem in hindsight, it would have made perfect sense. Performance-wise, the gap to the actual PC wasn’t that big, especially for text-oriented applications. Moreover, the Atari 800 featured an expandable architecture with an array of expansion slots, and even the cartridge slot would have made sense, given the previous plans for an ecosystem (for the Aquarius) of applications distributed on bubble-memory cartridges. Just a bit of IBM treatment for the motherboard (for industrial robustness) and you’d be ready to go. (The limited number of CPU registers on the 6502 might have proven a bottleneck later on, however. But then, the 68000 would have been readily available as the rather logical choice for a follow-up architecture. A dark alternate universe for Intel.)


Tandy made a bunch of IBM compatibles that used a similar form factor to this.

The really interesting scenario would have been Acorn entering the US market with the BBC Micro in a timely manner, with a 6502 running at double speed, a superior OS, various text and graphics modes, networking, and an expandable architecture (including the Tube interface for additional processors). Acorn had even contracted a GUI for the purpose. Apple was apparently quite afraid of this, and I’m also not so sure what it could have meant for the business market. (Ironically, it was for all those buses and all that expandability that Acorn struggled with FCC compliance and buried a fortune in the effort.)


Acorn’s BBC architecture very much had legs, as the 2MHz 6502 system only needed to act as a front end. As you may know, they sold 8-bit, 16-bit and 32-bit second processors, with substantial RAM and performance. Maybe it could have done with some re-engineering, but the architecture was there.

I’ve previously written a couple of overview descriptions elsewhere:
Acorn’s BBC micro - some resources
Acorn’s Second Processors and the Tube - what, and why


This may be a bit too much of a diversion from the original topic, but doesn’t the Tube interface go back to the “fruit machine”? (For the benefit of the reader: quite literally a fruit machine driven by digital electronics, an early contract job of Acorn’s.)
I’ve forgotten why it was there and what it was supposed to do (probably offloading I/O as well), but details may be in the interview with Steve Furber on the Centre for Computing History YT channel.

(And, no, I didn’t know too much about the various coprocessor offerings – my general knowledge of the BBC Micro is rather limited, since it wasn’t available here, probably a typical Austrian revenge for Hauser’s daring success abroad :wink: – but I learned some from your write-up. BTW, it’s amazing how much Hauser is still ignored in Austria. There are probably only a few who know his name, while everyone is handling some ARM-powered device.)

I think the Tube interface is a relatively late development. Sophie made a cow feeder, Acorn (or maybe CPU) made a fruit machine, Acorn made the System and then the Atom. All these are single-CPU, I think. Then came the Proton project, where a 6502 was to be a front end to a second processor, and that project was transformed into the BBC Micro according to the specification (or guidance) of the BBC themselves. It’s at this point that the Tube appears in public, I think. As it happens, the Tube allowed Acorn to say that the Beeb could run CP/M, but it’s unclear whether this was crucial. It’s true that the Z80 and 6502 second processors were the first to turn up, but I can’t say in which order. I suspect it was the 6502 first: a 3MHz 6502 with 64k of RAM, rather a splendid machine for development or computation. For internal purposes only, Acorn also tricked out a large-address-space 6502 second processor.

This thread is well worth a read:
“Outline specification for the BBC MICROCOMPUTER system”

There was a lot of discussion about this at that screening of “Micro Men” for the Acorn guys. Steve kept insisting the Tube was there from the start, and Herman remembered it as a late development. Steve was thinking about the connector, which would later make the Tube possible, while Herman was thinking about the logic on the expansion module that actually made it work. So both were right.


Yes, “Tube” ends up with as many as four meanings if you’re not careful: the connector, which was early and cheap; the interface chip, which was a significant engineering effort and arrived a bit later; the protocol, which is a crucial part of the technology; and the second processor, also sometimes called the parasite or the slave. And perhaps a fifth meaning of Tube is the overall architecture, which has a 6502-based front end and a big, fast, simple machine in the back end. The original interface chips are of course in finite supply today, so, a bit like the case with the VIC and the SID, there are efforts to re-implement them, which are by now very high fidelity and open source. (There are also one or two which are not open source.)
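For anyone curious about the protocol side at a conceptual level: the Tube ULA essentially provides byte-wide FIFOs between the two processors, each with status flags that the software polls before reading or writing. Here’s a rough model of one such channel (my own simplification for illustration, not the real register map, channel layout, or FIFO depths):

```python
# A conceptual model of a Tube byte channel: a one-way FIFO with
# "data available" / "space available" status flags, polled by the
# host (the 6502 front end) and the parasite (the second processor).
# Depth is chosen arbitrarily for the demo.
from collections import deque

class TubeChannel:
    def __init__(self, depth=24):
        self.fifo = deque()
        self.depth = depth

    def space_available(self):        # status bit the writer polls
        return len(self.fifo) < self.depth

    def data_available(self):         # status bit the reader polls
        return bool(self.fifo)

    def write(self, byte):
        while not self.space_available():
            pass                      # a real CPU would spin here or take an interrupt
        self.fifo.append(byte)

    def read(self):
        while not self.data_available():
            pass
        return self.fifo.popleft()

host_to_parasite = TubeChannel()

for b in b"*CAT\r":                   # host passes a command line across the Tube
    host_to_parasite.write(b)

line = bytearray()
while host_to_parasite.data_available():
    line.append(host_to_parasite.read())
print(line.decode().strip())          # the parasite side sees: *CAT
```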

The internet was not born alongside MS-DOS. The birth of the internet was a gradual process, in which ARPANET was slowly replaced and different network protocols were gradually merged into what we know as the internet today. That process can be roughly placed between 1975 and 1995. So you cannot say that the internet was born with MS-DOS, as the process had already begun.

Regarding the Apple-1 not being the first real home computer: well… I think he is right, although the computer he is thinking of might be the Altair 8800 or something like that. Nope… the Kenbak-1 is arguably the first affordable computer for the private market. It was introduced in 1971, around five years before the Apple-1.

Yet the Altair was indeed designed for home use. Yes, the price might have been steep, but price has nothing to do with which market the designer intended the machine for. The steep price can be explained by the era it was designed in and by how few machines were made. Today things are cheap because they are mass-produced.

Okay, but how much could you do with an Altair once you have one up and running?