Bill Gates (briefly) on BASIC bring-up on the Altair 8800 (1994)

via the links from the RetroComputingRoundtable podcast:


Here Josh Bensadon enters bootstrap code via front panel switches on an Altair 8800, then loads 4K BASIC from paper tape:

Here’s a post about optimising that boot loader:


At 3:26 into the video, Bill talks about the bootstrap loader and says Paul wrote the first loader in 46 bytes and he (Bill) later wrote it in 17 bytes. I thought to myself, yes, 17 bytes is better than 20 bytes… especially if you are repeatedly demonstrating it. I played with the idea of figuring out those 17 bytes then set out to make it as small as possible. Well, that’s when I got it down to just 14 bytes and now down further (with some encouragement) to 12 bytes!

A nice series of posts here too, Running Altair BASIC on a Modern Computer by Rhys Rustad-Elliott:

Oops - these have bit-rotted, see below for new links.


I’d not heard before about the relative speed of their 8080 emulator and the real machine (the real machine was 5 times faster than the emulator running on the PDP-10). That could give some insight into the emulator’s construction.

Loved that little article on optimizing the boot loader. It got me thinking that a Z-80 version could be smaller, since it can use the 2-byte JR in place of the 3-byte JP. The overwrite pattern would need to be something like repeated 00 03 bytes (NOP; INC BC) that would change the final JR to go either to the next instruction or 3 bytes past it.
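For reference, JR takes a signed displacement measured from the instruction that follows it, so an offset byte of 00 falls through and 03 skips three bytes ahead. A quick sketch of that arithmetic (the addresses are made up, just to illustrate the overwrite idea, not taken from the article):

```python
# Z80 JR: opcode 0x18 followed by a signed 8-bit displacement, measured from
# the address of the *next* instruction (JR address + 2).

def jr_target(jr_addr: int, offset_byte: int) -> int:
    """Address a JR at jr_addr lands on for a given (possibly overwritten) offset byte."""
    disp = offset_byte - 256 if offset_byte >= 0x80 else offset_byte  # sign-extend
    return jr_addr + 2 + disp

# Say the loader's final JR sits at 0x0010 and the tape's leader bytes
# (the repeated 00 03 pattern) overwrite its displacement byte:
print(hex(jr_target(0x0010, 0x00)))  # 0x12 -> falls through to the next instruction
print(hex(jr_target(0x0010, 0x03)))  # 0x15 -> three bytes past it
```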

The article points out that with the tape operating at 110 baud the shorter bootstrap program takes longer to load overall since it requires a bigger second-level boot loader.
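Some back-of-the-envelope arithmetic on that trade-off (the per-byte toggling time and the loader sizes below are my own assumptions, not figures from the article):

```python
# 110 baud with 11-bit frames (start + 8 data + 2 stop) moves 10 bytes/second.
BYTES_PER_SEC = 110 / 11
SECS_PER_TOGGLED_BYTE = 5.0     # assumed: set the 8 data switches, press DEPOSIT NEXT

def total_boot_time(front_panel_bytes, tape_loader_bytes):
    """Seconds spent toggling in the first stage plus reading the second stage from tape."""
    return front_panel_bytes * SECS_PER_TOGGLED_BYTE + tape_loader_bytes / BYTES_PER_SEC

# Break-even: each byte you avoid toggling pays for this many extra bytes of tape.
print(SECS_PER_TOGGLED_BYTE * BYTES_PER_SEC)   # 50.0
print(total_boot_time(21, 100))                # hypothetical 21-byte loader, short tape leader: 115 s
print(total_boot_time(12, 800))                # hypothetical 12-byte loader, bigger second stage: 140 s
```

Where the break-even actually lands depends entirely on how fast you flip switches and how much bigger the tape-side loader has to get.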

So maybe the Z-80 version won’t be much better, since the JR instructions will have negative offsets. When inputting the bytes you only need to flip the switches that change, so putting in “00 01” is no doubt faster than “F3”.

Leading to the inevitable question: What’s the minimum boot loader in terms of switch changes?

And the follow-up: what’s the fastest boot loader when you take into consideration human switch-change speeds, the size of the manually entered loader, and any secondary (or Nth-ary) loaders?
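One way to make the switch-change question concrete: treat each byte as 8 data switches and count only the bits that differ from the previous byte (DEPOSIT NEXT takes care of the address). A rough sketch, with stand-in byte sequences rather than a real loader:

```python
def switch_changes(loader_bytes, initial=0x00):
    """Total data-switch flips to enter the sequence, assuming all switches start at `initial`."""
    flips, prev = 0, initial
    for b in loader_bytes:
        flips += bin(prev ^ b).count("1")   # bits that differ = switches to move
        prev = b
    return flips

print(switch_changes([0x00, 0x01, 0x00, 0x01]))   # 3 flips: adjacent bytes barely differ
print(switch_changes([0xC3, 0x00, 0xC3, 0x00]))   # 16 flips: same length, far more toggling
```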

Definitely in danger of a self-nerd-snipe here so I’ll leave it at that.


What it says is more interesting.

On the one hand, I imagine that the performance was “good enough”: raw speed of the simulator was never the goal, proper behavior was.

Even at “5 times slower”, the simulator was a better development target than the machine itself, and probably a better debug target too, even before you count whatever had to be done to transfer the binaries over for testing. I know even my crude 6502 simulator is vastly more capable than a raw board for development, and I can whimsically add more capability if the need arises.
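To put a (very small) picture to that: even a toy fetch/decode/execute loop gives you tracing and inspection a bare board can’t, and bolting on more is a few lines each time. A minimal sketch with just a handful of 8080 opcodes (nothing like Allen’s actual simulator, obviously):

```python
# Toy 8080-style core: just enough to show why a simulator is such a
# comfortable debug target. Only a few opcodes are implemented.

class Tiny8080:
    def __init__(self, program):
        self.mem = bytearray(65536)
        self.mem[:len(program)] = program
        self.pc, self.a, self.halted = 0, 0, False

    def step(self, trace=False):
        op = self.mem[self.pc]
        if trace:                                  # the "add capability on a whim" part
            print(f"PC={self.pc:04X} OP={op:02X} A={self.a:02X}")
        if op == 0x00:                             # NOP
            self.pc += 1
        elif op == 0x3E:                           # MVI A, d8
            self.a = self.mem[self.pc + 1]
            self.pc += 2
        elif op == 0x3C:                           # INR A (flags ignored here)
            self.a = (self.a + 1) & 0xFF
            self.pc += 1
        elif op == 0xC3:                           # JMP a16, little-endian address
            self.pc = self.mem[self.pc + 1] | (self.mem[self.pc + 2] << 8)
        elif op == 0x76:                           # HLT
            self.halted = True
        else:
            raise NotImplementedError(f"opcode {op:02X} at {self.pc:04X}")

# MVI A,05 ; INR A ; HLT
cpu = Tiny8080(bytes([0x3E, 0x05, 0x3C, 0x76]))
while not cpu.halted:
    cpu.step(trace=True)
print(hex(cpu.a))   # 0x6
```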

If I am not mistaken, the 8080 simulator for the PDP-10 was just an adaptation of their previous 8008 simulator (for the IBM 360) that they used for developing their Traf-O-Data system. That allowed Paul to develop it very quickly.


Found some bits in Paul Allen’s autobiography, Idea Man:

… Traf-O-Data … We knew that it would be painful, if not futile, to try to create software on the 8008 itself. We needed to build a set of development tools from the ground up, including a customized assembler… While the 8008 could address 16K bytes of memory, Bill and I could afford only a quarter of that in memory chips, not nearly enough for the tools.

So how would we program such a limited microprocessor on a machine that didn’t yet exist? For me, the answer seemed clear: I’d simulate the 8008 environment on a mainframe. Simulators had first cropped up in the literature in the midsixties, when an engineer named Larry Moss devised a way for an IBM 360 to “emulate” earlier-model computers and run their software. Moss’s work reflected a truism in technology circles that harkened back to the theories of Alan Turing in the 1930s: Any computer could be programmed to behave like any other computer. Software trumped hardware. Although I hadn’t read about anyone simulating a microprocessor, I figured it should be easy enough—I’d simply trick a big computer into acting like a small one. In the meantime, we could exploit the big computer’s abundant memory and advanced development tools.

We had no idea how much adversity lay in store for us.

SOME HAVE SUGGESTED that our Altair BASIC was remarkable because we created it without ever seeing an Altair or even a sample Intel 8080… …we had no choice. The Altair was little more than a bare-bones box with a CPU-on-a-chip inside. It had no hard drive, no floppy disk, no place to edit or store programs. And even had the machine been up to it, debugging on the memory-challenged 8080 would have been slow and difficult at best.

Any other programmers vying to bring an 8080 BASIC to Albuquerque would be facing an uphill climb. For starters, they’d have to realize that they needed a simulator and then to create one from scratch on a mainframe or minicomputer. Bill and I had a big edge in speed and productivity with our Traf-O-Data development tools.

I just tried Rhys’ links above, and they’ve bit-rotted. New links are:
