Thanks for the link to the paper. This jumped out at me:
Interpretation is distinct from execution, and serves to verify that execution will proceed without difficulty. For example, a Do command is interpreted only once, and may be deleted during execution without ill effects. (This behavior is the basis of many of the intriguing “JOSS puzzles” concocted by RAND’s Oliver Gross, which usually took the form of “what was the deleted step?”)
It seems to be saying that a program can contain Delete statements, and a Do in progress can be deleted. Remarkable!
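To make the puzzle concrete, here is a toy Python sketch (my own invention, not real JOSS; the float step numbers and the "type"/"delete"/"do" operations are just stand-ins): because the Do is interpreted once at invocation, a step it triggers can delete the Do line itself without disturbing the run in progress.

```python
output = []

def run_part(program, part):
    # "Interpret once": snapshot the part's step numbers at invocation.
    # Later deletions can't disturb this already-resolved plan.
    steps = sorted(n for n in program if int(n) == part)
    for n in steps:
        if n not in program:        # a step deleted mid-run is simply skipped
            continue
        op, arg = program[n]
        if op == "type":
            output.append(arg)
        elif op == "delete":        # delete any line, even the active Do
            program.pop(arg, None)
        elif op == "do":
            run_part(program, arg)

# Step 1.1 does part 2; step 2.2 deletes line 1.1 -- the very Do
# that is in progress -- and execution carries on without ill effects.
program = {
    1.1: ("do", 2),
    2.1: ("type", "before delete"),
    2.2: ("delete", 1.1),
    2.3: ("type", "after delete"),
}
run_part(program, 1)
# output == ["before delete", "after delete"], and line 1.1 is gone
```

Afterwards line 1.1 no longer exists, so "what was the deleted step?" has to be reconstructed from the output, which is presumably the fun of Gross's puzzles.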
FOCAL is, but is BASIC? Maybe my memory is just faulty, but I’ve read a fair amount about DTSS, Dartmouth BASIC, and the proliferation of BASICs, and I don’t recall anyone particularly tying them to JOSS. There are certainly similarities, at least in the interaction environment, but the languages share less.
It seems plausible that JOSS could have been known to Kemeny and Kurtz, or their students, but I haven’t seen it discussed or suggested as a direct inspiration.
The Baker paper linked by @scruss does list BASIC as “JOSS Influenced” and says
Another category of systems can be best termed JOSS-influenced [Fig. 34], in that although the systems were designed and implemented with different goals, and hence with different emphasis on matters such as speed and efficiency, and were often aimed at a different category of user than JOSS, at least some resemblance between the languages can be discerned. Some of these designers had direct, and often extensive contact with JOSS, while others had only a short demonstration of an early version of the system
I’d like to learn more about BASIC’s pedigree.
That said… there were lots of language ideas floating around. The CS community was relatively small, and it seems like everybody knew each other. There is a temptation to say one person (or in this case a duo) “invented something,” but ideas have a way of building on one another, and sometimes the time is just right.

It is interesting to note that, while BASIC is known for being interpreted, the original Dartmouth BASIC was compiled. (And FWIW, BASIC PLUS compiled to byte code.) Maybe Kemeny and Kurtz were afraid the user code would run too slowly if it were interpreted.

A BASIC interpreter’s true superpower is that it can fit inside a small memory footprint, and it doesn’t require secondary storage. The entirety of ALTAIR “4k BASIC” was implemented in 3 kB of 8080 assembly language, leaving 1 kB for user code, arrays, and variables. BTW, from the documentation I see hints that 8k BASIC was written first and then “cut down to size” to fit in 4 kB.

I also strongly suspect that Bill Gates himself wrote the ALTAIR BASIC documentation. It definitely wasn’t written by a tech writer; the author knew too much about the internals.
Compiling makes sense, because BASIC was meant to be an easy-to-use FORTRAN replacement for small programs, like 99-bottles-of-beer. Compiled programs also took less space, and memory back then was measured in words, not bytes. “Interactive computing” at the time meant having a program run on a computer the same day you submitted it.
It was compiled, but not in a way we’d recognize as compilation. It relied upon a terminal server to convert each line of each user’s program into a compiled job on the mainframe. So a program was potentially thousands of jobs sharing state*, with the terminal server stitching the output back together for the users.
It seems a bizarre way of doing things today. Some other time-shared BASIC systems copied the two-machine arrangement, such as HP. Emulating the HP BASIC system requires starting two emulators in parallel and getting them to talk to one another. It’s rather amazing that it works at all.
*: what I’m not clear about is where they shared state. Logically, I’d assume it was on the mainframe, but maybe not.
A People’s History of Computing in the United States by Joy Lisi Rankin spends some time on the early development of BASIC, and provides many citations for follow-up.
Thanks for the details. I recently read up on the HP2000 and noticed it had two CPUs too (one big, one small). This arrangement was not uncommon for time sharing systems, but the “front end” or communication processor usually didn’t do anything more complicated than preventing user terminals from interrupting the “big iron” too often and keeping a line buffer until you hit return, linefeed, or (bowing to TECO) escape.
What I had read (citation forgotten) was that the Dartmouth “compiler” used the disk for temporary storage, so I’m not sure how that fits into the architecture you describe. (Not calling it into question, just trying to piece the puzzle together.)
We also knew about the interactive language JOSS, but we preferred to stay nearer the main stream of “standard” languages. Several JOSS conventions–decimal points in line numbers, and periods after statements–did not appeal to us.