This one has been around for a few years, but it’s a pretty interesting (and, I think, accurate) read.
It discusses how various mostly-dead languages (although COBOL and Pascal make this list, which I think are arguably still “alive”, if not active in new projects) have lived on in their influence on later languages. Being a selection of 10, some influential and important languages are necessarily left out, and I’m not sure I would choose exactly those 10, but certainly I can see the motivation for each one.
BCPL gets three mentions in the article. There’s even a (sort of) reason given why it’s not listed: it’s a CPL-alike, hence an ALGOL-alike, and they’ve already covered that. Hillel Wayne’s a good writer, but their style is “opinionated [informed] take”. It’s an opinion piece.
Personally, I would’ve put LISP up there. The LISP community was always small (but vocal and so, so full of drama) and vastly dependent on government grants. When the AI Winter hit (in other words, the funders found out that the academics couldn’t really do all the magic they claimed they could) it almost vanished. With all the massively expensive custom hardware developed to run LISP in the '80s, didn’t somebody ever think, “Maybe we’re doing it wrong?”
LISP was never meant for big problems. It is the BASIC of list processing.
As for AI, maybe a trinary processor is needed. Somewhere I read that lists are better as trinary nodes than binary ones. Only HAL 9000 knows for sure.
I saw this video recently that went through a popularity ranking of programming languages from 1965 to the present, and saw that (at least from this reckoning) from 1973 to 1975, Lisp ranked 3rd, below Fortran and Cobol, then had a resurgence from 1983 through the rest of the decade, ranking 4th below C, Ada, and Pascal. I was impressed. Though, this was, of course, during the hype around AI. Once the AI Winter hit, Lisp died off quickly.
As I think Paul Graham would agree, this was unfortunate. I think its fall-off in popularity was in no small part due to the fact that it had been pigeon-holed as an AI language for decades. As a consequence, once the AI hype was seen for what it was, most people thought Lisp was useless. Another thing that hindered it was that for decades, it could only run well on high-end hardware, which, combined with what most people thought it was good for, meant that it could only sustain itself so long as government contracts held up.
Re. scruss’s question about “Maybe we’re doing it wrong,” I agree with that, but I think of it in terms of how the rest of the industry/field didn’t seem to see any good ideas in the architecture (something that could be miniaturized, made more efficiently, and mass-produced to bring down cost), and what a loss that is. However, as Richard Gabriel talked about, by the late '80s, compiled Lisp had decent performance on register machines, and that was part of what killed the Lisp machines off. It didn’t need the custom hardware anymore. The custom hardware turned out to be a stopgap while register machines became optimized enough to run it well.
BASIC was far more widely used than he credits. Having spent far too much time looking into the various dialects and standards efforts, I think there was a time in the latter 1960s and well into the early 1970s where basically the entire market thought BASIC was the next big thing. The mainframe vendors were actively attacking the timesharing market with BASIC, you had mini vendors like HP and Wang with entire product lines for BASIC, and many of the European vendors adapting it for business use. I think the death of BASIC has much more to do with the non-emergence of this timesharing-as-utility than it does with the microcomputer world a decade later. The micros kept it alive, but in a different form and for different reasons.
Smalltalk did not have “poor runtime performance”, it had absolutely useless runtime performance, both in speed and in memory footprint. I worked at a company that tried to use ST three different times, on different platforms, including a high-end VAX and a RISC box. All three resulted in programs that required multiple megabytes in the era of ~1 MB RAM and 80 MB HDs, took minutes to start up, and ran at speeds that made HyperCard look like a speed demon. ST was dead in my locale long before Java came along. I went looking for a list of apps that used it, but both of the servers that apparently held such lists are long gone, and the remaining list of examples is a handful of client/server apps.
Worth noting perhaps that this paper is by Donald E. Knuth and Luis Trabb Pardo, and the abstract is rather promising:
This paper surveys the evolution of high level programming languages during the first decade of computer programming activity. We discuss the contributions of Zuse (Plankalkul, 1945), Goldstine/von Neumann (Flow Diagrams, 1946), Curry (Composition, 1948), Mauchly et al. (Short Code, 1950), Burks (Intermediate PL, 1950), Rutishauser (1951), Bohm (1951), Glennie (AUTOCODE, 1952), Hopper et al. (A-2, 1953), Laning/Zierler (1953), Backus et al. (FORTRAN, 1954-1957), Brooker (Mark I Autocode, 1954), Kamynin/Liubimskii (ПП-2, 1954), Ershov (ПП, 1955), Grems/Porter (Bacaic, 1955), Elsworth et al. (Kompiler 2, 1955), Blum (ADES, 1956), Perlis et al. (IT, 1956), Katz et al. (MATH-MATIC, 1956-1958), Hopper et al. (FLOW-MATIC, 1956-1958), Bauer/Samelson (1956-1958). The principal features of each contribution are illustrated; and for purposes of comparison, a particular fixed algorithm has been encoded (as far as possible) in each of the languages. This research is based primarily on unpublished source materials, and the authors hope that they have been able to compile a fairly complete picture of the early developments in this area. This article was commissioned by the Encyclopedia of Computer Science and Technology, ed. by Jack Belzer, Albert G. Holzman, and Allen Kent, and it is scheduled to appear in vol. 6 or vol. 7 of that encyclopedia during 1977. (Author)
(I’ve linkified to HOPL in there - lots to explore!)
Also I notice (from this HN discussion) this paper was first delivered by Knuth as a talk:
I would say that Basic was the Cobol for microcomputers in the very early 1980s, quickly replaced by dBase and clones (Clipper and Summer) through the rest of the 1980s until Windows came along and Visual Basic took their place.
I am not aware of any serious Smalltalk for the VAX, just two experimental ones (from DEC and Berkeley). Digitalk’s Smalltalks were very usable on PCs and Macs while ParcPlace required high end Unix workstations.
How many people feel that replacing ← with _ was a bad idea, since it killed all the programming languages that used it for assignment in ASCII? Looking at a few papers makes interesting reading, as Knuth’s Vol. 7 has yet to be published.
PS: how do you print a ←?
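On the PS: the old assignment arrow survives in Unicode as U+2190 LEFTWARDS ARROW, so on any modern system it can be printed directly. A minimal sketch in Python (the choice of language here is just for illustration):

```python
# U+2190 is LEFTWARDS ARROW in Unicode; print it by escape or code point.
print("\u2190")               # prints: ←
print(chr(0x2190))            # same character via its code point
print(ord("\u2190") == 0x2190)  # prints: True
```

Whether it displays correctly then depends only on the terminal font having a glyph for it.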
Re. The Early Development of Programming Languages
Thanks for posting this. I can’t say I followed everything that was explained, because some key features were really dependent on understanding the machine the language was designed for (which I don’t), but what I really appreciate about this is it gives me a sense of going back to the earliest notions about programming languages.
I was rather fascinated to learn that the earliest high-level languages were often to be evaluated left to right. For example, in modern programming, we would often say: x = a + b to assign a new computed value to x. In many of these early languages, the same expression would be written something like: a + b = x. In other words, the computation comes first, and the result is then assigned, left to right. Though, this convention also often went along with the language being purely notional (not implemented).
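That left-to-right convention can be made concrete with a tiny sketch. This is a hypothetical mini-interpreter of my own invention, not any specific 1950s notation: it scans tokens strictly left to right, keeping a running value, and treats `=` at the end as “store the result”:

```python
# Hypothetical sketch of strict left-to-right evaluation, where the
# assignment target comes last: "a + b = x" rather than "x = a + b".
def run(stmt, env):
    """Evaluate a statement left to right; '=' stores the running value."""
    tokens = stmt.split()
    acc = env[tokens[0]]          # start with the first operand's value
    i = 1
    while i < len(tokens):
        op, arg = tokens[i], tokens[i + 1]
        if op == "=":
            env[arg] = acc        # assignment happens at the end
        elif op == "+":
            acc = acc + env[arg]
        elif op == "-":
            acc = acc - env[arg]
        i += 2
    return env

env = run("a + b = x", {"a": 2, "b": 3})
print(env["x"])  # prints: 5
```

The point of the exercise is that no operator precedence or parse tree is needed: the machine (or the human reader) just sweeps once across the line, which fits how these early notations were meant to be read.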
I saw a bit of myself in what was described in how programming developed, since it said that the earliest programming languages in this history were just manual methods of notating programming logic. They were not implemented on the computers they were written for. When I first started trying to design my own programming languages, I did the same thing, writing out example code, trying to create a consistent set of rules for syntax and semantics, but these were just what I call sketches. I haven’t implemented any of them (yet). Though, in the last year, I have made some progress in implementing a simple virtual machine, and an assembler for it.
The history gradually gets into the first assemblers, and the first compilers (which were implemented on a computer). It was interesting to read that compiled languages were called “automatic programming,” since “programming” was seen as doing it at the machine level (makes sense, given what computer operators typically dealt with). Though, the term “compiler” was also used early on.
Since SIMULA 67 is probably my favorite language of these and the “Further Reading” link is dead, here’s “Compiling SIMULA: A Historical Study of Technological Genesis” by
Jan Rune Holmevik (IEEE Annals of the History of Computing, Vol. 16, No. 4, 1994; archived):
You can pick only one (or revise the symbols) for the language of your choice. In many cases SPACES were optional, to save space on paper tape. Notice that [ ] was not available as a standard character. BCPL / FORTRAN IV may be the only high-level languages with this I/O.