"Pascal and its Successors" - Niklaus Wirth

I struggled to find this, but it’s apposite:

Regarding whether it is worthwhile to pay for the protection, I would quote Hoare on his experience with Algol compilers in production, from his Turing Award lecture:
“Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to–they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980, language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law.”

Oddly, in this talk Hoare says something slightly different:

08:00 Adding checks to arrays added space and time to the program; on Tony’s first machine it ran at less than 2k operations per second (500 microseconds per operation, and two such tests for each array subscript).
08:40 No undetected array errors, and customers didn’t know they could trade off safety for speed.

2 Likes

I am thinking more that NULL pointers were not the mistake, ALGOL was. The same
goes for PASCAL: all teaching languages, not production languages. JAVA is closed source.
Now if there were a standard for the pseudo-code used in textbooks, that would be useful
to know.
Ben,
PS: can we ever get a spell checker for the editor?

I think it’s important to note that Algol wasn’t just a computer language, it was a general notation for algorithms that happened to have a “machine representation”. Which is pretty amazing, if you think about it. (Null pointers introduced something of a schism into this.)

Having said that, I’m not aware that Algol was used much outside of computing.

PS: Meaning, Algol isn’t that much about what an ideal programming language may look like, but more like, what if we had a formal description language that would lend itself as well to the problem space as to a computer implementation, so that we could transfer any model or algorithm described this way directly onto a computer and run it. (Yes, we may have to transcribe some Greek letters and other symbols, but that’s essentially it.) It’s more about how computers could integrate into the scientific space.

I came from the mainframe corner, with Fortran, Cobol and PL/I, and was completely shocked that anyone could seriously consider such a wild macro assembler a programming language.

1 Like

Not sure what BASICs you’ve used, but traditional BASICs have always done array bounds checking and kept track of things like GOSUB (and similar) stack sizes, “heap” allocation via DIM, and so on.

Of course, peek & poke have a lot to answer for…

-Gordon

Algol. Too much math theory for my liking. Never liked logic in general: the door is
open, the door is closed. Real world: the door is open, but a gorilla is standing in it.
How do I handle the set of numbers that are prime and are the middle 200 of the first
1000 primes?
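As a throwaway aside, that particular set is easy enough to spell out in C; the sketch below reads “the middle 200 of the first 1000 primes” as primes number 401 through 600 (my assumption), and uses naive trial division:

#include <stdio.h>

/* Illustrative sketch only: print the middle 200 of the first 1000 primes,
 * taken here to mean primes number 401 through 600. */
static int is_prime(int n)
{
    if (n < 2) return 0;
    for (int d = 2; d * d <= n; d++)
        if (n % d == 0) return 0;
    return 1;
}

int main(void)
{
    int count = 0;
    for (int n = 2; count < 600; n++) {
        if (!is_prime(n)) continue;
        count++;
        if (count >= 401)
            printf("%d\n", n);   /* primes number 401..600 */
    }
    return 0;
}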

With BASIC I was thinking more like INPUT A$ : IF A$="Y" THEN GOSUB 100.
What about "YES" or "N" as input?

Soon I will have a meta-compiler… more evil languages to write.

Ben.

To provide an example: say, a neuroscientist publishes a paper on the visual cortex of a cat, describing how the signals travel through the “system”, with all the relevant interconnections and bandwidths. As she publishes these descriptions using Algol, we can directly transfer this to a computer and run a simulation to validate it and maybe conduct some further experiments with this model. This is a huge vision, eliminating many intermediary steps and processes, as well as some systemic problems with computers in scientific research. However, as Algol tackles this in a highly systematic manner (which is somewhat to be expected, provided its scope), it involves some rather abstract (i.e., mathematical) approaches.
Conversely, it is not as much about how we want to deal directly with a computer. If it performed well in this category, as well, that would be (merely) an additional win.

(Regarding the spell checker: There should be one in your browser, check the context menu of the textarea. But I keep forgetting about this, as well. :slight_smile: )

This assumes ALGOL describes the problem. It does not, as the brain uses chemistry and
physical placement to work. Electrical impulses are just one aspect of nerve cells.
real, int, boolean … no nerve-cell data type.
What you do is write BRAIN-GOL for your problem. Use the tools to fit the problem.
Ben.
PS: At one time you could get an APL I/O device, so you could use ALGOL.
ALT abcd4 to get a ⏨ seems a rather weird way to enter data.

1 Like

I have to admit that I have a kind of aesthetic fascination with Algol, right from the beginning, as it was the very first programming language I ever learned (but never used). This may well be responsible for me catching on to programming at all.
On the age-old question, “is it art or engineering”, Algol is clearly on the artsy side of things, which appeals to me personally, but is probably an absolute no-go nowadays.

There are some obvious problems with Algol. E.g., just take switches in Algol, which allow you to mix state logic into any other type of logic at almost any level. So you end up with something that may look like an array and an index passed as arguments to an array-related procedure, but in actuality this totally changes the behavior of the program and what the code is actually about. Some Algol constructs are both very high level and very low level (compare again switches), or may occur both at a high level and at a low level (e.g., if-else clauses in value expressions). This leads to highly abstract programs, which may be hard to follow if they go all in on Algol features. It’s like Perl, just from an opposing angle. However, as opposed to Perl, which comes quite naturally (while writing, that is), there isn’t an ideal level at which to “think Algol”. So, yes, it’s an awkward tool. But a brilliant one, especially given the lack of precedent.

For fun, have a look at that beauty of a switch declaration combined with in-line if-else, one of the examples provided in the Revised Report:

switch S:=S1,S2,Q[m], if v>-5 then S3 else S4

For those not familiar with the concept, a switch in Algol is a data structure consisting of labels and/or other switches to be used with goto and an index/subscript. So what will be the meaning and consequences of the following statement (again from the Revised Report), where we pass switch “S” as an argument to a procedure, where it is locally known as “Town”?

goto Town [if y<0 then N else N+1]

Again, this is not an extreme abuse of the concept, this is exemplary textbook behavior, provided in the very definition of the language. I do admit, this is somewhat alarming.
Also, yes, we do want run-time checks in Algol, compare this example.
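To make the mechanics a bit more tangible, here is a rough C analogue of such a switch: an indexable list of jump targets, where the target is picked by an expression at run time. It relies on GCC’s non-standard “labels as values” extension, and the names simply follow the Report’s example, so treat it as an illustrative sketch rather than a faithful translation:

#include <stdio.h>

/* Rough analogue of an Algol 60 switch: an indexable list of jump targets.
 * Uses GCC's "labels as values" extension (&&label, goto *expr), so this
 * is not standard C. */
int main(void)
{
    int y = -3;
    int N = 0;                        /* the subscript, as in the Report   */
    void *Town[] = { &&S1, &&S2 };    /* the "switch" holds jump targets   */

    /* goto Town[if y<0 then N else N+1] */
    goto *Town[y < 0 ? N : N + 1];

S1: puts("arrived at S1");
    return 0;
S2: puts("arrived at S2");
    return 0;
}

Nothing stops the computed subscript from running off the end of Town, which is exactly why the run-time checks mentioned above matter.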

I remember trying out UCSD Pascal, I think on a Commodore PET, at one place I worked. We thought it was interesting, but not suitable for writing production code for our customers (at that time we used a mix of BASIC, compiled BASIC, and Assembler for that). I’ve not been able to find out much about UCSD Pascal on the PET, so perhaps I’m misremembering.

I tried Modula-2 on the Amiga, and again found it interesting, but not really suitable for writing fast action game programs (I had a Lattice C compiler I used for that).

Later, at another company, some of the programmers were using Borland Turbo Pascal, but by then I was using various C compilers on PCs, so I never really got into Turbo Pascal.

I didn’t know about Oberon and its various flavours until I read the article. Seems interesting, but it never really caught on, and I don’t have the time to invest in learning a programming language that no longer has many users.

I don’t think that’s a fault of the language. Happens everywhere. Sloppy/Lazy programmers, sadly.

Have you looked at ML?

-Gordon

Note that you could buy machines that used p-code as their native machine code (implemented in microcode).

And Wirth did a machine that had m-code as its native machine language (also microcoded) for running Modula-2.

2 Likes

I’ve got an FPGA card here with Oberon, another Wirth product, with his version of a windows-type operating system and a RISC CPU.
Too busy playing with my own CPU design at the moment for that.
As for ML, it uses too many hard-to-pronounce words for me. :slight_smile:
Small here, as I have 48 KB of user space and 32 KB of OS space. DOS-era programming:
640K or less.

1 Like

I got to use SML when I was in college, in one of my upper-division courses. I took to it well. It was the first time I felt like I understood functional programming.

The thing about it that was a revelation to me was how it used patterns as function signatures. I wrote a binary search in it, and I could just write out all the cases as patterns, with function code that would handle each case, citing new patterns, which represented what to do next, and these patterns would match the function signatures. It was easy to follow the code. Very nice.

2 Likes

I did something similar in … Erlang? I forget which FP language. But when it gels it’s a thing of beauty. Unfortunately I think I have one of those brains that was ruined by BASIC, so imperative and OO solutions come to mind before any other paradigm.

It’s worth pointing out that today’s “big” processors with branch predictors don’t suffer much from index checking. It does pollute the branch prediction cache and the code cache, so it has a small, measurable impact. For Google-scale services, where a 1% overhead may mean tens of millions of USD in yearly costs, it sure would make a difference. But on the desktop, or on mobile? Not very much.

On older pipelined processors without sophisticated branch predictors, there was often a static prediction convention, say, forward jumps not taken, backward jumps taken. As long as out-of-bounds jumps were forward jumps on such a system, they would not flush the pipeline. This was of some importance on machines with deep pipelines; the disaster known as NetBurst (Pentium 4) comes to mind. On that one, a pipeline flush was a lot of wasted work!
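In C terms, the idea is to write the check so that the failing case is the rarely taken forward branch. A minimal sketch, assuming a GCC/Clang compiler (the __builtin_expect hint is their extension, not standard C, and the function name is just illustrative):

#include <stdio.h>
#include <stdlib.h>

/* Minimal sketch: a bounds check where the out-of-bounds case is marked
 * as the unlikely branch, so the predictor treats it as not taken. */
static inline int checked_get(const int *a, size_t len, size_t i)
{
    if (__builtin_expect(i >= len, 0))   /* rarely taken forward branch */
        abort();                          /* subscript error detected    */
    return a[i];
}

int main(void)
{
    int a[4] = { 1, 2, 3, 4 };
    printf("%d\n", checked_get(a, 4, 2));   /* prints 3; index 9 would abort */
    return 0;
}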

1 Like

C vs Other.
Index checking really only needs to be done when entering a loop. The problem is that in C the index limits
are not bound to the array itself. Something like struct array { const int index_low; const int index_high;
int max_indexed; int *data; } could have made a big change in how you wrote programs.
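Following that thought, here is a minimal sketch of such a descriptor in C (dropping the max_indexed field for brevity; the field and function names are illustrative, not from any existing library). Because the bounds travel with the array, a whole loop can be validated once on entry instead of on every access:

#include <stdio.h>
#include <stdlib.h>

/* Illustrative sketch: an array descriptor that carries its own bounds. */
struct bounded_array {
    int index_low;      /* lowest valid subscript  */
    int index_high;     /* highest valid subscript */
    int *data;          /* storage for index_high - index_low + 1 elements */
};

/* Validate the whole subscript range once, when entering the loop. */
static long sum_range(const struct bounded_array *a, int first, int last)
{
    if (first < a->index_low || last > a->index_high || first > last)
        abort();                            /* one check for the whole loop */

    long sum = 0;
    for (int i = first; i <= last; i++)
        sum += a->data[i - a->index_low];   /* no per-access check needed  */
    return sum;
}

int main(void)
{
    int storage[5] = { 1, 2, 3, 4, 5 };
    struct bounded_array a = { 10, 14, storage };   /* valid subscripts 10..14 */
    printf("%ld\n", sum_range(&a, 10, 14));         /* prints 15 */
    return 0;
}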

Nobody seems to point out that in C you had only static and stack-local variables, which are very easy
to write fast code for. With ALGOL and PASCAL you get all kinds of weird nesting, forcing displays
to be copied on every subroutine call, complex display addressing for every variable, and slower code.
begin
  int private, sailor := false;
  if private then foobar := true;
  call enlist()
end

subroutine enlist
  … where := if foobar then ARMY else NAVY fi;

This is not nice code.
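For what it’s worth, here is one way a compiler might lower that kind of nested access to plain C: the inner routine gets a hidden pointer to the enclosing frame (one display entry, in effect), and every access to an outer variable goes through it. This is an illustrative sketch reusing the names from the snippet above, not how any particular compiler does it:

#include <stdio.h>

/* Locals of the enclosing block, gathered into an explicit frame. */
struct outer_frame {
    int foobar;         /* the outer block's foobar lives here */
};

/* "subroutine enlist" with a hidden static link to the enclosing frame. */
static void enlist(struct outer_frame *link)
{
    /* "where := if foobar then ARMY else NAVY" becomes an indirect load. */
    const char *where = link->foobar ? "ARMY" : "NAVY";
    printf("assigned to %s\n", where);
}

int main(void)
{
    struct outer_frame f = { 0 };
    f.foobar = 1;
    enlist(&f);         /* pass the frame pointer explicitly */
    return 0;
}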

People complained about how early compilers produced poor code;
nobody looked at how the language was designed instead. Three people for C,
coming from BCPL: great code.
Algol (Pascal): ten years of discussion, plus one year with Wirth
making things simple again; good code on a mainframe.
Algol … still being revised since 1968, and poor speed from emulated byte code.

Modern CPUs cope well with stack-frame displays nowadays. If you think of a 6502, well, there even C is not a real performer and is a memory hog.

Giving up scopes for speed makes C a low-level language, just a layer above assembler. Abstract enough to make it portable, though.
Combined with the syntax chosen and memory management left to the user, causing memory leaks, it may be efficient and portable, but it’s not my choice for a high-level language that guards the programmer.

But C is a production language, not like some toy language used for teaching that thinks it knows
best. Look how many computers sold with (COLOR) BASIC in ROM. Writing computer programs
in an 8K or 16K word total memory space on early UNIX was the constraint C had back then.
Unix and DOS 2.0 are the only operating systems I can think of back then that made using
files and devices easy, rather than getting bogged down in trivial access details. Getting away from punched cards was the big new feature at the time.

I don’t mind being ‘guarded’, but knowing just what a block of code compiles to is more useful
to me. I use AHDL for programming FPGAs (hobby), and having strong checking has saved my
butt several times. I don’t simulate, but test on real hardware by trial and error. 10%
of my designs work; the other 90% don’t, because of routing issues with setup and hold times,
or the design works only with the other FPGA vendor.
I don’t use VHDL or Verilog because they are so verbose I don’t know just what they are doing.
WinCUPL works well for me, along with an Xgen pro programmer for GALs; those I can test until I get things right.
Wirth’s languages have better protection for the programmer, but the educational environment,
I think, limited their design to student compilers until Modula-2, and by then C and Unix had taken over.

C can compile itself; Pascal can’t with type checking.
Ben.

“C is a production language, not like some toy language used for teaching” … You should look at Delphi or Freepascal/Lazarus to see real modern production languages. Or VAX/VMS Pascal, or many other good examples of production-quality Pascal compilers in the ’80s.

“C can compile itself; Pascal can’t with type checking.” That is nonsense; Pascal compilers were self-compiling from the start. Read about Wirth and his work on compilers!

2 Likes