Procrustean String

A “Procrustean string” is a fixed-length string into which strings of varying lengths are placed.
If the string inserted is too short, it is padded out, usually with spaces or null characters.
If the string inserted is too long, it is truncated.
The concept is mentioned in the Sinclair ZX81 and Sinclair Spectrum user manuals, where a portion of a string is replaced by another string using “Procrustean assignment”: the replacement string is truncated or padded so that its length equals that of the portion being replaced.
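For concreteness, here is a minimal C sketch of that rule; the function name procrustean_copy and the choice of space padding are just assumptions for the example, not taken from any particular implementation.

    #include <string.h>

    /* Copy src into a fixed-length field of dst_len characters,
     * truncating if src is too long and padding with spaces if it
     * is too short.  Note there is no terminating NUL: the field
     * itself has a fixed length. */
    void procrustean_copy(char *dst, size_t dst_len, const char *src)
    {
        size_t src_len = strlen(src);
        size_t n = src_len < dst_len ? src_len : dst_len;

        memcpy(dst, src, n);                /* copy what fits    */
        memset(dst + n, ' ', dst_len - n);  /* pad the remainder */
    }

Copying “hi” into a ten-character field leaves “hi” followed by eight spaces; copying a thirty-character string keeps only its first ten characters.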
ZX Spectrum User Manual:


Procrustes was a rogue smith and bandit from Attica who attacked people by stretching them or cutting off their legs, so as to force them to fit the size of an iron bed.
[image: Procrustean-Solution]
ZX Spectrum Manual:
http://www.retro8bitcomputers.co.uk/Content/downloads/manuals/zx-spectrum-basic-programming.pdf

3 Likes

I do recall that unpleasant bit of mythology from my schooldays!

We had a BASIC brainstorming session over on 6502.org and string slicing came up as a possibly useful non-standard feature to add.

1 Like

Interestingly, I never heard this term in the context of Pascal, where strings were arrays of characters typed to a fixed length[1]. As a result, strings were often defined as "type string = array [1..MAXSTR] of char;", requiring padding.

[1] Compare Kernighan’s famous criticism of Pascal as a language for system programming (as opposed to the purpose of teaching, for which it had been designed) in “Why Pascal is Not My Favorite Programming Language”.

2 Likes

The big problem with Pascal, in my opinion, was the fixed strings and the inability to use a “generic” function to handle strings and arrays (“dynamic” strings and arrays). That made things difficult. But that was not a problem in Turbo Pascal, and not a problem with the Pascal I used on the mini I wrote for back in the day.
For most of the other points I don’t particularly agree with Kernighan: to this day I always write C code “top down”, and I find that much easier to follow. And I like that arrays can be considered different types if the type declarations define them with different sizes, as long as we can still have “generic” array-handling functions. That the “typedef” in C isn’t a true type definition is one of the weakest spots in C, and Pascal got this right, if you just add the fixes provided by Turbo and other “real work” implementations.
The criticism about missing “separate compilation” doesn’t apply to those same “real life” implementations either.
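As a small illustration of the typedef point above (the type names are invented for the example): in C the two names are freely interchangeable, because typedef only creates an alias, whereas a Pascal type declaration introduces a genuinely distinct type.

    /* Sketch: C's typedef makes aliases, not new types. */
    typedef int Celsius;
    typedef int Fahrenheit;

    int main(void)
    {
        Celsius c = 20;
        Fahrenheit f = c;   /* accepted: both names are just aliases for int */
        (void)f;
        return 0;
    }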

In short: the original teaching-language Pascal does have some annoying limitations (although I don’t consider “top down” a limitation), but the implementations used outside of teaching fixed most, if not all, of that. Fortunately the implementations weren’t totally incompatible either; UCSD and Turbo were quite similar, for example. I didn’t have much trouble porting Turbo source to my minicomputer Pascal either.

1 Like

Regarding types and arrays, some of this is already in the very structure of a Pascal program, which is very rigid. As I recall it, even if there had been arrays of arbitrary length of a common type in Pascal, you couldn’t do something as simple as ask for a length first and then define an array of that length inside your program, which was trivial in Algol (because a definition section could appear at the start of every block, and you could have a block everywhere).
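For comparison, the pattern described above (ask for a length, then define an array of that length) is a one-liner in C99 with a variable-length array, which is roughly what an Algol block allowed and standard Pascal did not; this is just an illustrative sketch:

    #include <stdio.h>

    int main(void)
    {
        int n;
        if (scanf("%d", &n) != 1 || n <= 0)
            return 1;

        int a[n];                 /* C99 variable-length array: bound read at run time */
        for (int i = 0; i < n; i++)
            a[i] = i * i;

        printf("last element: %d\n", a[n - 1]);
        return 0;
    }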
Regarding the criticism of types and their limitations in general, while there were libraries and enhanced flavors, this wouldn’t have done for basic OS tools, which should compile out of the box, which was Kernighan’s use case. In this respect, the lack of bit-level access is also damning. But, in all fairness, this isn’t what Pascal was designed for. (On the contrary, it was designed to keep you away from such things.)

break and premature return are nice to have, especially for the tasks Kernighan was working on, and Pascal certainly adds another level of control flow to work around their absence. However, we have all seen worse things, I guess.

Some of Kernighan’s criticism isn’t without irony. For example, “There is no guaranteed order of evaluation of the logical operators and and or — nothing like && and || in C.”
First, Pascal isn’t alone here, but, I admit, once you have encountered short-circuit evaluation, it’s hard to see why anyone should do without it. But what about C and the “guaranteed order of evaluation”? What about

a = b++ * c + b … ?

In the original specification, nothing in C tells you when b will be incremented or what a will actually be. As in Pascal, we have something to work around… :wink:
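Both halves of that point fit in a few lines of C; everything below is invented for illustration. The first test relies on && short-circuiting (the standard Pascal workaround is a nested if), while an expression like the one quoted above is left undefined by C because b is modified and read without any ordering guarantee.

    #include <stdio.h>

    int main(void)
    {
        /* && short-circuits: the dereference only happens if p is non-null.
         * Standard Pascal's 'and' gives no such guarantee, hence nested ifs. */
        const char *p = NULL;
        if (p != NULL && p[0] == 'x')
            puts("starts with x");

        /* Evaluation order of operands is unspecified in C, and an expression
         * like the one in the post, a = b++ * c + b, modifies b while also
         * reading it, so the language does not define the result. */
        int b = 2, c = 3, a;
        a = b * c + b;        /* a well-defined expression, by contrast */
        printf("%d %d\n", a, b);
        return 0;
    }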

1 Like

You can see how Kernighan tried to lever the Unix/C world view on top of various Pascals in his “Software Tools in Pascal” book.

And the early Pascals were quite rigid, but the micro Pascals less so. UCSD and Turbo both had ways to work around some of the rigidity of Pascal, because in the end you had to have a practical language to do real work in the microcomputer environment. In Turbo it’s straightforward to map memory onto rigid data structures. For example, you could have something defined as an array of 1000, but then cast and convert arbitrary memory pointers to such an array, even if the actual allocation was, indeed, less than 1000. You would lose range checking (the compiler would check against the type’s range, i.e. 1000, not the actual range you allocated). But a lot of code turned that off anyway, just for performance reasons.

In the end, this is pretty much what C does, to a point: you end up with a pointer-passing style and dereferencing everything.
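A rough C analogue of that Turbo Pascal trick, with the type name Block1000 made up for the sketch: the cast tells the compiler the data is a 1000-element array, and nothing ever checks how much was really allocated.

    #include <stdio.h>
    #include <stdlib.h>

    typedef int Block1000[1000];   /* nominal type: a 1000-element array */

    int main(void)
    {
        size_t actual = 100;                       /* what is really allocated */
        int *raw = malloc(actual * sizeof *raw);
        if (raw == NULL)
            return 1;

        /* Reinterpret the raw pointer as a pointer to the big array type.
         * The compiler only knows the declared bound (1000); nothing checks
         * that the real allocation is just 100 elements. */
        Block1000 *blk = (Block1000 *)raw;

        (*blk)[0] = 42;          /* fine: inside the real allocation          */
        /* (*blk)[500] = 7;         would compile, but overruns the 100 ints  */

        printf("%d\n", (*blk)[0]);
        free(raw);
        return 0;
    }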

1 Like

Pascal is a teaching language. Goto works for me; so does (). Lisp and APL just give me a headache.
Only after C added structures did it become a production language.
My CPU (TTL/FPGA) in development has Excess-3 decimals, so floating-point I/O is easy to write. Decimal math was dropped in the 1970s, yet new standards are promoting its use again, like IBM’s big machines in hardware. PL/I claimed to do everything and produced big programs for the time; too bad it was tied to IBM hardware. Algol was often only available on big machines.
C only made headway with the cheap PDP-11s running Unix, and then cheap PCs, if you could afford a compiler.
Ben.
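For anyone unfamiliar with the encoding Ben mentions, here is a minimal sketch of Excess-3 (the helper names are made up): each decimal digit d is stored as the 4-bit value d + 3, and the handy property is that bit-complementing a digit yields its 9’s complement, which simplifies decimal arithmetic.

    #include <stdio.h>

    unsigned xs3_encode(unsigned digit) { return digit + 3; }       /* digit d  -> d + 3        */
    unsigned xs3_decode(unsigned code)  { return code - 3; }        /* d + 3    -> d            */
    unsigned xs3_nines(unsigned code)   { return (~code) & 0xFu; }  /* 4-bit complement         */

    int main(void)
    {
        for (unsigned d = 0; d <= 9; d++) {
            unsigned e = xs3_encode(d);
            printf("digit %u -> excess-3 %X, 9's complement %u\n",
                   d, e, xs3_decode(xs3_nines(e)));
        }
        return 0;
    }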

I guess support on Burroughs must have been a thing too, since Multics…

Multics ran on modified GE mainframes, which were sold to Honeywell.