I’m finding a lot of information about this algorithm online, but not a lot about what machines actually implement it. Anyone know some?

I have no idea, but I suspect any early machine with microcode could have used it for the faster version of the same model.

At one point, recently enough, nearly every multiplier on every chip used a modified Booth's algorithm:

“Booth needed a hardware multiplier for his machine, and he went out one afternoon with Kathleen, and they went to a tea shop in Southampton Row in Bloomsbury in London, and, an idea came to him, which he says he drew on the back of a paper napkin. And, this was a multiplier, a quite simple multiplier, but nonetheless a very neat multiplier. And this is, this was the first Booth multiplier, and the Booth multiplier has a minor adjustment which a colleague of Booth’s proposed, and hence it’s officially a modified Booth multiplier. The modified Booth multiplier is what is in almost every chip that is being manufactured today. The multiplier is a Booth multiplier. Not many people know this, not many people even need to know it. But nonetheless, the Booth multiplier is one of two ways of doing hardware multiplication, and by far the most popular. So literally, every year billions of these are produced because, these days, as everybody listening to this will know, chips don’t have a single multiplier; they will have many multipliers on circuits across the chips, and, they are Booth multipliers, and they started in a tea shop in Southampton Row in London in, around 1949, 1950.”

That’s from

And for sure the multiplier in ARM2:

Edit: and the Pentium 4, according to Ken Shirriff

Real multipliers achieve further performance improvements by splitting up the adders and creating a more complex tree: the venerable Wallace tree (1964) and Dadda multiplier (1965) are two popular approaches. Another optimization is the Booth algorithm (1951), which performs signed multiplication directly, without converting the arguments to positive values first. The Pentium 4 (2000) used a Booth encoder and a Wallace tree (ref), but research in the early 2000s found the Dadda tree is faster and it is now more popular.
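To make the quoted description concrete, here is a minimal sketch of the original radix-2 Booth recoding in Python. This is an illustration of the 1951 algorithm in software, not the modified (radix-4) encoder used in the hardware discussed above; the function name and bit width are my own choices. It shows the key property Shirriff mentions: signed operands are multiplied directly, with no sign-conversion step.

```python
def booth_multiply(m, r, bits=16):
    """Multiply two signed integers via radix-2 Booth recoding.

    Scans adjacent multiplier bit pairs (r[i], r[i-1]), with an
    implicit r[-1] = 0:
      pair 0,1 -> end of a run of 1s:   add  m << i
      pair 1,0 -> start of a run of 1s: subtract m << i
      pair 0,0 or 1,1 -> do nothing, just shift.
    Two's-complement signs fall out naturally because the top bit's
    negative weight is handled by the recoding itself.
    """
    mask = (1 << bits) - 1
    r_u = r & mask          # two's-complement view of the multiplier
    product = 0
    prev = 0                # the implicit bit r[-1]
    for i in range(bits):
        cur = (r_u >> i) & 1
        if (cur, prev) == (0, 1):
            product += m << i      # run of 1s just ended
        elif (cur, prev) == (1, 0):
            product -= m << i      # run of 1s just started
        prev = cur
    return product

# Signed operands in any combination, no pre-negation needed:
print(booth_multiply(3, -5))   # -15
print(booth_multiply(-7, -3))  # 21
print(booth_multiply(6, 7))    # 42
```

In hardware the adds and subtracts feed an accumulator that shifts each cycle; the modified-Booth variant examines three bits at a time to halve the number of partial products, which is what makes it attractive for the Wallace/Dadda trees mentioned above.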

Edit: worth reading this quadibloc page, which says the Alpha 21264 had a Booth multiplier, also the IBM System/360 Model 91.

So the answer is: it would probably be easier to list all the chips that *don’t* use Booth multipliers to multiply during those 50 years.

Great stuff

In 1963-5 I had a very basic educational computer called CEDUS that was 16-bit, serial, built from Mullard Combi elements, with its various parts plugged unit to unit. It had a 1000-word drum as memory and a couple of registers; the instruction set was Add, Subtract, Collate, Left-shift, Jump (not sure what test) and Read/Write to the drum.

Within this very constrained environment, and with the admirable advice of RSRE Malvern recorded in a “handbook” on algorithms for series, arithmetic and sorting, I implemented double-length floating point, Booth’s multiplication and division, and a number of simple series for geometric tables. The store limitation and the wholly tedious method of code entry meant most of the above were undertaken over months, one after another, as one chose what would best demonstrate to students how digital circuits and programming could achieve the sort of calculations required for interceptions and air traffic control at the most fundamental level.

It served as an introduction to logic, circuits, and minimal coding at the most basic level, both for me as a student and then as an instructor; the 8-week course was run for 3-4 years. By the end we had acquired a couple of Elliott 803s, and so the level of training and demonstrations made enormous leaps, though it was clear that later students had much more interest in what could be demonstrated than in how it was to be achieved - perhaps a sad, early reflection of a future in which so much in IT is assumed or taken for granted, without a clear understanding of design, performance, evaluation or security, by “app-scribblers” or their managers.

Excellent first-person history, thanks @philp!

What does Collate do?

Hi, A…, it’s close to 60 years ago so my memory is a bit addled, but Collate was probably a word-length comparison or merge; I just recall that the very limited instruction set included it. Not having any of the old paper cards I used to document a sequence, I can’t unpick an algorithm where it would have occurred. Sorry. There was a second CEDUS machine used by the Royal Navy for their teaching of the basics of computing (or so I was led to believe at the time), but we never got in touch. If any ex-RNs (students or instructors) recall this primitive machine, it would be in the spirit of “Retro” to add their six-penn’orth to the thread.