Has anyone ever seen a machine use the digital root algorithm for verifying a multiplication was done correctly?
The digital root method has the advantage that it is significantly easier to compute than the multiplication itself, but it can’t catch every mistake: any error that changes the result by a multiple of 9 slips through. It’s cheap enough, though, that it might be useful for identifying problems in microprocessors like the VR4300 multiply bug.
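For concreteness, here is a minimal sketch in Python (my choice of language; none of this comes from a real library) of the check. It relies on the casting-out-nines identity: the digital root of a product equals the digital root of the product of the two operands' digital roots.

```python
def digital_root(n: int) -> int:
    """Digital root of a non-negative integer: the repeated digit sum,
    which works out to 1 + (n - 1) % 9 for n > 0."""
    return 0 if n == 0 else 1 + (n - 1) % 9

def check_product(a: int, b: int, claimed: int) -> bool:
    """Casting-out-nines check: necessary but not sufficient.
    A False result proves the claimed product is wrong."""
    return digital_root(digital_root(a) * digital_root(b)) == digital_root(claimed)

assert check_product(123, 456, 123 * 456)           # correct product passes
assert not check_product(123, 456, 123 * 456 + 1)   # off-by-one is caught
assert check_product(123, 456, 123 * 456 + 9)       # error of 9 slips through
```

The last line is the blind spot mentioned above: roughly one in nine random errors will pass the check anyway.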
I don’t believe I have seen this. A related anecdote from the 1960s concerns the ILLIAC II:
a subtle design bug in the arithmetic unit, (-2) * (-2) giving (-4), escaped detection by the tests using pseudo-random operands but was caught after about nine months by a numerical double-check built into a user’s program
That’s about how long the Pentium FDIV bug took to be discovered.
I’d expect negative numbers not to be as well tested in general. But it’s interesting to think about how we might do the best possible job of testing with the least amount of effort; a sketch of one cheap approach follows.
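As an illustration only (hw_mul is a hypothetical stand-in for the multiplier under test, with a fault injected to mimic the ILLIAC II anecdote), sweeping all pairs of a handful of boundary operands costs a few dozen multiplies and immediately hits the case that pseudo-random operands missed for months:

```python
import itertools

def hw_mul(a: int, b: int) -> int:
    """Hypothetical multiplier under test, with a fault injected at
    (-2) * (-2) to mimic the ILLIAC II anecdote."""
    if a == -2 and b == -2:
        return -4  # injected fault
    return a * b

# Boundary operands: zero, units, sign boundaries, 32-bit extremes.
EDGES = [0, 1, -1, 2, -2, 2**31 - 1, -(2**31)]

for a, b in itertools.product(EDGES, repeat=2):
    if hw_mul(a, b) != a * b:
        print(f"mismatch: {a} * {b} -> {hw_mul(a, b)}, expected {a * b}")
```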
I think we can assume packed or unpacked binary-coded decimal, so 4 or 8 bits per digit. It’s multiplication of extremely large integers that I’m really concerned about, more so than anything 64-bit or smaller. That has more potential to be messed up by cosmic rays or whatever, though it also tends to depend on the low-level hardware multiplier working correctly.
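Here is a sketch of how cheap the check is in that representation, assuming packed BCD with two decimal digits per byte (the function and layout are my assumptions, not an established API). A single pass of narrow additions suffices, with no wide arithmetic at all:

```python
def bcd_digital_root(bcd: bytes) -> int:
    """Digital root of a large packed-BCD integer (two digits per byte),
    computed as a running digit sum reduced mod 9 at the end."""
    s = 0
    for byte in bcd:
        s += (byte >> 4) + (byte & 0x0F)  # high-nibble digit + low-nibble digit
    return 0 if s == 0 else 1 + (s - 1) % 9

# 56088 packed as 05 60 88: digits 0,5,6,0,8,8, digit sum 27, root 9
assert bcd_digital_root(bytes([0x05, 0x60, 0x88])) == 9
```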
We could also precompute the digital roots of the first n integers in an array for faster lookup.
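A sketch of that table (the size N here is an arbitrary assumption). Since digital_root(n) = 1 + (n - 1) % 9 for n > 0, the table only trades a modulo for an indexed load:

```python
N = 10_000  # arbitrary table size
DR = bytes(0 if k == 0 else 1 + (k - 1) % 9 for k in range(N))  # roots of 0..N-1

def digital_root_lut(n: int) -> int:
    """Table lookup for small n, closed form as a fallback."""
    return DR[n] if n < N else 1 + (n - 1) % 9

assert digital_root_lut(456) == 6
assert digital_root_lut(123456789) == 9
```

For the packed-BCD case, a 256-entry table of per-byte digit sums might be the more natural variant of the same idea.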