I think perhaps I’m still not being clear enough about my “wider words” proposal here. This is much more along the lines of the PDP-10, or perhaps the CDC 6600, than the common byte-oriented architectures of the IBM 360, PDP-11, VAX, and most microprocessors.
Let me start with the simplest example of the kind of simplification I’m talking about.
Byte-oriented and “small-word” machines need to do multi-step, multi-word arithmetic to achieve sufficient precision for scientific applications (such as, I’m assuming, celestial navigation). Here’s an example of adding the first two elements of an array of 2n-bit values on a processor with n-bit words, using little-endian storage:
inputs .const $12, $34, $56, $78
output .const 0, 0 ; two words for the double-wide result
add: load inputs+0 ; least significant word of input 0
add inputs+2 ; add (without carry) to LSW of input 1
store output
load inputs+1 ; MSW of input 0
adc inputs+3 ; add with carry from previous operation
store output+1
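(With these inputs the two double-wide values are $3412 and $7856; their sum is $AC68, so output ends up holding $68 and output+1 holding $AC.)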
So what I’m proposing is to make everything wide enough that you generally wouldn’t need to do multi-word operations; your code would instead look like this:
inputs .const $3412, $7856
output .const 0
add: load inputs
add inputs+1
store output
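Here output simply ends up holding the whole sum, $AC68, after a single add. If it helps to see the same contrast outside of assembly, here’s a rough C sketch of the two approaches (the 8-bit and 16-bit word sizes and the function names are just my own illustration, not anything from a real machine):

#include <stdint.h>

/* Narrow machine: a 16-bit value is kept as two 8-bit words, least
   significant word first, and added with an explicit carry step. */
void add16_on_8bit_words(const uint8_t a[2], const uint8_t b[2], uint8_t out[2])
{
    unsigned sum = a[0] + b[0];            /* add the least significant words */
    out[0] = sum & 0xFF;
    unsigned carry = sum >> 8;             /* carry into the next word        */
    out[1] = (a[1] + b[1] + carry) & 0xFF;
}

/* Wide machine: the value fits in one word, so a single add suffices. */
void add16_on_16bit_words(const uint16_t *a, const uint16_t *b, uint16_t *out)
{
    *out = *a + *b;
}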
In the early days of computing, 36-bit words (10.84 decimal digits) were felt to be the minimum viable size for general scientific computing. (32 bits gives only 9.63 decimal digits.)
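(All of these decimal-digit figures are just bits × log₁₀ 2 ≈ 0.301 digits per bit: 36 × 0.301 ≈ 10.84, 32 × 0.301 ≈ 9.63, and so on.)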
I’m not sure here whether you’re proposing that 20 bits (6.02 decimal digits) is wide enough for most calculations, or whether you’re simply proposing to continue with methods such as my first example above. I’m guessing the latter; even for late '70s microcomputers, Microsoft quickly extended the 24-bit precision (7.22 digits) of their early 8-bit BASICs to 32 bits (9.63 digits).
Seymour Cray went further: when designing the CDC 6600 for the scientific calculations of the 1960s, he felt that 60 bits (18.06 digits) was more appropriate.
Well, none, of course. You’ll note that in my examples above there isn’t even any such thing as a “byte.” But one probably does need to interoperate to some degree with the modern backdoored computers of Earth, if only to exchange email, so it seems likely that you’d want some sort of “byte” concept (perhaps a 16-bit one, large enough to hold a Unicode basic-plane character) to make that easier, assuming that’s what they’re still using in the age of the OP’s story. Bringing the concept of a byte into your own systems might, though, introduce problems similar to the ones I’m trying to avoid above.
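For what it’s worth, here’s one possible sketch of what that interop shim might look like in C: a purely hypothetical helper that packs a stream of Earth-style 8-bit octets (say, an incoming email body) into 16-bit “bytes”, two octets per unit. The packing convention and all the names here are my own invention, just to make the idea concrete:

#include <stdint.h>
#include <stddef.h>

/* Hypothetical interop helper: pack 8-bit octets into 16-bit "bytes",
   two octets per unit, high octet first; an odd trailing octet is
   padded with zero. Returns the number of 16-bit units written. */
size_t pack_octets_to_wide_bytes(const uint8_t *octets, size_t n, uint16_t *out)
{
    size_t written = 0;
    for (size_t i = 0; i < n; i += 2) {
        uint16_t hi = octets[i];
        uint16_t lo = (i + 1 < n) ? octets[i + 1] : 0;  /* zero-pad odd tail */
        out[written++] = (uint16_t)((hi << 8) | lo);
    }
    return written;
}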
I guess the summary is: I’m proposing looking at “retro” computer technology, as the original poster suggested, but a very different kind of retro technology, more like the PDP-10 and the CDC 6600 than the IBM 360, the PDP-11, and their spiritual successors, which include the processors used in '70s and '80s microcomputers.