Just to clarify again where I was heading with this, here’s the original question I was responding to.
I’m assuming two things in this scenario.
First, modern, rather than 1980s, technology is available. Therefore transistors, RAM, etc. are cheap relative to the 1980s: it’s no problem having an ASIC or FPGA with a hundred thousand or more gates in it. Reduction in size is driven only by verification needs, not by lack of technology or production cost. (I.e., if you go with fewer transistors on a chip, that’s primarily because it’s easier to decap and verify random samples of what you’ve bought.)
Second, since verification is the main aim, simpler software and hardware are preferred even when that means more of them. If a program is simplified by using registers and RAM of 60-bit words rather than 20-bit words, that’s preferable even if you end up needing 2-3 times the number of gates/bits for registers and RAM, because it’s easier to verify larger but simpler structures than smaller ones that save space and gates at the cost of more complex logic.
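To put rough numbers on that tradeoff, here’s a quick back-of-envelope sketch. The register count (16) is an assumption of mine, not from any particular design, and this counts only raw storage bits, ignoring the extra sequencing/carry logic the narrow-word machine would need:

```python
# Back-of-envelope comparison of register-file storage cost:
# wide, simple words vs. narrow words plus multi-word logic.
# The register count is an illustrative assumption.

NUM_REGISTERS = 16

def register_bits(word_size, num_registers=NUM_REGISTERS):
    """Raw storage bits needed for the register file."""
    return word_size * num_registers

wide = register_bits(60)    # one 60-bit word holds a full value directly
narrow = register_bits(20)  # a 60-bit value needs three 20-bit words,
                            # chained together by extra control logic

print(wide, narrow, wide / narrow)  # 960 320 3.0
```

So the wide machine spends 3x the storage bits, but each register is a single flat structure you can inspect and verify, rather than three words plus the logic that glues them together.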
That’s why I took the approach of going back to expensive 1960s designs rather than cheaper 1980s designs. Especially in the microprocessor world of the '70s and '80s, the main thrust was to reduce the number of gates compared to the million-dollar computers of the 1960s, even at the cost of additional complexity. If you’re willing to go back to those 1960s-style architectures (because gates are now so cheap it makes little difference; in essence, everybody can now afford a 1960s-style “million dollar computer,” and it fits on a single 10×10 cm board with three or four chips), you can remove some of the issues that microprocessor systems had to deal with, such as small registers and limited address space and memory. And you’d certainly want to avoid having several computers of different word sizes for different purposes just to save on costs: it’s better to keep things simple by using the same 60- or 64-bit CPU for everything than to have separate 18-, 24-, or 36-bit CPUs for some purposes.
With the ability to put large numbers of gates on a single chip, things like TTL fan-out limits no longer matter: a standard computer would probably be just one large, very wide but relatively simple CPU chip, plus one static RAM and a ROM or some other persistent store whose contents could be loaded into that RAM on boot. Basically, exactly how we build our simpler “retro” single-board computers today. (Think, e.g., Grant Searle’s designs.)
There are various issues with this approach, of course, but I think this alternative to '80s-style microprocessor systems is reasonable to consider.