For a tangible example, take my PET online emulator: I’m pretty happy with its internal consistency (the memory accesses of the CPU T-phases are consistent with screen memory access, so it runs the newest “racing the PETSCII character” high-res demos and even shows memory interference on the screen, which is good enough for me), but externally I actually got frightened. Here’s the story:
As you may know, the PET has neither a sound chip nor even a speaker, but you can set a timer to “free running” mode, which will shift a bit pattern out on an exposed output pin (on the user port). This produces pulse sound at about 1 MHz. Down-sampling this with internal consistency on a modern machine is no problem. So we end up with samples in high-res digital audio at 48 kHz, which we must provide to a playback callback of the browser.
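To illustrate the down-sampling step, here is a minimal sketch of turning a ~1 MHz one-bit pulse stream into 48 kHz audio by box-car averaging. The names and the averaging approach are mine for illustration; they are not taken from the actual emulator.

```javascript
// Sketch: downsample a ~1 MHz one-bit pulse stream to 48 kHz by averaging.
// All identifiers here are illustrative, not the emulator's actual code.
const SRC_RATE = 1_000_000;        // user-port pulse rate, ~1 MHz
const DST_RATE = 48_000;           // browser audio sample rate
const RATIO = SRC_RATE / DST_RATE; // ≈ 20.83 source ticks per output sample

function downsample(pulses) {
  // pulses: array of 0/1 values sampled at SRC_RATE
  const out = [];
  let acc = 0;    // sum of pulse levels in the current window
  let count = 0;  // ticks in the current window
  let phase = 0;  // fractional position within the resampling period
  for (const p of pulses) {
    acc += p;
    count++;
    phase++;
    if (phase >= RATIO) {
      phase -= RATIO; // carry the fractional remainder to the next window
      // emit the mean level of this window, mapped to [-1, 1]
      out.push((acc / count) * 2 - 1);
      acc = 0;
      count = 0;
    }
  }
  return out;
}
```

Because the ratio is not an integer, the window length alternates between 20 and 21 ticks, which keeps the output rate exact on average.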
Fine. So far, so good. The way to do this is to write the samples to a ring buffer and expose a portion of it to the audio callback. So, how long should that ring buffer be? One PET frame samples down to 800 modern audio samples, so 2K to be on the safe side? I wrote and tested this first on an older (actually, by now rather old) machine and an older OS, because if it works there, it will work everywhere, right? It worked well, so I eventually moved to a new machine with a current OS and a current browser, and the ring buffer immediately ran into a tail crash. Well, funny. So, double the buffer size? Triple it? With 32K it still ran into crashes with a demo (midway into “Back To The PET”). Keep in mind that this is a demo running on its own, deterministically, without any user interaction. It turns out we need 64K to avoid any crashes.
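The producer/consumer setup described above can be sketched roughly like this: the emulator writes, the audio callback reads, and a “tail crash” is the writer catching up with the reader’s tail. This is a minimal sketch under my own naming; the actual emulator’s implementation will differ.

```javascript
// Sketch of the ring buffer between emulator (writer) and audio callback
// (reader). Names and error handling are illustrative assumptions.
class RingBuffer {
  constructor(size) {
    this.buf = new Float32Array(size);
    this.size = size;
    this.writePos = 0; // where the emulator writes next
    this.readPos = 0;  // where the audio callback reads next
  }
  available() { // samples ready for playback
    return (this.writePos - this.readPos + this.size) % this.size;
  }
  free() { // space left before the writer hits the reader's tail
    return this.size - 1 - this.available();
  }
  write(samples) {
    for (const s of samples) {
      if (this.free() === 0) throw new Error('overrun: writer hit the tail');
      this.buf[this.writePos] = s;
      this.writePos = (this.writePos + 1) % this.size;
    }
  }
  read(out) { // fill the callback's buffer; pad with silence on underrun
    for (let i = 0; i < out.length; i++) {
      if (this.available() === 0) { out[i] = 0; continue; }
      out[i] = this.buf[this.readPos];
      this.readPos = (this.readPos + 1) % this.size;
    }
  }
}
```

The sizing question is exactly the tension here: too small and the writer overruns the tail, too big and the emulator’s audio runs far ahead of what the listener hears.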
Mind that 32 × 1024 / 48,000 ≈ 0.68 seconds! At this point, the internal scheduling was more than half a second ahead of playback! I checked what other emulators were doing; e.g., Beeb.js uses the same buffer size, probably for a reason.
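The arithmetic behind that figure is just buffer occupancy divided by sample rate; a tiny helper (my own naming, for illustration) makes the lead times explicit:

```javascript
// How far ahead of playback a given buffer fill runs, at 48 kHz.
const latencySeconds = (samples, rate = 48_000) => samples / rate;

// 32K samples buffered → ~0.68 s of scheduling lead
// 64K samples buffered → ~1.37 s
// one PET frame (800 samples) → 1/60 s
```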
This really changed how I think about these things.
PS: Resistance is futile, the Borg have already won.