And BASIC let you use inexpensive cassette tape, not a floppy drive.
My first two computers both used cassette tape initially, and it had the great merit of being cheap and available. They both used some variant of the Kansas City standard, which uses two tones and is relatively robust and portable. But it’s not terribly dense, so saving and loading is not terribly fast. In my case, with a simple cheap commercial radio cassette, there was no random access either, but when all you’re doing is saving and loading typed-in BASIC programs, that doesn’t feel like a limitation.
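For the curious, the two-tone scheme is simple enough to sketch in a few lines of Python. This follows the original 300-baud Kansas City framing (a ‘0’ is four cycles of 1200 Hz, a ‘1’ is eight cycles of 2400 Hz, one start bit, two stop bits); the 48 kHz sample rate is an arbitrary choice of mine:

```python
import math

SAMPLE_RATE = 48000
BAUD = 300                  # the original KCS rate
FREQ = {0: 1200, 1: 2400}   # '0' = 4 cycles of 1200 Hz, '1' = 8 cycles of 2400 Hz

def encode_bit(bit):
    """One bit period of the appropriate tone."""
    n = SAMPLE_RATE // BAUD  # samples per bit (160 at 48 kHz)
    f = FREQ[bit]
    return [math.sin(2 * math.pi * f * i / SAMPLE_RATE) for i in range(n)]

def encode_byte(byte):
    """Frame a byte KCS-style: start bit (0), 8 data bits LSB first, two stop bits (1)."""
    bits = [0] + [(byte >> i) & 1 for i in range(8)] + [1, 1]
    samples = []
    for b in bits:
        samples.extend(encode_bit(b))
    return samples

samples = encode_byte(ord('A'))  # 11 bit periods of audio
```

At 11 bit periods per byte, 300 baud works out to roughly 27 bytes per second, which is why loading anything substantial took minutes.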
Both the Compukit and the BBC Micro could in principle be upgraded to floppy disk: you needed to add a ROM to handle the boot sector or the whole filing system, you needed to add a disk interface, and you needed a floppy drive. All of that would end up costing considerably more than the machine itself had: in one case I didn’t do it, and in the other case I did.
But back to tape: there were random-access versions of cassette tape drives, and there were the endless-loop types of systems, which could at least always read any given block, but at the cost of waiting up to a full circuit of the loop.
Some systems didn’t use tones but pulses, and if they bit-banged their interface, as I think both Sinclair and Commodore did, there was the possibility of pushing the protocol to a denser and therefore faster encoding. At the cost, I would think, of reliability and of compatibility with other tape units.
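The pulse idea is easy to sketch: a bit-banged interface just times the gaps between edges, with a short pulse reading as 0 and a long one as 1. The timings and threshold below are invented for illustration, not any real machine’s format:

```python
def decode_pulses(pulse_lengths_us, threshold_us=500):
    # A bit-banged loader measures each pulse with a software timing loop;
    # anything under the threshold is a 0, anything over it is a 1.
    # (Threshold and pulse lengths here are made up for illustration.)
    return [1 if p > threshold_us else 0 for p in pulse_lengths_us]

bits = decode_pulses([300, 700, 320, 680])  # -> [0, 1, 0, 1]
```

Shrinking the pulse lengths raises the data rate, right up until tape speed variation makes short and long pulses indistinguishable, which is where the reliability trade-off comes in.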
And let’s not forget, plenty of commercial software was distributed on cassette tape, even if it did take many minutes to load.
I wrote a pretty large program in BASIC on a Dragon 32 (on behalf of someone who needed some software to create football betting systems based on certain criteria - what you wanted to optimise for etc.) So after each update I stored the source code on cassette tape. The cassette storage was so horribly unreliable that I always wrote three copies of the source to a tape, on three cassettes, i.e. nine copies. Only then could I be reasonably sure that I could get a readable copy the next day.
After that exercise (large program, stored on cassette) I never wanted to go back to (that type of) BASIC or cassette storage. Never again - and I never regretted that decision!
Agreed, reliability is the Achilles heel. Some systems did better than others. I’m pretty sure one system would save data twice as the default action. In the case of Acorn, tape saves had a block structure, with each block having an ID and a CRC. You could at least tell which block was broken, and if you had multiple copies, the OS would read only the next block ID it needed, skipping the others. I don’t know of any system which applied any forward error correction - perhaps it was too expensive to compute on an 8-bit CPU.
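The per-block check is cheap enough to show in full. The polynomial below is the common CRC-16/CCITT in its zero-initialised (XMODEM) flavour; take the exact parameters and framing as illustrative rather than as Acorn’s actual format:

```python
def crc16_ccitt(data, crc=0):
    # Bitwise CRC-16 with polynomial 0x1021 -- the kind of check an
    # 8-bit OS could afford to run over each tape block.
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) if (crc & 0x8000) else (crc << 1)
            crc &= 0xFFFF
    return crc

def first_good_copy(copies, expected_crc):
    # Given several saved copies of the same block, return the first
    # whose CRC matches -- the "write it three times" strategy in code.
    for block in copies:
        if crc16_ccitt(block) == expected_crc:
            return block
    return None

good = b'10 PRINT "HELLO"'
bad = bytearray(good)
bad[3] ^= 0x40  # simulate a dropout flipping one bit
recovered = first_good_copy([bytes(bad), good], crc16_ccitt(good))
```

Note this only *detects* damage; without forward error correction, the only recovery is another copy of the block.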
I did have a bit of a victory recently processing a tape image to patch over a dropout. Helped by the implicit redundancy of using tones for bits.
There were actually some fast and very reliable cassette interfaces, like the Sharp MZ-80’s built-in cassette drive featuring a transfer rate of 1200 bits/sec. (Compare this to the Commodore Datasette 1530, which is rated at 50 bytes per second, about 400 bits/sec.) Because of this, disk drives never enjoyed much popularity on the MZ range, as the cassette’s performance was deemed good enough. Since the Sharp MZs were “clean machines” with just a simple monitor in ROM but no language of any kind, a suitable loading mechanism was a requirement.
Fun fact: the transfer rate is too fast to work reliably with MP3 compressed audio files.
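Back-of-the-envelope, those rates make a real difference. Here’s the arithmetic for a hypothetical 16 KiB program, using raw bit rates only (leaders, inter-block gaps, and framing overhead made real-world loads slower still):

```python
# Rough load times for a 16 KiB program at the two rates mentioned above.
program_bits = 16 * 1024 * 8  # 131072 bits

for name, bits_per_sec in [("Sharp MZ-80 (1200 b/s)", 1200),
                           ("Commodore 1530 (~400 b/s)", 400)]:
    seconds = program_bits / bits_per_sec
    print(f"{name}: {seconds / 60:.1f} minutes")
```

That’s roughly 1.8 minutes versus 5.5 minutes for the same program, which goes a long way toward explaining why MZ owners didn’t feel the pull of floppy drives.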
Before I had enough money for the 5 1/4" floppy hardware, I used a Panasonic cassette recorder with my Apple ][+ for more than a year, and I was impressed with its speed and reliability compared to the TRS-80 Model 1 Level 2 at school. I know, that’s an awfully low bar to clear, but still …
I believe that the Apple was about 1200 bits/sec too. When I finally got the floppy disk, I was in geek heaven, because it was FAST …