Emerald Bay - the forgotten dBase killer

Does anyone remember Emerald Bay, the database that was meant to be the next big thing after dBase? It came out in 1988 (I think) and was written by Wayne Ratliff, the creator of dBase, and sold by Migent.

I’ve just rediscovered an application I wrote using the Eagle client front end back in 1988. It still runs under DOSBox because I have the Emerald Bay database engine runtime, but I’ve lost the rest of the software. So far I’ve been able to find very little info online, other than an Emerald Bay engine/server manual selling for far too much money. I’d love to get a full copy of the software and manuals, but can’t find them anywhere.

I remember seeing mentions of it in magazines at the time, but never saw or used it, or knew anyone who had.

That’s a call from the past. Yes, I used Emerald Bay in …1993 with Vulcan.
We were testing it for a systems migration from a mainframe to PCs, so I wrote code to read a data dictionary and port the fixed-width files to xBase files, and also wrote a query-builder tool. It did look promising at the time, but DBXL/Quicksilver and FoxPro were better in the long run, and we never did the migration in the end.

I was a beta user from ’88 to ’91 and used it for a variety of tasks: a huge subscriber list, a program to keep track of repairs, parts lists with their locations, etc. I also did a program for a non-profit that held every copy of its correspondence and its checkbook, all money in or out. It printed all the checks, the whole shebang! At the end, we found the “books” were off by 67 cents, out of (something like) $670,000.
The links, the indexes, the size of the databases: limited only by the disk it all sat on; the indexes, the same.
I was running it on an XT in my office, with 286s also in use.
Taking care of the subscription db and adding subscribers, I could just put in a postal code and instantly (as soon as I hit Enter) it would fill in the city, state, and country, on an XT!
The programming language, Vulcan (Ratliff seemed to really like that name!), was easy and straightforward to use. C was even faster, though a bit harder to deal with for some things!
I had to call a friend (he taught C at USC) to figure out how to make a pointer to a pointer, to a pointer, to a pointer, to (phew!) the final pointer to the data. Five pointers to get to my data! Fast, once we got the compiler to digest it all!
I got sorta distracted by Desert Storm. When I got back, it seemed to have disappeared! :sob:
I was wanting a version for Linux.
My docs for it have long since disappeared, unfortunately :sleepy:.
Just to help you understand, I’ll explain it as best I can! You start out with the database file, which holds the tables (these contain the actual data). Each table is formatted to store the different fields for that table. The index files specified which tables to link together so the index could be run against the data. You could have different indexes for different uses, and you could link the various indexes, applying each of them to isolate the desired subset of the data for your query.
It may sound convoluted, but it was very fast and flexible!
I had programmed with dBase II through IV. They were good for the time, but Emerald Bay was much more flexible than dBase.
I tried out SQL and found it very klutzy. Haven’t done any database work since!

Well, this is fun. I’ve now found my review of Emerald Bay, written back in 1988. But I’m now confused about where it was published. The catchline of the text says ‘PCA’ for the title, and other files in the PCA folder refer to ‘PC Amstrad’. I have no recollection of writing for that magazine and am having difficulty finding references to it on the interwebz. Memory is so fickle.

Oh please share it! I suppose you’ve searched for key phrases in case there’s a scanned copy somewhere?

Okay, the review, plus all the program files I have, are now up on GitHub.

I’m writing an article about this for my Medium publication (you’ll need a Medium sub to read it because that’s how I afford food). The article should be out within a week.
