Project Xanadu - 1984 design documents (originally secret)

From 1984, a scan of the secret (“burn before reading”) documents from Project Xanadu:

There’s a remastered version of the docs online here:

As a quick intro or reminder, Wikipedia says:

Project Xanadu was the first hypertext project, founded in 1960 by Ted Nelson. Administrators of Project Xanadu have declared it superior to the World Wide Web…

Wired magazine’s “The Curse of Xanadu” called it “the longest-running vaporware story in the history of the computer industry”. The first attempt at implementation began in 1960, but it was not until 1998 that an incomplete implementation was released. A version described as “a working deliverable”, OpenXanadu, was made available in 2014.

Found linked in an archive of Chip Morningstar’s blog Habitat Chronicles:

Things You Find While Cleaning Your Office

I was going through a bunch of old papers … and I came across a document I had written for Xanadu back in 1984. This was a tome documenting the various Xanadu data structures. It was written as part of a deal we were doing to try to get some funding from the System Development Foundation. This was back in the day when we were still stupid about intellectual property and regarded all our secret knowledge as highly proprietary magic to be guarded jealously.

As linked in this HN discussion about this essay on the topic:

Also linked there, another take on the sorry story


Re. the Finalist 12 article:

When I read the part about “Mark Miller from Xerox Parc wanted to write Xanadu in Smalltalk,” I had to laugh, because that sounds just like something I would’ve wanted to do. :laughing: Only thing was, I had just graduated from high school when that part of the story was happening, and I didn’t know Smalltalk existed. :slightly_smiling_face:

The part that didn’t make sense to me was Xanadu being written on “PCs.” I guess they used Smalltalk/V from Digitalk? I was under the impression that a full Smalltalk system needed at least 1 MB of memory, and even then, not much could be accomplished with that. There was a 286 version of Smalltalk/V by 1988. It’d be interesting to know how much memory was added to their PCs. I guess to speed it up, they translated Smalltalk to C++ and compiled that. I had trouble understanding, though, why the C++ compile took a week. The only thing I can compare it to is compiling MiNT kernel updates on my 16 MHz Atari Mega STe in 1993. I had MiNT and GCC set up, running on a hard drive. The kernel took an afternoon to compile, and I think the binary took up 256–384K of memory. A C++ compile would’ve been slower, for sure, and 286s were likely running at half the speed of my STe, but it makes me wonder, “How big was this thing?”

I take Gwern’s point that Nelson thought of all this stuff before it had been practically tried out, and so there were bound to be some problems with it. Nelson tried to do “the whole thing,” when it would’ve been better to do trials of pieces of it, to see how well they worked out.

When I read over Nelson’s vision, I saw one big problem: the only way he could have made it work was if the entire network, the entire Xanadu system, was proprietary, mainly because of the tracking issues with copyrights and enforcing payments. I’ve tried to think about what would need to be done to make something like it work on the internet. The only thing that came to mind was to use some kind of DRM and turn Xanadu functions into network protocols, but then the issue would be how to attract people enough that they’d prefer it over the web.
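
To make the tracking problem concrete: in Nelson’s design, a quotation is never a copy but a transclusion, a pointer to a span of the original document, so every rendering of the quote can be metered and credited to the source. Here’s a minimal sketch of that idea; all the names and structures are hypothetical, not taken from any actual Xanadu code:

```cpp
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <map>
#include <string>
#include <vector>

// Hypothetical sketch: in Xanadu-style transclusion, a quote is a pointer
// into the source document, never a copy, so each rendering can be metered.
struct Span {
    std::string docId;   // permanent address of the source document
    std::size_t offset;  // start of the quoted region
    std::size_t length;  // length of the quoted region
};

// A "document" is a sequence of literal runs and transcluded spans.
struct Piece {
    bool isTransclusion;
    std::string literal; // used when isTransclusion == false
    Span span;           // used when isTransclusion == true
};

int main() {
    std::map<std::string, std::string> store = {
        {"doc:original", "Everything is deeply intertwingled."}
    };
    std::map<std::string, std::uint64_t> royaltyMeter; // views per source

    std::vector<Piece> quotingDoc = {
        {false, "Nelson wrote: \"", {}},
        {true,  "", {"doc:original", 0, 35}},
        {false, "\"\n", {}},
    };

    // Rendering resolves each span against the source and meters the view;
    // this is where a payment or DRM hook would live in a real system.
    for (const auto& p : quotingDoc) {
        if (p.isTransclusion) {
            std::cout << store[p.span.docId].substr(p.span.offset, p.span.length);
            ++royaltyMeter[p.span.docId];
        } else {
            std::cout << p.literal;
        }
    }
    std::cout << "views owed to doc:original: "
              << royaltyMeter["doc:original"] << "\n";
}
```

The metering only works if every node in the network honors it, which is why the scheme seems to push toward a proprietary network or DRM, as noted above.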

I sort of agree with the criticism of the side-by-side reading scheme Nelson wanted to use. The only time I felt I would’ve liked it was when I was reading a computer history book called The Dream Machine, by Waldrop. It was rather frustrating to read, especially when I was using it as a research source, because in many of the chapters Waldrop had you “rewind” the clock back about a decade. The feeling was like, “Okay, I know I’ve gotten you from 1960 to 1970 with this one set of characters and projects, but let’s go back to 1960, because I have this other stuff to tell you.” I saw why he was doing it: intermixing the stories in book format would’ve been a mish-mash, dragging the reader back and forth between the different research projects that were going on simultaneously. But when I was using it for research, I would’ve preferred to have the parallel stories side-by-side, because I was fine with forming my own narrative out of the source material. I could line up events as I felt appropriate for what I wanted to convey, and reading the stories side-by-side, in chronological order, would’ve made that more efficient.

I have sometimes thought that while Ted Nelson, Doug Engelbart, and Alan Kay have deeply wanted their thought processes about content to become mainstream in society, the only places where using them would feel practical are universities. Though I was struck that when Google promoted its Wave collaboration platform, a service that showed some glimmers of Doug Engelbart’s vision, university faculty didn’t understand the first thing about how to use it, and avoided it. The students weren’t much better, just using it like a chat app, which professors felt distracted from their schoolwork. Google spent much of its promotion effort for Wave at universities, on the same idea that while the general public wouldn’t find it useful, universities were fertile ground. That turned out to be a bust as well, and Wave was cancelled in 2010.

It was interesting to read that there were some early prototypes of Xanadu produced and demonstrated where Gwern could see them, and that the early versions were not practical for use. He said the text was indecipherable because the display resolution was too low.

I very much like Nelson’s idea of two-way links, particularly for open source texts. I can see where this would be impractical for commercial UIs, though.
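
For anyone unfamiliar with the mechanics: the difference from web links is that Xanadu-style links live in a registry outside the documents and are indexed by both endpoints, so a document can always enumerate the links pointing at it. A rough sketch of the idea (hypothetical code, not Xanadu’s):

```cpp
#include <iostream>
#include <map>
#include <string>
#include <utility>
#include <vector>

// Minimal sketch (not real Xanadu code) of the idea behind two-way links:
// links live in a registry outside the documents, indexed by BOTH endpoints,
// so a document can always ask "who links to me?", the query the one-way
// web can only answer by crawling.
class LinkRegistry {
    std::multimap<std::string, int> byEndpoint_; // endpoint -> link ids
    std::vector<std::pair<std::string, std::string>> links_;
public:
    int link(const std::string& from, const std::string& to) {
        int id = static_cast<int>(links_.size());
        links_.push_back({from, to});
        byEndpoint_.insert({from, id}); // registered under both ends,
        byEndpoint_.insert({to, id});   // which is what makes it two-way
        return id;
    }
    void show(const std::string& doc) const {
        auto range = byEndpoint_.equal_range(doc);
        for (auto it = range.first; it != range.second; ++it) {
            const auto& l = links_[it->second];
            std::cout << doc << " participates in: " << l.first
                      << " <-> " << l.second << "\n";
        }
    }
};

int main() {
    LinkRegistry reg;
    reg.link("essay.txt", "rebuttal.txt");
    reg.link("essay.txt", "sources.txt");
    reg.show("rebuttal.txt"); // the *target* can enumerate inbound links too
}
```

This also shows why it suits open source texts better than commercial UIs: somebody has to host and maintain the shared registry, which is easy within one open corpus and hard across competing publishers.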


Back then it was assumed that what would become the internet would have clean and safe links between databases. Does “impractical” mean you cannot post unwanted ads everywhere, or hijack internet connections? Charges other than those for accessing text or services would most likely have been regulated in some way, like the BBC TV service. The current internet model seems more like US TV, where ads pay for the services.

A shortage of memory and very slow compiles sound to me like they might involve a great amount of paging in and out. Remember, virtual memory and swap files are both a blessing and a curse… that is, supposing we’re talking about an OS that can do that.

The Smalltalk version of Xanadu was written using ParcPlace Smalltalk (currently Cincom VisualWorks), which needed workstations to run.


Thanks for that clarification. Gwern was saying that the only source code that was open-sourced was the C++ output (from a Smalltalk-to-C++ compiler written as part of the Xanadu project).
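
As an aside on why such output might be slow to compile: mechanically translated Smalltalk tends to reify every object and message send, so even tiny methods expand into a lot of C++. This is purely illustrative, not taken from the real Udanax sources:

```cpp
#include <cstring>
#include <iostream>
#include <memory>

// Illustrative only: the *kind* of C++ a naive Smalltalk translator emits.
// Every value becomes a heap-allocated object and every message send a
// dynamic dispatch, so a modest Smalltalk image can expand into a very
// large C++ program; one reason a late-1980s compile could crawl.
struct Object {
    virtual ~Object() = default;
    // generic message send: "+", "printOn:", etc. all funnel through here
    virtual std::shared_ptr<Object> perform(const char* selector,
                                            std::shared_ptr<Object> arg) = 0;
};

struct SmallInteger : Object {
    long value;
    explicit SmallInteger(long v) : value(v) {}
    std::shared_ptr<Object> perform(const char* selector,
                                    std::shared_ptr<Object> arg) override {
        // a real translator would emit selector tables; this is just the flavor
        if (std::strcmp(selector, "+") == 0) {
            auto other = std::static_pointer_cast<SmallInteger>(arg);
            return std::make_shared<SmallInteger>(value + other->value);
        }
        return nullptr; // doesNotUnderstand: elided
    }
};

// Translation of a one-line Smalltalk method:  increment  ^count + 1
std::shared_ptr<Object> Counter_increment(std::shared_ptr<Object> count) {
    return count->perform("+", std::make_shared<SmallInteger>(1));
}

int main() {
    auto result = Counter_increment(std::make_shared<SmallInteger>(41));
    std::cout << std::static_pointer_cast<SmallInteger>(result)->value << "\n"; // 42
}
```

If the open-sourced output looks anything like this kind of expansion, that could also help explain the week-long compiles mentioned above.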

The web page for the project says the source code was stored at a site called udanax.com (Xanadu spelled backwards), but when I try to go there, I get a message saying “the domain is available,” which I take to mean the site no longer exists.

Looking at Ted Nelson’s personal web pages, it seems he prefers to convey his ideas through concepts rather than source code (not besmirching that, just noting it), which perhaps conveys some open-mindedness about how those ideas might be implemented.

Looking at when the Xanadu team actually started on it… it wasn’t when I was in high school, but probably 5th grade…

A note about the articles. I’ve seen an annoying propensity in computer histories of obscure technologies, particularly ones written by younger folk: they offer a few valuable insights through their deep dives into the facts, but fall into judging the work by current standards, and inevitably end up calling the people they profile “crackpots.” Every time I see that, I think they’ve done very little work to understand the people profiled and the context in which they came up with their ideas, except from a techno-centric context (and a present-day one at that, which is itself a problem, because that wasn’t the context the person was working in). We don’t have to accept the person’s belief system whole, but perhaps that belief system could be appreciated for its desire to be helpful to humanity, particularly given the context in which it arose.

In the case of Ted Nelson, one thing that inevitably comes to my mind is that while he was determined to see his vision come to reality, something about his eccentricity got in the way, and you get a sense of that from the Finalist 12 article. So, while I can fault Nelson for that, I think it’s important to create some distance between the person and their ideas, because even though the person’s proclivities may rub one the wrong way, that doesn’t mean their ideas are bad, just perhaps badly implemented. It seemed to me that what Nelson’s vision suffered from was not that he is a “crackpot,” but that he has trouble with practicality (which deserves criticism, though not wholesale dismissal). If that makes him a crackpot, then a whole lot of artistic types I’ve seen in my life would fall into the same bucket. Somehow, that strikes me as painting with too broad a brush.

Something I’ve appreciated about Alan Kay is that he spent a lot of time trying to formulate ideas about education that feed into what he wanted to see happen with personal computing. He realized he had put the cart before the horse by focusing on personal computing first, and needed to take the opposite approach. Though once he got into that, he realized that even there he needed to step further back and focus on some problems in the approach of education systems, problems which have been pretty ingrained.


Still available via the Wayback Machine, although it looks to have been “derelict” for some time.
