LYS-16, a 16-bit micro from Sweden, in 1975

From the then-new and feisty Linköping University, based on the multi-chip IMP-16 processor from NatSemi:

[image]

More info in this document (in Swedish, amenable to machine translation):

The whole Lys machine was built in a bulletproof box of iron plate that weighed about 10 kg, because that was cheapest.

To get the assembler started, or to get anything at all running on this machine, we came to the conclusion that this cassette tape business was very nice. We would run 1200 baud into a small card that did FSK demodulation and then on into the CPU, and it was this that was the most troublesome part of all. Floppy disks had not been invented yet; the guys at IBM hadn’t yet left and started Seagate/Shugart.

We wrote to two guys who were in a garage in California, and we got a response from a guy named Wozniak. I still have that letter somewhere, but I couldn’t find it for today.

They kept at it too, but they had bet, we thought, on a hell of a wrong processor. It was one of those 8-bit ones, a 6502 or 6800, that they had bet on, and that was crap (laughter). Here we really had the solution: a 16-bit machine. It was very nice too. An 8 MHz clock, if I don’t misremember, and a cycle time of 5 ns. It was pretty good. Misery, in fact, because at that time our oscilloscopes could only show about 200 ns, so we saw sine waves everywhere. Back then oscilloscopes looked like those submarine periscope things; they were round, took plug-in modules, stood on wheels and were hot as hell.
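For a rough idea of what that “small card with FSK demodulation” had to do, here is a minimal software sketch of two-tone FSK demodulation: for each bit period, correlate the samples against the two tones and pick the stronger one. The tone frequencies, baud rate and framing here are illustrative assumptions, not the LYS-16’s actual cassette format:

```python
import math

def tone_power(chunk, freq, sample_rate):
    """Signal power at `freq`, via correlation with a sine and a cosine."""
    w = 2 * math.pi * freq / sample_rate
    re = sum(s * math.cos(w * n) for n, s in enumerate(chunk))
    im = sum(s * math.sin(w * n) for n, s in enumerate(chunk))
    return re * re + im * im

def demodulate_fsk(samples, sample_rate, baud, tone_0=1200, tone_1=2400):
    """One bit per baud interval: whichever of the two tones is stronger wins."""
    n = int(sample_rate / baud)
    bits = []
    for start in range(0, len(samples) - n + 1, n):
        chunk = samples[start:start + n]
        bits.append(1 if tone_power(chunk, tone_1, sample_rate)
                         > tone_power(chunk, tone_0, sample_rate) else 0)
    return bits

# Quick self-test: one '0' bit followed by one '1' bit.
sr, baud = 48000, 1200
wave  = [math.sin(2 * math.pi * 1200 * t / sr) for t in range(sr // baud)]   # '0'
wave += [math.sin(2 * math.pi * 2400 * t / sr) for t in range(sr // baud)]   # '1'
print(demodulate_fsk(wave, sr, baud))   # expected: [0, 1]
```

The real card presumably did the equivalent with analogue filtering and a comparator, feeding the recovered bit stream on to the CPU as the quote describes.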

The Swedish Wikipedia entry has more detail too (English translation here):

The essential basic idea behind the project is to give the educational system, as well as private individuals, a tool which has the potential to introduce the computer concept in a gentle and effective way without sacrificing its inherent power. A computer in miniature physical format, placed into a system environment that everyone perceives as familiar and natural, gives schools and educational institutions an educational tool that, besides being able to make significant contributions to computation, learning processes and laboratory exercises, can also help dispel the aura of mysticism that has come to surround computers.

And more here too (in Swedish) with photos:
[image]

via this request for a survey of contributions outside of America.


That’s interesting. I always assumed that the oscilloscope was a prerequisite for the invention of the computer. (Which, together with power supplies, gives the early 1930s as the earliest possible date.) However, they managed to do it without one (or without meaningful resolution). How do you manage this without seeing the signal, especially if you’re advancing into (timing) domains that are totally new to you?

Interesting question. I’d say a 'scope is a diagnostic tool. You use it to make observations, you challenge the hypotheses you’ve built by striving to form a mental model of the system, and (with luck) you home in on conclusions which lead to further experiments, which lead to the system working.

An imperfect scope is better than no scope! A logic probe or frequency counter can be very useful, and you can rig up circuits to estimate things like mark-space ratios, or to record the presence or absence of combinations of events.
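As a purely illustrative software analogue of that kind of rig (read_pin() here is a hypothetical digital-input read, not any particular library’s API): sample the signal many times, and the fraction of high readings estimates the mark-space ratio.

```python
import random

def read_pin():
    """Hypothetical digital input; fakes a signal that is high about 30%
    of the time so the sketch runs on its own."""
    return 1 if random.random() < 0.3 else 0

def estimate_duty_cycle(n_samples=10_000):
    """Fraction of samples seen high approximates the mark-space ratio,
    provided the sampling isn't correlated with the waveform itself."""
    return sum(read_pin() for _ in range(n_samples)) / n_samples

print(f"estimated duty cycle: {estimate_duty_cycle():.2f}")   # ~0.30
```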

It helps if your system is made of simpler parts, because you have more visibility of internals.

It’s impossible to proceed without some theory of operation, and I think that’s one thing which sets successful builders apart. Successful pioneers are rare, but they have the advantage of having understood something by inventing it.

I’m tempted to conclude that a 'scope is useful, but not essential.

I’m especially thinking about ripples and noise. Where and when is your viable signal? You can probably do without an oscilloscope by experience, some rules of thumb and defensive design. But if you’re progressing into new territory, what may have always worked at, say, 0.5 MHz may be out of spec at 8 MHz. Also, your wired prototype setup is probably much noisier than a PCB. I think it’s quite a feat.

I guess the story somewhat proves your point. :slight_smile:

I think we might see survivorship bias: people or teams who failed to get their computers working are not well-known.

But also, I think a solid understanding of principles and operations goes a long way. The engineers of the 50s, I might argue - certainly the very best of them - would have been very good at thinking things through. Conservative design would absolutely have been normal (for survivors!)

And so, perhaps, also in the 70s. Endlessly tinkering with something which doesn’t quite work is a luxury for the amateur to indulge in: figuring out the machinery and building it with margin is a quite different process. (This is even more clear with software: if it takes a day to turn around a batch job, you work a lot harder on getting your code right.)

My education in electronics, such as it was, was focused on synchronous digital design. In that discipline, waveforms are not so important, except for the clock. Preferably just one clock, and no strobes. Logic gates have thresholds which insulate against noise, up until the point where they don’t.

I’ve a feeling that the desire to see square waveforms without ringing causes some people to greatly overspecify their 'scope. You need the clock to deliver regular edges, and you need logic signals to cross a threshold for the last time just before the clock arrives. In time and in space, synchronous digital design is quite forgiving.
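To put numbers on “cross a threshold for the last time just before the clock arrives”: a back-of-the-envelope synchronous timing budget at the 8 MHz quoted for the LYS-16, with made-up but plausible TTL-era delays (the individual figures are assumptions, not measurements of the real machine):

```python
clock_hz  = 8_000_000          # 8 MHz, as quoted in the interview above
period_ns = 1e9 / clock_hz     # 125 ns available per cycle

t_clk_to_q = 25   # ns, register output settles after the clock edge (assumed)
t_logic    = 60   # ns, worst-case path through the combinational logic (assumed)
t_setup    = 20   # ns, data must be stable this long before the next edge (assumed)

slack = period_ns - (t_clk_to_q + t_logic + t_setup)
print(f"period {period_ns:.0f} ns, slack {slack:.0f} ns")   # period 125 ns, slack 20 ns
```

As long as the slack on the worst-case path stays positive, the exact shape of the waveform is irrelevant; the signal only has to settle past the logic threshold in time for the next edge.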

But saying all this I must admit I’m not terribly experienced at the practical side!

And yes, getting the LYS-16 working must have been quite a feat.

Well said!

In my understanding, engineering is very much about an intuitive grasp of that particular layer where abstract building blocks meet material reality. (I may be a bit too theoretical for this myself, I admit. And this is also where I have a bit of a problem with modern job titles, where everyone dealing with software is already an engineer.) This includes an understanding of constraints and tolerances and how to work with them. While the state of measurement defines the standards, you may still have to think a bit ahead of it in order to survive with a robust design. I guess experience helps quite a lot… (However, so I was told, in the British ideal this is achieved by effortless genius anyway, while in the German tradition it is done by thorough over-engineering, in order to introduce as many breaking points as possible. I’m not sure where the Swedish tradition stands. :slight_smile: )

P.S., speaking of ideals, we may add the definition coined by the Feilden Committee on Engineering Design in 1963:

Engineering design is the use of scientific principles, technical information and imagination in the definition of a structure, machine or system to perform specified functions with maximum economy and efficiency.


I do like the inclusion of ‘imagination’ in that!

In theory they could also downclock it to make sure that all of the glue logic is doing the right thing, and then rely on the chips’ specifications for it to work at the higher speed.
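For instance, with purely illustrative worst-case path delays (clock-to-output, glue logic, setup) summing to 105 ns, downclocking only adds slack; the step back up to rated speed is then covered by the datasheet worst-case figures rather than by observation:

```python
t_path_ns = 25 + 60 + 20   # assumed clk-to-Q + glue logic + setup, in ns

for clock_hz in (1_000_000, 8_000_000):      # debug clock vs. rated clock
    period_ns = 1e9 / clock_hz
    print(f"{clock_hz / 1e6:.0f} MHz: slack {period_ns - t_path_ns:.0f} ns")
# 1 MHz: slack 895 ns  -> almost anything works
# 8 MHz: slack 20 ns   -> you are trusting the worst-case datasheet numbers
```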


I was actually considering adding an emphasis…