The Mainframe - Its Part in my Downfall

When I got to university in the autumn of 1983, as a first-year engineering student, I had to take a course in computing, which was run by the maths department.

All first-year students doing a science degree had to take maths, and that included the computing course - about 200 of us in my intake year.

The Computing Centre was a fairly modern, single-storey building in the centre of Bangor, North Wales, and it housed the DEC 10 and DEC 20 machines.

As well as the main machine room, or goldfish bowl, there were offices for the staff, a teaching room with about 30 CRT terminals and Teletypes, and an “advanced computing lab” that had a DEC Rainbow machine and some other unknown machines.

As well as the central Computing Centre, the Maths Tower, the Physics building, the Engineering building and some other departments also had remote terminals, plus some 300 baud acoustic couplers to fit the old GPO phones.

In my first year, I was going to be obliged to learn FORTRAN 77 - and we were told to buy a book of the same name.

Had I started a year earlier I would have had to learn ALGOL 68. At least Bangor University were keeping up with the times…

I had already had about 5 years exposure to microcomputers by this time. My school had a Research Machines 380Z from 1978, and a BBC Model B by 1981 or 1982.

In the Spring of 1983 I had built a ZX81 from a kit, doubled its memory capacity to 2KB and put the Z80A in a wirewrap socket - to make a primitive expansion bus that could be inserted into an add-on expansion board below the main board. The whole thing was portable and ran on 3 C-size NiCad cells.

It got built into a home-made Turtle robot based on a Big Trak toy gearbox - complete with wheel rotation counting and collision detector microswitches.

I spent the summer of '83 hand-assembling Z80 machine code for a local company using a Z80 SBC with 4K of battery-backed, non-volatile RAM for program storage, and had written my own table-driven Z80 disassembler.
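
In case "table driven" isn't obvious: the decoder is basically a lookup table of opcode byte against mnemonic and operand length, rather than a long chain of comparisons. Here is a rough sketch of the idea in C - the handful of opcodes, names and table layout below are purely illustrative, not the original Z80 tool:

```c
/* Minimal sketch of a table-driven disassembler: one lookup per opcode byte. */
#include <stdio.h>
#include <stdint.h>

struct op {
    const char *mnemonic;   /* printable form; %02X / %04X mark operand fields */
    int operand_bytes;      /* how many bytes follow the opcode */
};

/* Sparse example table indexed by opcode byte (illustrative entries only). */
static const struct op table[256] = {
    [0x00] = { "NOP",        0 },
    [0x3E] = { "LD A,%02Xh", 1 },
    [0xC3] = { "JP %04Xh",   2 },
    [0x76] = { "HALT",       0 },
};

/* Disassemble one instruction at code[pc]; return the next pc. */
static size_t disasm_one(const uint8_t *code, size_t pc)
{
    const struct op *op = &table[code[pc]];

    if (!op->mnemonic) {                              /* unknown opcode */
        printf("%04zX  DB %02Xh\n", pc, code[pc]);
        return pc + 1;
    }
    printf("%04zX  ", pc);
    if (op->operand_bytes == 2) {
        unsigned addr = code[pc + 1] | (code[pc + 2] << 8);  /* little-endian */
        printf(op->mnemonic, addr);
    } else if (op->operand_bytes == 1) {
        printf(op->mnemonic, code[pc + 1]);
    } else {
        printf("%s", op->mnemonic);
    }
    putchar('\n');
    return pc + 1 + op->operand_bytes;
}

int main(void)
{
    const uint8_t code[] = { 0x3E, 0x42, 0x00, 0xC3, 0x00, 0x10, 0x76 };
    for (size_t pc = 0; pc < sizeof code; )
        pc = disasm_one(code, pc);
    return 0;
}
```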

Believe me, I was not prepared to go back into the Dark Ages of mainframes and FORTRAN 77.

Each week, we were set a computing assignment in FORTRAN 77, which had to be handed in by Friday at noon - if you wanted it to be marked.

There was nothing worse than 200 maths students all trying to access the DEC 10 and DEC 20 on every available terminal across the campus for 3 hours on a Friday morning. Response time from typing a command or making an edit was measured in multiple seconds.

After 2 terms of this BS, I rebelled and submitted my assignments in BASIC written on my ZX81. I stood up in a lecture towards the end of the Spring term, announcing I could do on a ZX81 in 45 minutes what would take me approximately 6 hours on a mainframe, suggested to the computing lecturer (Mr. Ryman) that he should “Keep banging the rocks together” (a Hitchhiker’s Guide reference) and walked out.

And that was the end of my FORTRAN 77 career - or so I thought!

I passed the computing course, amazingly, but vowed I would never touch a mainframe terminal again.

After another summer job coding Z80s, I commenced my electronic engineering degree properly, where I took a couple of courses in microprocessor programming and digital design. The CPU in question was the 6502, and the engineering department had a bunch of AIM 65s.

Coming from a Z80 background, I found the 6502 somewhat alien, but realised that if I could write 20 lines of 6502 assembler each week to complete the assignment - that was good enough for me. I have stuck to this philosophy for the last 35 years or so - and it has, over the years, worked out to be a reasonable strategy.

In my final year, I was given access to a very early scanning electron microscope (SEM) that had been supplied to Bangor in the 1960s, one of the first from Cambridge Instruments. My final year project was to build a video capture unit for it.

The advantage of the SEM was that its video was relatively low bandwidth, almost audio frequencies, as it was intended to provide a slow scan that would expose a Polaroid film in a special camera attached to an unusual, blue-phosphor slow-scan CRT scope tube. The picture resolution was dependent on the scan rate, often measured in tens of seconds per frame.

With an 8-bit A-D converter and 12-bit D-A converters to generate the X and Y raster scan sawtooth deflection waveforms, I was able to capture an image to a very early 8088-based PC clone (made by Multitech of Taiwan). Potentially I could do a 4096 x 4096 scan, but limitations of the PC meant that I was limited to 256 x 256 pixels.

I learned enough x86 assembly to energise the DACs with an incrementing count, sample the video on each increment of the X-count and write the sampled video signal from the ADC to a file on disk - all in about 40 lines of x86 assembler.
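
For anyone curious what those 40 lines actually did, here is roughly the same loop sketched in C. The port routines, step sizes and raw file format below are assumptions for illustration - the real thing drove the DAC/ADC card directly in x86 assembler:

```c
/* Rough sketch of the SEM raster-capture loop described above (illustrative only). */
#include <stdio.h>
#include <stdint.h>

#define WIDTH  256              /* limited by the PC, not the 12-bit DACs */
#define HEIGHT 256
#define STEP   (4096 / WIDTH)   /* 12-bit DAC range divided into 256 steps */

/* Placeholders for whatever port I/O the real hardware used. */
static void write_dac_x(uint16_t value) { (void)value; /* drive X deflection DAC */ }
static void write_dac_y(uint16_t value) { (void)value; /* drive Y deflection DAC */ }
static uint8_t read_adc(void)           { return 0;    /* sample the video ADC */ }

int main(void)
{
    FILE *out = fopen("sem_frame.raw", "wb");
    if (!out)
        return 1;

    for (int y = 0; y < HEIGHT; y++) {
        write_dac_y((uint16_t)(y * STEP));         /* step the slow (Y) axis */
        for (int x = 0; x < WIDTH; x++) {
            write_dac_x((uint16_t)(x * STEP));     /* step the fast (X) axis */
            uint8_t sample = read_adc();           /* sample video at this pixel */
            fputc(sample, out);                    /* one byte per pixel to disk */
        }
    }
    fclose(out);
    return 0;
}
```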

The PC only had four-colour graphics (2 bits per pixel), and I seem to recall that there was a low-level program in the BIOS tools that allowed you to read a bitmap and display it on the screen.

Well, I got my degree, and after a summer dossing around Edinburgh (my home city) during the 1986 Festival Fringe, I took up a job at BBC Research near Reigate in Surrey.

I learned a few things from my University computing experiences:

  1. Don’t spend half your life learning Z80 mnemonics and hex codes off by heart. This is a skill of limited future use. Other processors exist.

  2. The early-to-mid 1980s were an important transition from mainframes to single-user micros. Academics who teach FORTRAN 77 year after year are deluded and not willing to accept these changes.

  3. Never write more hand assembled code than you need to. Just write the 20 to 50 lines of assembler to prove that your hardware works, and then hand it to someone who wants to program for a living.

  4. Stick to hardware design and implementation as long as you can. When you can no longer see the parts to solder onto the board - consider a career change to software or FPGA design - and turn the font size up on your editor.

  5. Retiring to live on a canal boat in rural West Yorkshire aged 55 is not a bad idea - under the current circumstances.

5 sounds good until you find you can’t have a bath tub.
When you pull the drain stopper, the tub starts to fill up more and the ship sinks down and down, tub, towels and all.

If you really have free time, I can give you a bunch of hand-drawn schematic notes for a 36-bit CPU. I need PCBs
and a front panel. Must have blinking lights (red off, green on).

Ben,
PS: You get to program in Bengol 73.
PPS: No tiny PCBs, the big ones, 8x10 inches or so.

Ben - the simple problems of bath tubs on canal boats were solved a long time ago. Under the tub outlet is a “gulper” pump that pushes the water up to an overboard outlet that sits above the waterline.

Nice post @monsonite!

I dare say it’s common for degree courses to be somewhat behind the times: especially when modernising means writing off a previous investment and making a new one.

I’ve had my time too using shared computers, as opposed to a computer I can more or less regard as my own. Not only at Uni, where there were VAXes and a PR1ME, but also working in engineering, where VAX and microVAX were very much the order of the day.

It was not uncommon for a new hire, especially a recent graduate, to run some program which used too much CPU for too long and slowed things down for others.

In fact, even when the workstations turned up, they were expensive, so only a few senior people might get one to themselves, and others would be sharing. The microVAXes were still there, and storage was still shared.

And not long after, I entered the world of the X Terminal, which I still rather like as a model. As an individual engineer, I get my own terminal on my desk. I can connect to one of a few shared SunOS systems (later, Linux systems) to do work, and when I have some serious application or experiment to run, it goes - more or less as a batch job - to a pool of compute servers. Those compute servers run one job per CPU (initially that’s one or two) or sometimes one job per multiple CPUs.

I reckon that was my world from the early 90s to the late 2010s. Almost all compute was shared resource, and the individual engineer had to know what they needed, and departments needed to get the pool of machinery right-sized for their projects. Even now, former colleagues are working this way, submitting 10k jobs overnight, or 100k jobs even.

The tricky thing is the resource sharing: there needs to be enough, in some sense, and then people need to share the commons according to need. Now and then something, or someone, runs wild. There are quotas on usage - there might even be costs accrued to departments, although I prefer not to do things in a bean-counting way.

In my working life, the hardware engineers very much understood the story and used it well. Software engineers less so: they are used to a laptop each. I think it was often the case that a build would take less time as a batch job than running on their laptop, which is an incentive to think bigger and get organised. (Build times seem to be one of those things which moderate productivity, and they always head in the wrong direction.)

Of course, in my personal life, I more or less use machines in a more personal way. But there’s a linux laptop in the house, and I ssh into it to run things, and now and again we have a conversation about resource utilisation… not often, to be honest. At 8G of RAM, it’s the largest machine we have.

Nice reflection on writing assembly-level code: don’t do too much of it unless it really is your thing. And don’t get too precious about one particular architecture, if you don’t want to get boxed in.

Sounds like a very familiar experience to mine, only mine was a few years earlier…

By 1980, at school, I’d had 2 years of the Apple II (and a few other systems, including dial-up BASIC and access to the local university computing centre, where the local boffins had implemented their own OS on a 32-bit mini in Imp77).

So I turned up at the uni armed with a knowledge of BASIC, 6502 ASM, Imp77 (another Algol-like structured language) and a bit of FORTRAN, expecting new/shiny/the future, and I got hand-written coding forms to be given to the “girls in typing”, a slow and overloaded batch system (eventually), a room full of loud TTY-33s, a woefully underpowered Prime mini and lecturers totally unprepared for the onslaught of this new generation.

To say I was frustrated was somewhat of an understatement. I did find a PDP11/40 running Unix v6 though - that was fun. The uni course involved Pascal, FORTRAN, COBOL and “PMA” (Prime Macro Assembler). (It was a somewhat traditional place churning out programmers aimed at the commerce sector - banks, etc. - where I’ve no doubt that sort of thing still carries on today.) Fortunately I also had C on the PDP11, and migrated to the Engineering department after some years.

And one little “fun” story - we did maths and numerical analysis as part of the course, and one day we were doing some iteration-type calculation, so a friend and I whipped out our programmable calculators (a Casio fx-502p, which I still have), keyed in the calculation, let it run for a few iterations and had the result in seconds - while most of the rest of the class (not all were computing students) were doing it step by step… I remember the lecturer being somewhat angry at us though - even when we pointed out we were computing students. I think he did see our point of view in the end though.

My course involved a summer of work experience - and a small part of that, working at the local hospital, was to upgrade software on an Elliott 903 (c. 1966). So fast forward to the past, again…

Cheers,

-Gordon

I missed the downfall part of this - it reads as though you’ve had a good and full career.

My journey can be summed up as:

  • Self directed learning the basics on early PCs. (Apple ][, Timex Sinclair 1000, IBM PCjr)
  • More formal training with Pascal and C before going to college.
  • Exposure to all sorts of wonderful things in college: COBOL, basic TTL and circuit layout, Unix (Ultrix on a PDP11), FORTRAN, Modula-2, ML (meta language?), and the beginnings of C++ to name a few.
  • First professional experience: being one of 200 to 300 people on a large project delivering a new operating system for the RISC-based AS/400s.
  • Customer consulting
  • More exposure to advanced topics when I went back to University for an advanced degree. (Graph theory, advanced operating systems concepts, parallel computing, networking stacks, neural networks, etc.)
  • Another large project - the IBM BlueGene/L supercomputer (the first in a series of machines)
  • Hard drive firmware
  • Keeping all of the servers running at a large internet company while keeping things reliable, fast, and cheap.
  • Volunteering with an org that preserves the history of computing (VCFed.org)
  • My newest challenge: “hacking” people and keeping a team running.

I’ve done a lot of PowerPC assembly language (professionally) and x86 (for home projects).

I’ve loved most of it, but I’m sure looking forward to retirement in a few years. ;-0

Does watching “Star Trek” re-runs and the blinking lights count as an introduction to computers? That is why even today I don’t trust computers without a manual override, like a BIG reset button. Computers (hardware) have been the hobby side of my life. Ben.