Naive first contact with computers

I thought it would be interesting to have this topic where we’d share some of our naive impressions and expectations of computers as we discovered them.

For instance, when I was 11 years old or so (back in '92), I had my first experience playing a PC game, namely Prince of Persia for DOS. Yeah, you all know it. I played it at my mother’s workplace, where she took me one day, on a 286 or maybe a 386.
One of her colleagues wrote for me on a post-it the weird incantation text I had to write in a weirder black box thingie to start the game, and that was burned in my memory forever:

cd g ENTER
cd prince ENTER
prince ENTER

It was just like that, but he drew some 90-degree rotated square parenthesis symbols underneath each space to explain that I needed to press space when I saw that.

But what was so naive of me was that after I started playing the game, after I understood that the keyboard was actually controlling the character on the screen, I was left with a puzzle… How does that ‘thing’ of a computer move the character? How does it know I am jumping, and how does it know to fall?

At one point, I sat there, watching the flames on the castle’s walls, moving the character with the left-right keys, trying to figure out what made it respond like that.

Then I went back home, pulled out a notepad, and drew the already learned-by-heart first level, screen by screen, then overlaid a jumping prince with his full jump trajectory here and there. And it was such a puzzle for me that the prince could jump from x, but he could jump from x+1 as well. How in the hell was this done?

The conclusion I arrived at was that all possible combinations of movement have been programmed into this game, and depending on when the player presses buttons, one scenario or another comes into play.

This was helped by the fact that some moves (like falling or jumping) were uninterruptible after a point. But then I was wondering, “How do they know how to resume from where I was left then?” I wasn’t able to make distinctions between the character and the environment, either. If he jumps from behind the pillar, then he must have some jumping animation with the pillar. But what if I move a bit to the right? Is there an animation with half of the pillar as well?

I remember sitting for weeks trying to get it and leaving it as a mystery for years to come, until… Well, I started programming, and I started understanding variables and states.

5 Likes

Nice idea for a thread!

I do have a recollection of being bemused by how calculators worked. I had some idea that with batteries, wires, and buttons you could make light bulbs light up. I imagined it possible that some combination of buttons could do simple arithmetic – like, say, the 3 and 5 buttons could cause an 8 to light up. But I couldn’t imagine how all the possible calculations could be compressed into a tangle of wires.

4 Likes

My first experience with computers (at least real ones - I made a tic-tac-toe “computer” out of Christmas lights and switches, when a kid) was in 1975, when a buddy took me to the high school (I was days away from 8th grade graduation) and sat me down in front of a typewriter (ASR-33). He picked up a telephone near the typewriter, dialed a number and placed the handset into a little box next to the typewriter. Then he typed something on the typewriter and the typewriter typed back, ALL…BY…ITSELF!!! I was amazed that a typewriter could type by itself, and, from that point on, my career switched from just electronics to computers.

3 Likes

First experience - 1977/78. At school at the age of 16 - computers were big things somewhere else, but there was one computer that did the rounds in Edinburgh on a big trolley, with a printer on top and a mark-sense card reader by the side…

The HP9830A.

Teacher introduced it by having it play a game - 23-matches. The computer, of course, beat me. Then the teacher listed the program and showed me the bit where it calculated the best number to take, and I probably had my first “how hard can it be?” moment and set off writing (BASIC) programs for it.

We did have to write some boring stuff first - calculate compound interest and stuff like that, because that’s what computers were really for, not playing games, however…

But that was that. Computers were my new “tech”. Shortly after that the school got an Apple II for a demo, then we were asked to evaluate the Apple II, PET and TRS-80 for use in schools in Edinburgh - the Apple II won hands-down… At least for 4 or 5 years, until the BBC Micro was more established…

Fun times!

-Gordon

5 Likes

I remember the experience of wanting to write a space invaders-like game at a friend’s house. Drawing a simple spaceship at a point on the screen in BBC BASIC was easy but, then, how to make it move?

You could use a variable for the x coordinate and then give it a new value, but how could you make that general and increment the value? Reading a magazine or book at some point revealed that you could do something like this:

X = X + 1

Although this was before algebra lessons, it seemed wrong to write something like that, and just not something I would have tried. Here, it seemed, was a case where the similarity of the syntax to mathematical notation could prevent you from exploring ways to achieve what you wanted. I kind of felt like I was being misled somehow.
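The confusion dissolves once `X = X + 1` is read as an assignment rather than an equation. A minimal sketch in Python (the spaceship column and the frame count here are made up for illustration; the thread's original examples are in BBC BASIC):

```python
# Assignment, not an equation: the right-hand side is evaluated
# first, then the result is stored back into the variable.
x = 10               # hypothetical spaceship column on screen
for _ in range(5):   # five "frames" of movement
    x = x + 1        # "X = X + 1": the new x is the old x plus one
print(x)             # → 15
```

Nothing here claims 15 equals 10; each pass just replaces the old value with a new one, which is exactly what makes the ship move.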

3 Likes

I have always liked x + 1 → x better. Any more symbols than that and my head explodes, like a /\ b.

I’ve actually no such story, which is a story of its own: My school had a voluntary class on “electronic data processing”, but this fell in a peculiar time, when home computers began to replace programmable calculators for the purpose. So, awaiting the arrival of a real computer, we skipped the electronic calculators. (I recall the VIC-20 being presented in class just before Christmas, I think; the Apple II was subject to Reagan-era export restrictions, since Austria was/is a neutral country; Tandy didn’t sell here; and PETs weren’t much of a thing – you had to import them from Germany – so none of the “trinity” for me.) However, the arrival of that computer (which turned out to be a statically troubled domestic Philips product, more often calling for service than operating as intended) was delayed, and the class must have been among the last to learn computing on paper. We got the textbook in advance, and I stormed ahead and was caught, probably for aesthetic reasons, by the Algol section, with Algol becoming my first language – which I never ran on a real computer. To a certain extent, this division between software logic and the exoticism of the hardware it runs on has stayed with me.

In terms of hardware, the school had a huge, board-sized logic trainer (with light bulbs) and also a lone ASR-like teletype keyboard mechanism. I thought the latter must have been the most exotic thing I’d ever seen.

(The real trouble with that machine was that, as it failed, it permanently locked in a “call service” mode with this as the sole startup message, which could only be unlocked by said service. Eventually it was found that this must be related to static interference, and an antistatic mat was procured to counteract this. However, the class was held by a physics teacher, who, as a proud sign of his occupation, always – as in always – wore a nylon lab coat, which emanated an electrostatic field that began to afflict the machine at a distance of about one and a half meters – and thus easily defeated the antistatic mat. In hindsight, we should have placed the teacher on that mat. Due to this, we got about three short sessions out of that computer in two years.)

1 Like

Sounds like you want … APRICOT…

(Things to the right of the colon are what I typed; stuff to the left is what the computer typed and remembered for me.)

Screenshot_2023-08-05_15-25-32

:wink:

-Gordon

Haha, I totally recognize that. At least in BASIC it used to make more sense since you would write

LET x = x + 1

… and that makes it somewhat clear this is an assignment operation.

Not sure if that expression graduated into what we use today as shorthand, but indeed, algebra-wise, it looks quite weird.

1 Like

My first contact with a real computer was with a teletype (probably also an ASR-33) and tic-tac-toe. My father worked at a GE plant, and they had an annual open house event. This was probably 1974 or so (when I was 12), and one exhibit was the teletype connected to a remote GE computer playing tic-tac-toe against the visitors. When it was my turn I was given the first move and selected the middle square. The computer played the one to its left. I could hardly believe it - it had lost on its very first move! Who had programmed that thing? Was it on purpose? Anyway, I played to the end and won as expected, completely shocking the crowd.

Around the same time I had been impressed with the design of the cameras used in the Viking probes that would land on Mars a couple of years later. They used a pair of mirrors to scan the scene with a single photocell. I thought of using the same idea for a robot, but how to convert the output of the light sensor into a digital signal? I came up with the idea of injecting the photocell’s output into a carbon rod and since I didn’t know about voltages back then I imagined the current would become weaker and weaker as it went along the rod. I would have a motor move along the rod until the current was too weak to keep it moving and at that point a magnetic head would scan a strip with the binary value. There would be a different strip at each possible position (like the tracks on a magnetic drum). Then the motor would be pulled down to the start of the rod so the next pixel could be scanned. At 10 seconds per pixel it would take 9 days per frame to scan at a 320x240 resolution.
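The closing arithmetic holds up. As a quick back-of-envelope check (all numbers taken from the post):

```python
pixels = 320 * 240            # one frame at the stated resolution
seconds = pixels * 10         # 10 seconds per pixel
days = seconds / (24 * 3600)  # convert seconds to days
print(round(days, 1))         # → 8.9, i.e. roughly 9 days per frame
```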

4 Likes

I don’t have a specific story, just generally, that (big) computers are smart.
But it reminds me of my post “On error goto”

1 Like

A couple of first contact types of anecdotes from my extended family:

  • An aunt, who wasn’t technical at all and must have read some popular presentation, propounded the idea, perhaps in the early 1970s, that computers could solve problems very much faster than people, but took ages to program. The remark bothered me - why make such things in that case? (I think I’ve seen a press clipping about the ENIAC or similar which said very much this kind of thing.)

  • A nephew, perhaps around the turn of the millennium, evidently held that common misconception often applied to desktop machines of the PC era, that it was the monitor that was the computer, and the big grey box on the floor was somehow related but not the thing itself.

2 Likes

My first contact was during an ‘O’ Level physics class in either 1978 or ’79. Our physics teacher brought in a rectangular wooden box with a perspex top; the perspex had a square cut out of it so you could access the keypad of the KIM-1 contained within. Memory and embellishment lead me to want to believe that the box had been lovingly crafted with dovetail joints, although that is unlikely. I was impressed by this device, but I don’t recall that we did anything useful with it.

The school was (still is) part of a larger college, leisure and community centre complex in North Manchester and the college had a couple of rooms full of various model Commodore PETs, some of which also had disk drives. We eventually were given access to these and used them for very simple programming. There was also an Apple ][, but you could only touch that if you’d demonstrated some level of skill, and I was probably too overawed by the hardware to develop any real software skill at that time!

Some time ago I was looking for information on the setup at my old school and found mention of my physics teacher, Mr E Purcell, in a copy of the 1982 Commodore Club News. He is listed as running Educational Workshops from the school. I put 2 and 2 together, possibly making an erroneous number, and believe that he probably assembled and built the KIM-1 that we were introduced to.

Link to 6502 Org and the archive of Commodore Club News.

1 Like

I believe this was the intuitive understanding when it came to hitting the computer, either as an admonishing tap or in real anger. Which poses the question: How do you hit Azure in the age of flat panels? Computing has become so unrelatable since… :slight_smile:

My dad taught 8th grade science and one day he discovered that the PTA had bought a computer for the school. The school didn’t know what to do with it, so they had stored it away and my dad found it while doing inventory.

It was a TRS-80 Model I Level I with 8K of RAM. He got the school to pay to upgrade the system to a Level II and 16K.

That summer, he took it home to get some experience using it and that’s when I found it. So that would have been 1979.

No Google. No Stack Overflow. Just the manuals, some type-it-in books, and time. But by the time I graduated from high school, I had already “discovered” structured programming (à la Dijkstra).

1 Like

My problem was how a single transistor acted as an inverter. If you applied a ‘1’ to its input, it was on, therefore at logic ‘1’, so it wasn’t inverting?

It was only when I considered the volts on the input and output that it suddenly dawned on me. With 0V applied, the output was ‘1’ - because the output would float high! With a voltage on the input, the output would drop to 0V as the transistor pulled the output down. Hence an inverter!
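That voltage view can be sketched as a toy model of a single-transistor inverter with a collector pull-up (the supply voltage and base-emitter threshold here are illustrative values, not from the post):

```python
VCC = 5.0        # supply rail feeding the collector pull-up resistor
V_BE_ON = 0.7    # rough base-emitter turn-on threshold for an NPN

def inverter(v_in):
    """Single-NPN inverter: enough base drive saturates the transistor
    and it pulls the collector to ~0 V; with the base at 0 V the
    transistor is off and the pull-up drags the output to VCC."""
    return 0.0 if v_in >= V_BE_ON else VCC

print(inverter(0.0))   # → 5.0 (input '0' gives logic '1': output pulled high)
print(inverter(5.0))   # → 0.0 (input '1' gives logic '0': transistor pulls it down)
```

The key point the post arrives at is visible in the model: the ‘1’ on the output doesn't come from the input at all, it comes from the supply rail via the pull-up.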

3 Likes

My very first taste of computers (apart from a visit to the University of Delaware with my 8th-grade math club to see a DEC PDP-8 (a “straight 8”) in the flesh – not that I had any idea back then of what the heck it was I was seeing ;-> ) was in the spring of 1969, at my high school in northern Delaware. It had taken advantage of a Federal grant to equip a room with some teletype machines and acoustic couplers.

The teletypes provided access to “Conversational Fortran. . .” followed by a Roman numeral. Long afterward, after years as a computer programmer, I’d edited this in memory to make the Roman numeral “IV” – because “Fortran IV” is a thing, right?

It was only within the past decade, in the era of the Web, that I was able to cobble together evidence (in addition to the sketchy information in my high-school yearbook) that caused me to conclude that the system on the other end of that teletype in the spring of '69 was no less than a Univac 1108 at Computer Sciences Corporation, who were testing a new timesharing network (“Infonet”) at the time. And the Univac had a “Conversational Fortran V”. Presumably the operating system was CSC’s own modified version of Exec II (CSCX) rather than Exec 8, but I believe the Fortran V language processor would have been the same in either case.

That taste of high-end timesharing was a luxurious experience, never to be repeated. The following school year, those same teletype machines were connected to an overloaded IBM 1130 at the University of Delaware running BASIC. My fooling around with the Fortran system had been strictly extracurricular, but I had the misfortune of signing up for the official BASIC course that fall, and had to do actual course assignments on the hideously slow system. Talk about bait and switch! ;->

A paper about such computer-education projects by a scholar at the University of Delaware comments: “[D]uring the summer of 1969 and the following school year. . . [t]he time-sharing service was provided by an IBM 1130 computer housed at the University of Delaware and funded jointly by EDTECH and DSAA. . . After the sophistication of the equipment utilized the previous year, the three EDTECH high schools were generally dissatisfied with the service. . .”

Or, as a Usenet commenter once noted about a similar service, “I remember using a multiuser BASIC interpreter that was running remotely on an IBM 1130 at Central High School in Philadelphia in the late 1960’s. It was godawful slow…” It sure was! It was an agonizing chore to get assignments completed. The 1130 multiuser BASIC that was inflicted on high-school students may well be the one mentioned at 1130.org. It’s listed there as presumed lost.

That experience was not unlike my first trip on an airplane. The flight took place in the early spring of 1976, and I was travelling to Key West from Delaware (via a Philadelphia airport) to meet a friend for a holiday. Due to the quirks of airline scheduling I was booked on an almost-empty L1011 widebody. I had the whole row of seats to myself! “I could get used to this!” I thought. Well, it’s never ever happened again. The flight back home was like being crammed onto a crowded bus.

;->

3 Likes

I had something similar at school, in the subject “Computer Studies”, which was a new, vocational course: a CSE course, which was less academic than the usual O level courses. It was new to the teacher, who was also new to teaching, I think. Anyway, I asked how an inverting logic gate could produce a high signal if all the inputs were low. The teacher had a quick think, and said that it was probably down to the use of capacitors, as they store charge (or voltage). I remained unconvinced, and not long after realised that it was because logic gates are supplied with power – not shown on logic diagrams, of course.

2 Likes

My first encounter with a computer was when I was about 7 years old, in 1977. My mom and I went to visit her sister in Boston. I don’t know if her sister had a boyfriend, or if he was just someone she knew, but this guy I met on our trip introduced me to a Heathkit computer he’d built. He loaded up a “turtle graphics” program on it, which, when I look back on it, worked in character mode. The “turtle” was an inverse square that I could move around the screen with arrow keys on the keyboard, and I could press another key to toggle the “pen down” to draw, or “pull the pen up” to not draw. I was entranced! I don’t know how long I was going at it, but it must’ve been an hour. I didn’t want to stop, but we of course had to leave.

On that same trip, we visited the Boston Children’s Museum, and they had what I think was a DEC VAX exhibit (at least some terminals hooked up to one). It gave each kid 5 minutes to play a game. You could pick from a selection of games, as I remember. They were all in text mode. I remember playing hangman on it. The exhibit was swamped; so many kids wanted to play on it. I went through the line over and over again.

I didn’t think about how it worked. I just loved it.

I didn’t get to use a computer again until I was 11. We stopped by our local library, and I saw this middle-aged man doing something on an Atari 400 computer. It wasn’t like what I’d seen before. Something seemed wrong, or broken. I kept seeing him flip between a blue screen with text on it, and a low-rez graphics screen that looked like a race track, with what looked like animal figures sprayed across it in lines. I sat down at a table a fair distance from him, just watching him. I assumed he was an employee of the library, and only people like him could use it. I started getting the idea that he was doing something to make the computer put up this graphics display. It seemed like he was trying to make a horse racing game, but it kept messing up. I thought wow, I’m seeing this guy make something on the computer. Though, he didn’t seem to be successful in making his horse race. I saw him go through the cycle of going back to the text, and then seeing the graphics do exactly the same thing.

I think what would happen was he’d line up a bunch of “horses” on one side of the screen, something would happen, and then all of a sudden, copies of the horse figures were plastered across the screen in lines. It was like he was trying to animate, but didn’t understand you had to do something to slow down the action, and that you had to erase where the horses had been.

Anyway, I was really interested in the fact that this guy was in the process of creating something on this computer. I had never seen that before. I mean, I kinda knew from TV shows that computers were programmed by smart people to do stuff, but I never thought I’d see it happen in front of me.

My mom noticed my interest, and asked me about it. I said I was watching this guy use that computer. She asked if I wanted to use it. I said, “They won’t let me.” I was just sure they wouldn’t let kids on it. I had never seen kids use a computer, other than that visit I made to the Boston Children’s Museum. She insisted I ask the librarian about it. I thought for sure they’d say no, but I asked anyway. I was surprised. The librarian said I could use it if I was 10 years of age or older, and I took a 15-minute orientation. I said okay, I’ll do it!

So, I went to an orientation with some other people. We learned about how to operate the Atari, how to load software, how to check out software from behind the desk, and some basics for interacting with the computer. I asked about what that guy I was watching was doing. I was informed that he was programming in a language called Basic. They had the Basic cartridge behind the desk. They also had a tutorial series on learning the language. That was one of the first things I went for, but it was tough. I didn’t find the tutorial that helpful. I ended up reading the manual, which was better. I spent months learning how to program well in Basic, often getting help from people who happened to be hanging around the computer. That’s what got me started.

It was gratifying listening to Ted Kahn’s interview on the Antic podcast, back in 2016.

He talked about how he was part of a group at Atari that was promoting their computers through donations to non-profits. He mentioned they donated to schools, museums, etc. I remembered that the Atari 400 at the library had a badge on it saying it was donated by some foundation, or institute. I know it wasn’t the Atari Institute for Educational Action Research. It had a different name. I can’t remember it now, but I couldn’t help thinking that his work probably had a hand in me getting my hands on a computer I could program (though, once I found out they had an Atari 800 in a different part of the library, with a disk drive, I switched to using that. :slight_smile: I was glad to get away from the 400’s membrane keyboard, and slow tape drive!)

2 Likes

In a way, X = X + 1 was misleading. I sometimes hear mathematicians say that it doesn’t make any sense. However, what’s implied is a time dimension. What you’re really saying is X(t+1) = X(t) + 1, where t is your time index: X at the next time step equals the value of X now plus 1.
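That time-indexed reading can be made literal by keeping every X(t) instead of overwriting a single variable (a small illustration, not from the post):

```python
# X = X + 1 read as X(t+1) = X(t) + 1: keep the whole history
# instead of overwriting, and the "equation" becomes unobjectionable.
X = [0]                  # X(0), the initial state
for t in range(3):
    X.append(X[t] + 1)   # X(t+1) = X(t) + 1
print(X)                 # → [0, 1, 2, 3]
```

Ordinary assignment is just this with the history thrown away: only the latest X(t) is kept.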

1 Like