Programming for children: the old and the new

Not Retro Computing, but still interesting. More info at [ ].

The BBC has created a Doctor Who-themed miniature computer, powered by a SiFive RISC-V system-on-chip, to help teach children how to program.
At the heart is an FE310-G003 system-on-chip designed by SiFive; its documentation is here. It sports a 32-bit open-source RISC-V E51 CPU core clocked at 150MHz, with 64KB of data RAM and 512KB of flash storage. The core itself has 16KB of instruction cache. The SoC is fabricated by TSMC using its 180nm process.

The kit, we’re told, also has four analog-to-digital converters; I2C, SPI, UART, and GPIO interfaces; a microUSB port for programming, debugging, and power; a so-called eCompass sensor outputting three-axis acceleration and magnetometer information; a temperature sensor; and an ambient light sensor. This is as well as the 6x8 LED matrix, buttons, wireless, and so on. A battery pack is provided to run it without hooking it up via USB.

We all know Dr Who never uses computers.
Sounds too complex for young ones, unless it is click and play.
Too little computing power for a K9 emulation.
I still view the BBC as the last people I'd want designing a computer.
Just marketing for a new TV series, in my view.
Now if it was how to program your phone …
The real question is just what they are trying to teach. Programming
is a rather complex concept, and I thought Logo worked best for teaching
that stuff.
“You want weapons? We’re in a library! Books! The best weapons in the world!” Dr Who.

Well, first of all I’m happy to see RISC-V making some progress in the world.

But they don’t design computers, do they? It appears to me that they get others to do that.

And while I don’t know if it’s really done intentionally because there are knowledgable people inside the BBC or it’s just an accident of fate, but the Beeb does seem to have a record of choosing fairly good computing technology from what’s available. I’ve only limited knowledge of the BBC Micro, but Acorn’s system designs (especially including software) were in many clear ways superior to the other widely available systems out there at the time.

While that may or may not be the case in this particular instance, I’ve found that a “too complex for children” is often severely misguided, mostly by people who think of simplicity as what’s easiest for them right now, rather than doing any serious examination of what really is simple and what cultural baggage of their own they’re bringing to the decision.

After all, the first mathematics we teach to children today involves some pretty complex concepts: “basic” arithmetic is fairly high up in the hierarchy of mathematical structures, yet they learn this and more complex concepts just fine. The real issue there is that it sometimes seems to cripple them, preventing them from learning anything further about mathematics. (Such a tragedy was recently demonstrated to me in a conversation with someone who had difficulty learning to use an RPN calculator; he finally decided that four-function arithmetic was the most basic mathematical concept, mathematics did not extend much beyond this—Peano numerals were apparently nonsense in his view—and that human beings were hardwired such that infix notation was the only natural way of representing equations and processes. Presumably people comfortable in Forth or Lisp are aliens.)

Sure, but no more complex than many other things that grade-schoolers learn. I think that the biggest issue is really that many wannabe educators don’t understand what the simpler concepts are in computing science and instead teach far more complex things (that can be derived from those simpler concepts!) as if they were the basis underlying programming. Functions, a conditional construct, evaluation, and lack of limits on recursion are together enough for a powerful programming environment suitable even for large programs with a lot of inherent complexity in the domain, but it’s rare that anybody will start teaching with something like that. (In part this is probably due to the teachers not learning that way, and then being forever stuck in a world where whatever they learned is the best and only way of doing things.)
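To make that concrete, here’s a minimal sketch in Python (my choice of language, nothing the kit prescribes) using only those ingredients — function definition, a conditional expression, and unrestricted recursion; no loops, no mutation:

```python
def length(xs):
    # Length of a list, defined recursively rather than with a loop.
    return 0 if xs == [] else 1 + length(xs[1:])

def my_map(f, xs):
    # Apply f to every element, again purely by recursion.
    return [] if xs == [] else [f(xs[0])] + my_map(f, xs[1:])

print(length([3, 1, 4]))                     # 3
print(my_map(lambda x: x * x, [1, 2, 3]))    # [1, 4, 9]
```

That tiny basis already composes into arbitrarily complex programs, which is rather the point.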

You’ll get no disagreement from me there; Logo is a decent enough language, and certainly a lot better than many out there. (Though it’s weird to me that they basically took Lisp and added complex and confusing syntax on top of it; I do not see how it makes things simpler, it just makes it more what certain naïve teachers expect.)

That said, Python is by no means a terrible language (it’s probably my favourite of all the popular languages outside of functional ones) and I am guessing here that MicroPython didn’t destroy it too much.

I’m not sure I get this; programming a modern smartphone is not really different from programming anything else, albeit less convenient, particularly in terms of I/O. (Reading a button press or sending a character to some sort of display device is considerably simpler than handling touch input and graphical output.) I think that the plethora of relatively simple I/O devices on the kit above is actually a pretty good selection, and it provides a nice range of interaction with the physical world for the many people who find that quite motivating. (Apparently kids loved that physical turtle that could draw on the floor that was part of the original Logo system.*)

* Just as a reminder, this is in no way part of the Logo language; it’s an API/library usable in any language that just happens to be implemented mainly only in Logo systems, for some reason.

(I’ve retitled this thread, in the hope of getting an on-topic discussion, relating to retro computing or computer history. I think it can be done.)

Logo, the first computer language explicitly designed for children, was invented by Seymour Papert , Wallace Feurzeig, Daniel Bobrow, and Cynthia Solomon

This, to me, seems rather an important point. Whether or not an effort is successful, whether or not it has flaws, it’s interesting when something is explicitly designed for teaching (or, perhaps I should say, for learning.)

Logo is commonly misunderstood as being ‘that turtle language’ - there’s a lot more going on, as @cjs notes. We’re told that children can discover for themselves some ideas of program structure. For me, the locality of the syntax feels like quite an advantage. (Children make grammatical mistakes in spoken language all the time - but they still manage to communicate. Their peers and the adults around them do the extra work, which may be very minor, of figuring out what is meant. I’m not sure I can point at any do-what-I-mean language, still less one with didactic value, but an easy syntax feels to me like a good idea.)

As an unexpected connection between Logo and retrocomputing, if you pop over to and download the transistor-level simulator there, and dig in a bit, you’ll find that while the outer layer is written in run-anywhere Java, what’s implemented in Java is a Logo. The transistor-level simulator itself is written in Logo.

Our mission is to make computer science and digital electronics accessible to students, hobbyists, and the curious public around the world. Our first interactive exhibit was commissioned by the Intel Museum in 2005.

Back to Logo, and Cynthia Solomon’s history here comes with a video:

But the question is, “What is an easy syntax?” Many people seem to answer that with, “The syntax of a language I personally already know well” (usually without realizing that that’s what they’re saying). In other words, often whether a thing is considered “easy” or not has little or nothing to do with the thing itself, but is all about the personal store of knowledge and the biases of the viewer.

Is Logo syntax easy? Have a look at the code below and see how easy it is for you to figure out what it does and why. I tested it in this interpreter; you may find that useful for experimentation.

make "sum "sum-is
print list :sum sum 2 3 + 4
sum-is 9

I suspect that most people here will find this a lot easier to understand when the exact same program is re-cast in Lisp syntax. (I’ve used more indentation and line breaks than would be usual in Lisp and slightly non-standard parentheses placement just to make it more obvious at a glance exactly where the matching open and close parentheses are; more regular Lisp users would grok this from indentation conventions.)

The same code, different syntax:
; The following is not a real Lisp dialect,
; it's Logo with a different (and more explicit) syntax.

(make sum "sum-is)          ; Read "make" as "define".
(print (list sum            ; Logo is a "Lisp-2", so non-function and
             (sum 2         ; function variables have separate namespaces.
                  (+ 3 4)
)))
One way to answer ‘what is easy syntax’ is of course to experiment, with children and with first-time learners. It’s true, and I think important, that the S-expression of Lisp was never intended to be a human interface.

It’s also true that Python’s use of indentation is controversial among seasoned programmers - but that’s not the important audience, for me. Education is a science in its own right. One thing about indentation is that it has a certain consistency which C-like languages lack. For me, Logo (and Forth) are even simpler. Forth was not designed as a teaching language, and I think perhaps that’s fairly obvious.

The thing about balanced parentheses, or other balanced forms, is that the balancing might be quite long range. It’s fiddly, and perhaps there’s a better way, for the novice.

It is of course difficult to take the novice’s perspective, when you’re well-versed in another language, or in several. I think this is part of the reason for strongly held diverse opinions.

Another world, of course, is the world of block programming: Scratch, and its relatives. I’m not completely convinced, but I have supervised kids who have made progress, from zero (from scratch!) to the point of creating simple games and stories. The thing to take away would not be that block programming is an excellent end-point for life as a software developer, but that it might be a good introduction to some core concepts: including multiple communicating processes, as it happens.

Now, all this has got a little distance from the head post, which is about individual, personal, near-disposable computing devices for teaching with. That’s quite a new and non-retro development, I think, because those devices need to be cheap but powerful. Arguably the Dynabook is an example of something which might have come to be, if we understand the nature of the research behind it. It’s clear that the iDevice is not a Dynabook, but it’s not clear that a Dynabook couldn’t be made: Amazon’s first and second generation Kindle had a keyboard.

The XO-1 should probably get a mention too, although not as great a success as was hoped. Not so much disposable as indestructible. The vision behind it was certainly interesting - it’s part of computer history now, if not yet retro.


While true, no, I don’t think that’s important. What’s important is that Lisp has had alternate syntaxes from the very start (most of the early papers gave their code in M-expressions) and since then there have been multiple experiments with different syntax to replace S-expressions. Yet after sixty years, Lisp programmers still prefer S-expressions to any of the alternatives they’ve been offered. There’s clearly something going on there. (What that is has actually been pretty well described in the literature, I believe, but I won’t get into that here.)

Python’s use of indentation is not at all controversial; it’s the same style of indentation that virtually all Algol-style programmers insist be used, even when the language doesn’t enforce it. It’s leaving out additional block markers that duplicate the preferred indentation that’s controversial to some, which is a bit weird when you think about it. There’s a similar controversy about leaving off semicolons at the ends of lines, too. Not to mention whether one should use braces or words (such as begin and end) as the block markers. In all these cases it seems pretty clear that the arguments are far more often emotional than technical.

That’s not a “novice” problem; that’s a long-acknowledged issue for all Lisp programmers. Fortunately, relatively simple tweaks to editors, such as visual parenthesis matching, mitigate the problem a lot. And it’s also an issue (obviously to a lesser degree) in Algol-style languages, and enough of one that Algol-style programmers generally insist on having the same or similar tools in their editors to help with this.

Well, that’s not too surprising; Lisp and Forth (and to a lesser degree Logo) are all mathematically simpler than Algol-style languages in several ways. (That is to say, they use more general constructs, in the mathematical sense of “general.”) This goes far beyond syntax (as we can see from Logo being essentially Lisp with far more complex syntax). The syntax comparisons are easier to see (if you look at most Algol-style interpreters and compilers, they first translate the code into the same abstract syntax tree represented directly by S-expressions), but there are also some far more fundamental differences about whether you have just expressions or expressions plus other stuff, how composition is and can be done, generality of control constructs, and so on.
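You can see that translation step directly with Python’s own standard `ast` module — the surface syntax is infix, but the tree the compiler actually works with is pure prefix structure, one small step from an S-expression:

```python
import ast

# Parse an ordinary infix expression and inspect the compiler's tree.
tree = ast.parse("3 + 4 * 5", mode="eval")
print(ast.dump(tree.body))
# On a recent Python this prints something like:
# BinOp(left=Constant(value=3), op=Add(),
#       right=BinOp(left=Constant(value=4), op=Mult(), right=Constant(value=5)))
# ...which is (+ 3 (* 4 5)) with longer names.
```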

As far as teaching novices, the main issue to me seems to be that a far too common approach is to pick a popular language, usually supporting only one particular programming paradigm, and teach as if that language’s particular approach to programming is all of programming and nothing more need be learned after that. We’ve already seen, with examples such as Guy Who Will Never Be Able To Use An RPN Calculator, where that kind of approach ends up.

Once you can convince teachers to take a modular approach to language features, and understand that the important thing is the concepts underneath the language constructs that particular languages express in different ways, then it’s reasonable to start asking whether doing initial instruction with less general (sometimes significantly less general) tools can be helpful. But that’s certainly not been proven; we use this approach in mathematics and I’m not even sure it’s a good idea there, given some of the results it’s achieved.

I think you just said “Python’s preferred formatting isn’t controversial, its use of indentation as syntax is.” Which is @EdS’s statement that “Python’s use of indentation is controversial” when expanded as the context indicates.

I agree that the arguments against Python’s use of indentation as structural syntax are essentially emotional. I do think there are some buried concerns in there that are real, but also simply no longer particularly relevant. Many old-timers view whitespace as somehow fragile, in large part because, historically, it was (in the same general way that non-printing characters were fragile, and for similar, although different, reasons). (See, for example, the contortions that Internet mail went through to preserve whitespace at the beginning and end of lines.) On the other hand, in 2020 (and even in the late 90s when Python burst onto the scene), we can generally assume that even our most binary-fragile files are reproduced with accuracy, and whitespace in text files is a non-problem, perhaps modulo line endings.

Regarding syntax choices like whitespace (or parentheses, or whatever!) in education, it really doesn’t matter. As long as the rules are reasonably simple and coherent (as they are in both Python and Lisp), youth and beginning programmers don’t know any better, so it doesn’t matter to them. They may be briefly confused when a program doesn’t run right because there are too few spaces at the beginning of a line, or because parentheses don’t balance, but assuming at least a little bit of help from the editing environment, they’ll rapidly make peace with the fact that that’s just The Way Things Are, and move on.

It’s the fogies that object. And, as you said, for emotional reasons. :slight_smile:

It bothers me ever so slightly that my aim was to be able to comment on Python without stirring up a great debate about indentation. If we can’t mention a subject - even guardedly - without provoking a reaction, that’s not great for the conversation. It would be good always to be bearing in mind the intent of the thread, and the enjoyment of the audience, and the improvement of the thread, rather than pursuing some instance of “someone is wrong on the internet.” The bigger thing to do, when someone is wrong, is to let them be wrong, and be interesting, or otherwise add to the conversation.

(It bothers me slightly that I even need to say this. It’s in the FAQ!)


I like Forth for the simple reason that there is no information hiding.
“-” pops the stack, subtracts, and pushes the result back on the stack. C++ could redefine
“-” to something like: if type is number, subtract; if type is eggs, subtract mod 12.

\ Forth: -- eggs left
eggs broken -   \ works
eggs - broken   \ is a gotcha if you just want to count the # of eggs
You do what you see, so if I know what it does, the computer
does too.
I have my own programming language, Bengol.
It is based on the idea that an expression is
exp := right_value
or right_value operand right_value …
or right_value operand ( …
simple to parse.
if_statement := if expression statements
end := elseif expression statements
or else statements
or endif
while := while expression statements repeat
I have more specific constructs rather than general ones,
so it is bootstrappable and uses a small amount of recursion.
I have yet to see a CPU other than my own design that I can port
it to. A 68000 might be portable, but I don’t have one.
I have to look at Python and indentation, but I use indentation to hint
at loop depth.
x x x IF
x x x ENDIF

foovar(int a b)
int h
  if b<0 printf(0 "debug h=%d \n" h) h=h*7
  elseif b=0 h = h+4 
 return h

Any more discussion can be a new topic.
(of course white space is ignored, but was there)
(edited with ‘pre’)
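A grammar shaped like the one above — one value, optionally followed by an operator and another expression, with no precedence levels at all — really is trivial to parse. Here’s a rough recursive-descent sketch in Python of just the expression rule; I don’t know Bengol’s internals, so all the names here are my own invention:

```python
import re

OPS = set("+-*/<>=")
TOKEN = re.compile(r"\d+|[A-Za-z_]\w*|[()+\-*/<>=]")

def tokenize(src):
    return TOKEN.findall(src)

class Parser:
    # exp := value | value op exp | value op ( exp ) ...
    def __init__(self, tokens):
        self.toks, self.pos = tokens, 0

    def peek(self):
        return self.toks[self.pos] if self.pos < len(self.toks) else None

    def take(self):
        tok = self.peek()
        self.pos += 1
        return tok

    def expr(self):
        # One value, then optionally "op expr": flat and right-leaning,
        # with no operator-precedence table anywhere.
        left = self.value()
        if self.peek() in OPS:
            return (self.take(), left, self.expr())
        return left

    def value(self):
        tok = self.take()
        if tok == "(":
            inner = self.expr()
            assert self.take() == ")", "unbalanced parenthesis"
            return inner
        return int(tok) if tok.isdigit() else tok

print(Parser(tokenize("1 + 2 * 3")).expr())   # ('+', 1, ('*', 2, 3))
```

Note how `1 + 2 * 3` comes out right-associated with no precedence — exactly the simplification that makes this style of grammar so easy to bootstrap.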

If you want to show a piece of code with the proper formatting, just surround it with <pre> and </pre>

About indentation, the first language I used with that feature was Occam. Inmos had its own “folding” editor which was usable with serial terminals and made it easier to edit Occam than a generic editor.

Well, first of all, this does not appear to me to be the standard “indentation” flamewar; that’s certainly not my intention and I’m not seeing anything of that nature from others when I look at their posts with even a small amount of charity.

But the commonly-encountered controversy over the issue does serve as a good example of exactly the kind of thing we need to be able to properly deconstruct if we’re going to be able to talk about the differences between computer languages at all, much less discuss what’s actually being taught when using them. Without being able to separate the technical and conventional elements in things like this, we can’t have a technical discussion about differences between languages.

Perhaps we can apply the principle of charity to posts and, when there’s doubt or just bad phrasing, attempt to tease out a technical argument rather than assuming we’re descending into the traditional flamewar on an issue.

Well I certainly hope that’s what Ed’s statement means! But I would like to make sure of that, because that is not the underlying conversation people are really having in most instances of this discussion. If we can move towards a more precise way of stating the issues we’re really talking about, I think that would help keep the conversation on track, and even better illuminate the real issues that I’d like to examine.

Indeed, there are a ton of real concerns about it. Your “whitespace fragility” one is an issue that hadn’t really occurred to me (beyond the tabs vs. spaces thing), but there are others embedded in Python’s particular choice of syntax and how it parses it, such as semicolons instead of newlines being usable to separate statements in some circumstances, but not others. There are a number of confusing inconsistencies in it, though of course one can debate how often one will come across these.

Haskell has a far better implementation of reading whitespace for grouping/separating definitions and expressions (one that even allows you to use braces and semicolons if you like!—in fact what it’s really doing is inserting them for you), so I think we can probably put Python’s implementation issues aside.

But this does bring up the more interesting issue of whether we might actually want that redundancy, for pedagogical benefits, in a language designed for teaching programming. My instinctive reaction is that it would actually be useful, bringing us to the same place where many Algol-style programmers already live, with automatic formatting tools detecting inconsistencies between the two ways of specifying structure.

Well, Python is not nearly as simple as it looks on the surface. (Lisp, at least in its simpler forms, more nearly is.) And while I agree that some syntactic issues are probably not too important, other ones may well have a deep effect on what starting programmers actually learn. For example, Lisp syntax makes it very clear that + is a function like any other, whereas Python hides that pretty deeply, to the point where they need special library support to be able to get hold of that function (i.e., reify it).
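For the curious, here’s what that “special library support” looks like in practice — in standard Python you reach for the `operator` module to get + as a first-class function you can pass around:

```python
from functools import reduce
from operator import add

# Infix + is syntax in Python; operator.add is the same operation
# reified as an ordinary first-class function value.
print(add(2, 3))                     # 5
print(reduce(add, [1, 2, 3, 4], 0))  # 10
```

Compare Lisp, where `(+ 1 2 3)` and `(reduce #'+ '(1 2 3))` use the very same `+` with no extra machinery.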

Another example would be Python’s def foo(): syntax for variable assignment being so different from foo =. (I think that many people’s immediate reaction to that statement would be "def foo(): is a function definition, not a variable assignment!" which is exactly my point.) That function names are just variables in Python is a great thing, but perhaps deliberately obscuring that is not such a good idea when teaching lest it lead learners to believe that functions are (and perhaps even must be) something qualitatively different from other kinds of values.
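A quick sketch of the point in plain Python (names here are mine, purely for illustration) — `def` really does just bind a name, and that name behaves like any other variable:

```python
def foo():
    return 42

bar = foo          # the function object is a value; bind it to a second name
foo = "rebound"    # ...and "foo" itself can be rebound to anything else

print(bar())       # 42 -- the function is still reachable through bar
print(foo)         # rebound
```

The surface syntax makes `def foo():` and `foo =` look like different kinds of statements, which is exactly what can mislead a learner into thinking functions are a different kind of thing from other values.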

I hope it’s OK to add something after so long. (The forum software wanted me to be really sure about it before posting a reply. :wink: )

I started learning FORTH as a teenager and never really got comfortable with it, though I feel I understand why it’s sometimes referred to as a meta-language. I encountered LISP at a similar time and my experience with it was similar to my FORTH experience. Both of them made me think about things differently, but I wouldn’t use them in day-to-day things, or try to teach them to children.

When I first encountered Python (version 1.5.2) I found the use of indentation to be really neat and (to me) intuitive. The interactive prompt and built-in containers (lists, dictionaries and tuples at the time) were a breath of fresh air to me. If only BASIC had included those.

These days, I think Python has strayed too far from its roots to be a good teaching language. Some language offshoots have tried to go back to basics, such as Pyxie and Snek which are both aimed at robots and microcontrollers. MicroPython and its own offshoot, CircuitPython, aim to cover similar use cases, but then they’re bringing in a full-ish Python language implementation with all that implies.


Another late addition to this thread; however, last week I was involved in a holiday activity group(s) for (slightly disadvantaged) children and computers - the idea was to teach, or at least demonstrate, the (very simple) principles of computer animation.

We used ancient computing - namely BBC Micros (actually emulated on a Pi, but that’s irrelevant here).

I was leading the sessions I was involved with and as well as the teaching and demonstrating side, I was curious to see how today’s “swipe and tap” generation would cope with keyboards and text.

They did OK. Just OK - many issues, but we got there.

The issue (for me) was mixing the animation side with the ancient coding side. BBC Basic is not forgiving - neither are most 8-bit BASICs, and the concepts of how to use the keyboard and its unique copy cursor were alien to the children.

However we got through - the question might be would I do this again? Well the company who developed all this are certainly going to do it again and yes, I’d do it again, however I have ideas on how to do it better - which I’m working on then will test some ideas with the folks who run it.




Curiously this is similar to me - After starting with BASIC and moving to the Apple II in '78, I wanted more speed (as we all did!) so as well as assembler, Forth was looked at… And while I did some stuff with it, and it was faster, like you, I was never really comfortable with it… Some years later I was even paid to write some (a lot!) of Forth too, but again, while I did it, I just never got into it…

I’ve found the opposite - being constrained by forced indentation and coding style was not for me, and Python isn’t for me.

Most BASICs have allowed a form of indentation - some will preserve spaces and I’ve seen some people use colons (statement separators) to effect indentation. I’ve also found some older programs (written in the 70’s) to be quite well structured - written by former Algol coders, I wonder?

My main bugbear is the startup costs - Booting a full-blown Linux or MS windows just to get into the Python IDE or separate editor, even if the target is a $5 microcontroller. It’s a long way from the one-second to BASIC prompt of the 70’s and 80’s …


I thought it was more like ~15 seconds as BASIC checked memory on the CoCo or C64, and a few minutes for the cassette tape to load something.
I do remember getting a new PC and DOS with a 20 MB hard drive was impressive, for speed.
Sadly, programs never seemed to run as fast after that. :frowning:

From power-on to BASIC: The BBC Micro was a fraction over 1 second. The Apple II was under a second.

The Apple II then (optionally) had to boot DOS which we got down to 5 seconds or so, when faster versions came to light. BBC Micro had its “DOS” in ROM and was ready to go. (If you had disks, yes, tapes at 1200 baud were much slower to load programs from but BASIC was in ROM and ready to go).

The PET and TRS-80 were more or less in the one second region too from what I recall. Same for VIC-20 and C64 - although then program loading was then slower from tape or disk. The longest wait was probably the old CRTs or TV’s warming up…

Old IBM PCs with BASIC on ROM were fast to boot to BASIC but that was all. They really still had to boot DOS which took a few seconds more. (And latterly MS Windows which now takes minutes. Even Linux now takes minutes to boot on some systems )-:


I’ve tried CircuitPython on a couple of microcontrollers and there’s no real cost in terms of start-up time. I suppose that it will run close to the metal on various Raspberry Pi boards, too. The downloads include ones for those boards and seem to be kernel.img files that will run in place of a Linux kernel.

I’ll download the ones for the Pi Zero and see what the experience is like.

I had the same thoughts.

I think you may have recently used the boot-to-BASIC flavour of RISC OS, on the Pi, @drogon - I think that boots in a handful of seconds, under 10 perhaps? (Edit: yes, turns out I timed it at 6)

There’s also BBC Basic for the Pi Pico, an even cheaper platform, and that boots within a second (unless it has a timeout configured, to wait for serial input, which some builds do.)
