The singular problem with this is that computing is neither simple nor understandable. Not for novices. Not for people who “don’t think like a computer”. Computers are the most pig-headed and opinionated things on the planet. And they will always be this way.
The only way to make them “simpler” and “more understandable” is to pile abstraction upon abstraction upon abstraction on top of them. That’s why a billion people can use Facebook on their phone without knowing a bit from a byte.
Because while a “computer” (i.e. a CPU) may (may) be simple, computing is not.
It’s a wide leap from getting a CPU to make an LED blink to doing what the vast majority of people want to do with computing today.
Machine language does not “reveal” computing, or make it more understandable. Especially today. Look at the Herculean efforts of projects like the Visual 6502 to provide total transparency into the computing process, and the result is still not very approachable. More to the point, it doesn’t remotely represent what a modern CPU actually does. The simple fact that “machine language” is itself a high-level language, implemented on top of something even more primitive (micro-operations and microcode) inside modern microprocessors, makes the head swim enough as it is.
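To make that concrete, here is a toy sketch, not anything from the Visual 6502 project itself: a few bytes of 6502 machine code and a minimal decoder for just those opcodes. The opcodes are the standard 6502 ones, but the program and its load address are made up for this example.

```python
# A toy illustration: a tiny 6502 machine-language program and a minimal
# "disassembler" that only knows the four opcodes it uses.

# An endless loop that clears one memory location and then increments it
# forever, assumed (for this example) to be loaded at address $0600.
PROGRAM = bytes.fromhex("A9 00 8D 00 02 EE 00 02 4C 05 06")
LOAD_ADDRESS = 0x0600

# opcode -> (mnemonic, addressing mode)
OPCODES = {
    0xA9: ("LDA", "imm"),   # load accumulator with an immediate value
    0x8D: ("STA", "abs"),   # store accumulator to an absolute address
    0xEE: ("INC", "abs"),   # increment the byte at an absolute address
    0x4C: ("JMP", "abs"),   # jump to an absolute address
}

def disassemble(code: bytes, origin: int) -> None:
    pc = 0
    while pc < len(code):
        mnemonic, mode = OPCODES[code[pc]]
        length = 2 if mode == "imm" else 3
        if mode == "imm":
            arg = f"#${code[pc + 1]:02X}"
        else:  # absolute operands are stored little-endian: low byte first
            arg = f"${code[pc + 2]:02X}{code[pc + 1]:02X}"
        raw = " ".join(f"{b:02X}" for b in code[pc:pc + length])
        print(f"{origin + pc:04X}  {raw:<9}  {mnemonic} {arg}")
        pc += length

disassemble(PROGRAM, LOAD_ADDRESS)
```

Even fully decoded, the listing only tells you that one byte of memory gets incremented forever. It says nothing about pipelines, caches, microcode, or anything else a modern machine actually does to run it.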
This is why kids are not taught machine language. They’re taught at a far, far higher level where the underlying CPU is utterly buried (as demonstrated by the fact that their tools run on lots of different CPUs and environments). They’re taught logic, sequencing, and syntax.
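That higher level looks something like the sketch below: a few lines of Python (one plausible stand-in for a teaching language, chosen here for illustration) that exercise logic, sequencing, and syntax, and that run unchanged on an x86 laptop, an ARM phone, or a RISC-V board, with no hint of which CPU is underneath.

```python
# Logic, sequencing, and syntax, with the CPU entirely out of sight.
# Runs identically on any machine that has a Python interpreter.

def classify(number: int) -> str:
    """Return a word describing a number: pure logic, no hardware in view."""
    if number < 0:
        return "negative"
    if number % 2 == 0:
        return "even"
    return "odd"

for n in range(-2, 5):          # sequencing: do this for each value, in order
    print(n, "is", classify(n))
```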
I am not a modern CPU designer. But I believe that even they must work at a rather high level: “coding” new CPUs in higher-level hardware description languages and then compiling them down into silicon, rather than wiring together individual gates and transistors. Perhaps hand-tuned gate design still happens at the very tiny microcontroller level, but even there most of the work is high level, when it seems nearly every peripheral chip is actually an embedded micro-microcontroller rather than a vast array of hand-tuned gate logic.
In the end, being successful with computing at most levels simply no longer requires intimate low-level knowledge, and even the definition of a computer is in flux. Our CPUs have sideline management CPUs built into them, running their own OSes. A huge amount of computing today runs in virtual environments. When that black square thing on the circuit board has 32 cores and can be “patched”, what does that even mean in terms of the CPUs of old?
So, frankly, plugging “7F”, or anything else, into a computer doesn’t really make it more understandable or useful. It’s borderline-interesting knowledge for those who want to go there, but it’s certainly not the place to start.