Bootstrapping the GNU Compiler Collection

I wonder what C compiler GCC was originally written in? CP/M machines often had 48 KB of memory, just ample for a compiler.

I have felt GCC is like Mozilla: rewrite something every few months, add new marketing features, and drop older ones. The same goes for my free FPGA software.

I suspect most things in the early '60s were being done for the first time, on machines with so little memory that everything was swapped to disk. TSS/8 for the PDP-8 was a good example. Writing code back then was different from today, where you use memory without care, ignoring leaks and virtual memory thrashing.

I guess having a GUI of any kind doubles your program size for stupid I/O, and triples the debugging time.
Ben.

We’re drifting off topic to be sure.

GCC was written to be compiled with pcc, the “portable C compiler” that was readily available on Unix systems. It was meant to be compiled once, into a new binary, then compiled again with itself. I’ve done this myself back in the day.

Nowadays, GCC can host itself, naturally, but it’s still able to be compiled with very old versions of the compiler.

Versions of GCC prior to 11 also allow bootstrapping with an ISO C++98 compiler, versions of GCC prior to 4.8 also allow bootstrapping with an ISO C89 compiler, and versions of GCC prior to 3.4 also allow bootstrapping with a traditional (K&R) C compiler.

So, any GCC prior to 3.4 can be compiled with a K&R compiler.
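
For anyone who hasn’t seen it, the practical difference is mostly the function definition style. A small, hypothetical example (the function names are made up) showing the traditional (K&R) form that pcc-era compilers understood, next to the ISO C89 prototype form:

```c
/* Traditional (K&R) C: no prototypes; parameter types are
   declared between the parameter list and the function body. */
int add(a, b)
int a;
int b;
{
    return a + b;
}

/* ISO C89 and later: types appear in the parameter list itself,
   so the compiler can check calls against the prototype. */
int add89(int a, int b)
{
    return a + b;
}
```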

So, if you have pcc (a K&R compiler), you can build the latest version of GCC by bootstrapping the earlier versions of GCC until you can build the current compiler.

There is an effort to create a very simple, auditable tool stack to build tools like GCC. They start with the simplest of languages (such as very crude Lisps), in which something like an early K&R C compiler can be written. These tools don’t have to be fast or generate good code; they just have to be functional. They don’t even have to produce machine code if they can run on an audited VM. A crude Lisp builds a crude C compiler, which builds an early GCC, and then you’re off to the races.

Why do they want to do this? Because of Ken Thompson’s Reflections on Trusting Trust.
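
The gist of the paper is that a compiler binary can insert a backdoor into the programs it compiles, and into new copies of itself, even though the compiler’s source code is clean. As a rough, purely illustrative sketch (the marker string and the injected line are invented for this example), a hostile “compiler” could behave something like this:

```c
/* Toy stand-in for a compromised compiler: it copies its input through
 * unchanged, except that when it spots a (made-up) marker it silently
 * injects an extra line of code.  The real attack also re-injects this
 * logic whenever it compiles the compiler itself, so the backdoor
 * survives even though no source file shows it. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    char line[1024];

    while (fgets(line, sizeof line, stdin)) {
        fputs(line, stdout);
        /* "check_password(" is a hypothetical marker for this sketch. */
        if (strstr(line, "check_password("))
            fputs("    if (strcmp(pw, \"magic\") == 0) return 1;\n", stdout);
    }
    return 0;
}
```

Bootstrapping from a tiny, human-auditable seed is one way to break that cycle of having to trust an opaque binary.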

You could write a compiler for a machine like that, but they’re not very good. There has been lots of discussion of how awful Z80 C compilers are (partly due to the Z80 being a Z80, and partly due to the difficulty of writing a good compiler on a system as memory constrained as a stock CP/M machine).

I don’t know of any real features GCC has dropped. At most they may have dropped support for a CPU that hadn’t been maintained in 10 years. There’s a vast body of legacy code written against GCC that nobody wants to see broken.


Maintained for 10 years? Ha ha. PDP-8s from the late 1960s are still going strong. The problem with the 8-bitters is that they were designed as microcontrollers, not general-purpose computers like the PDP-11 or the IBM 360 (not counting decimal math here), and they don’t have the opcode space for multiple data-size operations.

  1. Correct code is what is wanted.
  2. Faster code means caching of data, and that may make one architecture perform better than another for some problems, but not all of them.

I am a fan of 36 bits, but that is way off topic.
Ben.

I don’t know anybody who ignores leaks or virtual memory thrashing; both are pretty painful. These days a lot of people go so far as to choose languages that can’t leak, even at a fairly substantial performance penalty.

As for using memory without care, well, everybody does that when the amount of memory they’re using is small relative to what’s available. You could accuse the first-generation MSX machines of 1983 (32K RAM, 16K ROM) of using memory without care, because their code wouldn’t fit in, and their data would use up most of, the RAM of a typical machine from 1977.

But that’s the smart way to do it. Sure, for a parser I was working on the other day I could have built a system that would let me read in just a few characters at a time, instead of slurping the entire file (dozens of kilobytes!) into memory. But spending extra time to put extra code and, no doubt, extra bugs into the system doesn’t seem to help anyone.
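
For the curious, “slurping” a file really is only a few lines. A minimal sketch in C (the function name and the error handling are mine, not from the parser in question):

```c
/* Read an entire file into a freshly allocated, NUL-terminated buffer.
 * Returns NULL on any failure; the caller frees the result. */
#include <stdio.h>
#include <stdlib.h>

char *slurp(const char *path, long *len_out)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return NULL;

    fseek(f, 0, SEEK_END);          /* find the file size...            */
    long len = ftell(f);
    rewind(f);                      /* ...then go back and read it all  */
    if (len < 0) {
        fclose(f);
        return NULL;
    }

    char *buf = malloc(len + 1);
    if (buf && fread(buf, 1, (size_t)len, f) == (size_t)len) {
        buf[len] = '\0';
        if (len_out)
            *len_out = len;
    } else {
        free(buf);
        buf = NULL;
    }
    fclose(f);
    return buf;
}
```

The chunked-reading alternative buys you nothing until your files stop fitting comfortably in RAM; it just adds code and bugs.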

But this is a pretty interesting topic in its own right, so I’ve split it into another thread! :slight_smile:
