Stack depth as a measure of progress in software

I’ve been thinking recently about software maintainability and how often we just start over in software. Recall the idea from Structure and Interpretation of Computer Programs that the key tool of computer science is building up black-box abstractions. A complete abstraction layer lets us leave the prior vocabulary behind altogether, giving us a higher-order language with the same or sufficient expressive power.
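
To make the black-box idea concrete, here is a minimal sketch in Python (my own toy example, loosely in the spirit of SICP, not taken from the book): each function speaks only the vocabulary of the layer below it, and whoever calls the top layer never needs to know how the lower ones work.

```python
# Level A: a primitive operation we treat as given.
def average(x, y):
    return (x + y) / 2

# Level B: built only from level A's vocabulary.
def improve(guess, n):
    """One step of Newton's method for the square root of n."""
    return average(guess, n / guess)

# Level C: callers only ever say "sqrt"; improve() and average()
# are now black boxes to them.
def sqrt(n, tolerance=1e-9):
    guess = 1.0
    while abs(guess * guess - n) > tolerance:
        guess = improve(guess, n)
    return guess

print(sqrt(2))  # ~1.41421356...
```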

I propose that the number of abstraction layers stacked and put to actual use defines a metric for the progress of software development.

Let’s start at the physical “hardware” level, which we can call level 0. A human running an algorithm in their head or on a piece of paper is operating at level 0. Similarly, any Rube Goldberg-style machine set up in an ad hoc way from arbitrary pieces is still level 0. Yes, an abacus or similar primitive hardware is also level 0: there is no abstraction language, it’s just a memory aid, and you have to change the hardware directly to operate on it.

1801

It all starts in 1801, when Joseph-Marie Jacquard invents punch cards. Punch cards allowed humans to program in the language of holes: different programs could be created and edited on the same unaltered hardware.

~1952

Circa 1952, Nathaniel Rochester writes the first symbolic assembler, for the IBM 701. Even with all the calculators and the Turing machine that had been invented since punch cards, this is the first case of a level 2 abstraction. One could argue that assembly language translates to machine code rather directly, but in practice it was a significant step forward: the alphanumeric mnemonics and the ease of development were key.
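
To show how direct the assembly-to-machine-code mapping is, here is a toy sketch in Python (an imaginary machine with made-up mnemonics and opcodes, nothing to do with the IBM 701): the “assembler” is little more than a table lookup plus operand encoding, yet the mnemonic listing is far easier for a human to write and read than the raw bytes.

```python
# A toy symbolic assembler for an imaginary 8-bit machine.
# The mnemonics and opcodes are invented for illustration only.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(source: str) -> bytes:
    """Translate lines like 'LOAD 7' into machine bytes."""
    program = bytearray()
    for line in source.strip().splitlines():
        mnemonic, *operands = line.split()
        program.append(OPCODES[mnemonic])           # one byte for the opcode
        program.extend(int(op) for op in operands)  # one byte per operand
    return bytes(program)

listing = """
LOAD 7
ADD 5
STORE 0
HALT
"""
print(assemble(listing).hex(" "))  # 01 07 02 05 03 00 ff
```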

1952

Grace Hopper writes the A-0 System, the first compiler, for the UNIVAC I. I do not have sources to verify that it was written in assembly language, and it seems too early for that.

1957

John W. Backus leads the development of the Fortran compiler, arguably the first complete compiler. Given that IBM had introduced the first assembler, it was likely written in assembly; however, it probably produced machine code rather than assembly code, so in that sense it could just as well be regarded as another direct translator to machine code, like the assembler before it.

Very shortly afterwards, however, the Fortran compiler was used to create the Fortran List Processing Language, or FLPL, a precursor to LISP.

1991

Guido van Rossum invents Python, one of the most famous interpreted languages today.

More importantly, this is when HTTP is first documented; it goes on to become the communications protocol that enables modern browsers.

Browsers

The first browsers appeared in the eighties, way before HTTP. However, they were mostly for displaying hypertext.

1994

James Gosling invents Java. Here we have a famous language compiled to bytecode and meant to run on a virtual machine.

1995

Soon after, JavaScript follows suit, competing with Java in the browser.

On programming languages

I described some of these later programming languages because they represent important ideas and developments; however, any programming language can in principle be translated to machine code directly and with relative ease. Programming languages do not really build on top of each other so much as their compilers are inevitably implemented in other languages. They are mostly independent efforts, and no language has yet rendered all of its predecessors irrelevant.

CoffeeScript?

Nope.

What’s next?

I don’t really know. Programming by simply telling the computer what you want? Programming by merely thinking about what the computer should do? These would definitely qualify as paradigm shifts. Textual natural-language input is probably the first thing we will see.

Notice

Now, I know very little about software history, and honestly I did not conduct enough research to present these levels with any certainty. I would love to make this more complete, so please reach out if you know of earlier manifestations of any of the levels. Nor do I believe, especially after going through the history of computing, that the concept of an abstraction layer number is well defined. Nonetheless, it is a curious notion.

Also, I feel I have drifted from counting layers to listing paradigms as I ran into some fallacies along the way. Hopefully this is still interesting to think about.

Sources

Wikipedia, mostly.