History

That language is an instrument of human reason, and not merely a medium for the expression of thought, is a truth generally admitted. ~ George Boole

All the early computers were maddeningly difficult to program. They were instructed in machine code: binary words forming an inhuman language that only the computer’s processor could readily understand.

The first step beyond machine code was assembly language, pioneered in 1947 by English computer scientist Kathleen Booth. Comprising numbers, letters, symbols, and short instruction words, such as “add,” assembly language was one small step, and yet one giant leap, away from machine code. An assembler translated an assembly-language program into machine code, which then had to be entered into the computer by an operator.
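The heart of what an assembler did can be sketched in C (used here, as in the examples that follow, purely for legibility; the mnemonics and opcodes below are invented, not those of any real machine): look up a human-readable instruction word and emit the binary word the processor expects.

```c
#include <stdio.h>
#include <string.h>

/* Toy sketch of an assembler's core: a table mapping invented
   mnemonics to invented binary opcodes. */
struct op { const char *mnemonic; unsigned code; };

static const struct op table[] = {
    { "LOAD",  0x1 },
    { "ADD",   0x2 },
    { "STORE", 0x3 },
};

static unsigned assemble(const char *mnemonic) {
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
        if (strcmp(table[i].mnemonic, mnemonic) == 0)
            return table[i].code;
    return 0; /* unknown mnemonic */
}

int main(void) {
    /* "ADD" becomes the binary word 2 */
    printf("ADD assembles to opcode %u\n", assemble("ADD"));
    return 0;
}
```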

Most early computers were employed for scientific and engineering calculations, which involved numbers spanning a vast range of magnitudes. To manage these numbers, programmers used a floating-point representation. Computers could not yet perform floating-point operations in hardware, so programmers had to spend considerable time writing subroutines (program segments) that instructed machines on these operations.
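A toy sketch of such a subroutine, in C with an invented mantissa-and-exponent representation, suggests the hand-rolled arithmetic involved (normalization and rounding, the genuinely hard parts, are omitted):

```c
#include <stdio.h>

/* Invented software floating-point format: value = mantissa * 10^exponent */
struct soft_float { long mantissa; int exponent; };

/* Multiply two software floats: multiply mantissas, add exponents. */
static struct soft_float sf_mul(struct soft_float a, struct soft_float b) {
    struct soft_float r = { a.mantissa * b.mantissa, a.exponent + b.exponent };
    return r;
}

int main(void) {
    struct soft_float x = { 15, 2 };  /* 15 * 10^2  = 1500 */
    struct soft_float y = { 2, -1 };  /*  2 * 10^-1 = 0.2  */
    struct soft_float z = sf_mul(x, y);
    printf("%ld * 10^%d\n", z.mantissa, z.exponent); /* 30 * 10^1 = 300 */
    return 0;
}
```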

In 1951, American computer programmer Grace Hopper conceived of the compiler: a program that could translate human-readable program instructions into machine-readable binary. Her A-0 system, completed in 1952, put the idea into practice. Hopper was one of the very few women pioneers in computing.

Fortran was developed by IBM in the mid-1950s. The language – its name a contraction of “formula translation” – aimed at easy translation of mathematical formulas into code. It was the first compiled language to achieve wide use. Immediately popular, Fortran was extended by computer manufacturers with their own customizations, resulting in a plethora of variants. This typically tied programs to a specific manufacturer’s machine.

It was not until 1966 that Fortran was standardized. Even then, variants continued to spring up like weeds. New standards followed in 1977, 1990, 1995, 2003, 2008, and 2018.

Hopper advocated machine-independent programming languages, and was instrumental in the development of Cobol, beginning in 1959. Cobol was one of the first high-level programming languages.

Cobol was something of a reaction to Fortran, which was oriented toward scientific and engineering work. Cobol was created as a portable, English-like language for data processing at the behest of the US Defense Department, which then insisted that computer manufacturers provide it. Whence Cobol’s widespread adoption by businesses, despite its being verbose and ill-suited to structured programming. The common result was monolithic, incomprehensible programs that were exceedingly difficult to maintain.

The use of Cobol cripples the mind; its teaching should therefore be regarded as a criminal offense. ~ Edsger Dijkstra

Early languages such as Fortran and Cobol were ineptly designed. As programs written in these languages grew in complexity, it became increasingly difficult to debug them, improve or add features, or keep them working when the hardware changed.

There were no theoretical constructs surrounding computer programming in the 1950s: just languages that arose ad hoc from concepts of getting a certain type of job done.

Cobol is a case in point. The driving idea behind Cobol was to make programs easier to read. Despite the fond hope that readability would mystically engender maintainability, that was not the case.

Fortran and Cobol both had a goto statement that allowed a one-way transfer of control to a distant line of code. goto was a convenient shortcut for programmers trying to get their programs working as quickly as possible, but its jumps wrecked program longevity by rendering code unmaintainable.

The quality of programmers is a decreasing function of the density of go to statements in the programs they produce. The use of the go to statement has disastrous effects. The go to statement should be abolished from all “higher level” programming languages (i.e. everything except, perhaps, plain machine code). ~ Edsger Dijkstra in 1968

Through the widespread use of goto, Fortran became derisively known as a “write-only” language: easy to write, nearly impossible to read. The “spaghetti code” it produced in careless hands begat the structured programming languages.
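The problem is easy to demonstrate. In the hypothetical C sketch below (C standing in for the Fortran and Cobol of the era, since C retains a goto), the same search is written twice: once with jumps, once with structured control flow.

```c
#include <stdio.h>

/* goto version: control leaps between labels; as such code grows,
   the thread of execution becomes ever harder to follow */
static int find_goto(const int *a, int n, int key) {
    int i = 0;
loop:
    if (i >= n) goto missing;
    if (a[i] == key) goto found;
    i++;
    goto loop;
found:
    return i;
missing:
    return -1;
}

/* structured version: one loop, its shape visible at a glance */
static int find_structured(const int *a, int n, int key) {
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;
    return -1;
}

int main(void) {
    int data[] = { 4, 8, 15, 16 };
    /* both print 2: identical behavior, very different readability */
    printf("%d %d\n", find_goto(data, 4, 15), find_structured(data, 4, 15));
    return 0;
}
```

The two functions behave identically; only the second reveals its intent without tracing every jump.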

[ Code maintenance drove software language evolution. ]

Procedural integrity was the paradigm behind structured programming: a regime that aimed at code clarity, in reaction to the damage wrought by goto.

The 1960 Algol programming language supported block structures, delimited by begin and end statements. Algol 60 was the first language that offered localized lexical scope for variables and nested function definitions: in short, execution compartmentalization.
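Algol’s begin and end survive as the braces of later languages. A minimal C sketch of the localized lexical scope Algol 60 introduced:

```c
#include <stdio.h>

int main(void) {
    int total = 0;           /* visible throughout main */
    {                        /* inner block: Algol's begin ... end */
        int step = 5;        /* exists only inside this block */
        total += step;
    }
    /* step is out of scope here; referencing it would not compile */
    printf("%d\n", total);   /* prints 5 */
    return 0;
}
```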

Fortran and Cobol eventually acquired structured programming facilities. So did Basic, Fortran’s simple offspring.

In forcing programs to execute in a purely procedural manner – one routine calling another, with a subroutine returning to its caller upon its completion – structured programming centered on control structures.

Control structure is merely one simple issue, compared to questions of abstract data structure. ~ American computer scientist Donald Knuth

Algol did not address an equally important issue: localizing data to help ensure its integrity. But its offspring did.

The stellar spawn of structured programming was Pascal, designed by Swiss computer scientist Niklaus Wirth in 1968–1969. Intended as a language for teaching programming, Pascal was something of a straitjacket in its inflexibility. Pascal had strong typing: data were confined to their declared types unless explicitly converted.
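What strong typing demands can be suggested in C (a loosely typed stand-in, used here only for consistency with the earlier sketches; Pascal was stricter still): mixing types is legitimate only when the conversion is spelled out.

```c
#include <stdio.h>

int main(void) {
    double price = 19.99;
    /* Pascal would reject assigning a real to an integer outright;
       the programmer must state the intent explicitly
       (trunc or round in Pascal; the cast below in C). */
    int whole = (int)price;   /* explicit conversion: truncates to 19 */
    printf("%d\n", whole);
    return 0;
}
```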

Steve Jobs was so impressed with Pascal’s programming etiquette that Apple made it the development language of choice for the Macintosh. (For efficiency, the Mac OS itself was largely programmed in assembly language.)

From a historical perspective, Pascal was a dead end the day of its conception. (Wirth did not think Pascal’s paradigm a dead end: he went on to design Modula-2 (1977–1985) and Oberon (1986), Pascal descendants that were instantly celebrated and quickly forgotten.) Instead, the descendants of Algol which bore the most fruit were a throwback toward assembly, and a new concept altogether: objects.

While Wirth was pondering how to impart proper programming practices to pupils, American programmer Dennis Ritchie had more nuts-and-bolts concerns.