Hardware
The Sumerians invented the abacus ~2700 BCE. This finger-fueled calculator was the progenitor of an endless array of devices for intellectual work.
In 1614, Scottish mathematician John Napier published his discovery of logarithms, which were intended to simplify calculations: specifically, to reduce multiplication and division – the 2 most difficult arithmetic operations – to addition and subtraction. Though the principle was straightforward, calculating logs with ease required lookup tables.
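To illustrate the principle, here is a minimal numeric sketch (hypothetical values; base-10 logs are used for simplicity, whereas Napier’s own logarithms were scaled differently):

```python
import math

# Multiplication reduced to addition: log(a*b) = log(a) + log(b).
a, b = 347.0, 62.0

log_sum = math.log10(a) + math.log10(b)  # add the logs (looked up in tables, historically)
product = 10 ** log_sum                  # convert the sum back out of log space

print(product)  # ~21514.0
print(a * b)    # 21514.0, for comparison
```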
Logarithms were a lasting contribution to mathematics. But, to his contemporaries, Napier was more celebrated for his bones. Napier’s bones were a set of 4-sided rods that afforded multiplication and division by physical manipulation. They came to be known as bones because the most expensive models were made of ivory. Square roots and cube roots could also be sussed with Napier’s bones.
Napier’s bones were warmly welcomed by the mathematically challenged throughout Europe, which was practically everyone. At the time, even the lower rungs of the multiplication table taxed the well-educated.
English mathematician and Anglican minister William Oughtred invented the slide rule in the early 17th century. It was used primarily for multiplication and division, though it also functioned for logs, roots, and trigonometry. Welsh clergyman and mathematician Edmund Gunter had developed the logarithmic scales upon which the slide rule was based.
The predecessor to the slide rule was Gunter’s rule: a large engraved plane scale that helped answer navigational and trigonometry questions, aided by a pair of compasses. Gunter also devised a device for calculating logarithmic tangents, something a slide rule could not easily do.
Gunter’s interest in geometry led him to develop a method for surveying using triangulation. To aid in that task, he invented an instrument – Gunter’s quadrant – which let one figure many common problems associated with spheres, such as taking the altitude of an object in degrees and figuring the hour of the day.
I have constructed a machine consisting of 11 complete and 6 incomplete (actually “mutilated”) sprocket wheels which can calculate. You would burst out laughing if you were present to see how it carries by itself from one column of tens to the next or borrows from them during subtraction. ~ Wilhelm Schickard in a 1623 letter to his friend, German mathematician and astronomer Johannes Kepler
German polymath Wilhelm Schickard invented the mechanical calculator in 1623. French mathematician, physicist, engineer, and Christian philosopher Blaise Pascal, who is generally credited with this invention, created his 1st calculator in 1642.
A prodigy, Pascal was a piece of work. His 1st essay on conic sections was so precocious that French mathematician and philosopher René Descartes was dismayed with disbelief that such mathematical matters “would occur to a 16-year-old child.”
A high-strung perfectionist, Pascal labored on his calculator prototype for 3 years, experimenting with different designs, materials, and components. The resultant Pascaline was conceptually more ambitious than Schickard’s contraption, but the German’s mechanism worked perfectly, whereas Pascal’s Pascaline was problematic.
Besides producing several seminal mathematical treatises in his 20s, Pascal demonstrated the existence of atmospheric pressure and vacuums. In his 30s he invented the syringe and the hydraulic press, and enunciated the basic principle of hydraulics, now known as Pascal’s principle.
Along with French mathematician Pierre de Fermat, Pascal laid the foundations of probability theory. Pascal’s project began as a favor for a card-playing nobleman who wanted to know the odds of a draw.
Pascal was also interested in worldly affairs. Shortly before he died at 39, in great pain from ulcers and stomach cancer, Pascal and other farsighted Parisians established one of the earliest public transport systems in Europe: a bus line in central Paris.
At 32, Pascal entered a Jansenist convent outside Paris. Jansenism was a mostly French Catholic theological movement that emphasized original sin, human depravity, predestination, and the necessity of divine grace.
Pascal’s extreme religiosity was fueled by his repressed homosexuality and agonizingly poor health. He flagellated himself for more than his share of sins.
At the behest of his Jansenist order, Pascal generally abstained from scientific pursuits, devoting himself to castigating atheists and Jesuits in religious tracts that were considered masterpieces of prose. Thus, Pascal positioned himself as one of science’s greatest might-have-beens.
The 3rd 17th-century calculator was concocted by German mathematician and philosopher Gottfried von Leibniz, who was one of the first Western mathematicians to study the binary system of enumeration. It was this shift in perspective that enabled Leibniz to progress where others had stumbled.
Binary has only 2 digits: 0 and 1. Binary would find eternal fame as the basis of electronic computers, whose memory was based upon the presence or absence of electrons. The term bit is a contraction of “binary digit.”
To Leibniz, the significance of binary math was more religious than practical. He regarded binary as a natural proof of the existence of God, arguing that the all-knowing One had created existence out of naught.
Leibniz’s mechanical reckoner spawned a swarm of imitators. Almost every mechanical calculator built in the next 150 years was based upon Leibniz’s device.
The 1st mass-produced calculator appeared in 1820. The Arithmometer was invented by French inventor and entrepreneur Charles Xavier Thomas de Colmar. It was dependable enough to be used in government and commercial enterprises. The Arithmometer was only one of hundreds of mechanical inventions ushered in by industrialization, as the commercial world moved at a faster and harsher pace.
Since the advent of logs, the tools of the trade of those who worked with figures were mathematical tables. These were indispensable for finance, science, navigation, engineering, surveying, and other fields. There were tables for interest rates, square and cube roots, hyperbolic and exponential functions, and mathematical constants, like Bernoulli numbers and the per-pound price of meat at the butcher. Many mathematicians devoted the greater part of their careers to tabular calculation.
The need for accurate tables became a matter of state concern. France had a national project for it that took 2 years, resulting in 2 handwritten copies of 17 volumes of tables. For fear of misprints, the tables were never published. Instead, they were stored in a library in Paris, where anyone could consult them.
Despite the cost and effort that went into making mathematical tables, they were invariably erroneous. Britain’s navigational bible was sprinkled with mistakes grievous enough to have ships run aground and get lost at sea.
Mathematicians were stymied for a remedy. Then a young English polymath, Charles Babbage, lit upon a solution: a calculating gadget he called the Difference Engine.
Powered by a steam engine, the Difference Engine was designed to calculate tables by the method of constant differences, and record the results on metal plates, from which results could be printed; thus, the gremlins of typographical errors could be eliminated.
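A rough sketch of the method of constant differences, using an arbitrary polynomial rather than any table Babbage actually targeted: for a degree-n polynomial the n-th differences are constant, so every new entry can be produced by additions alone – exactly the sort of work gears can do.

```python
# Tabulate f(x) = 2x^2 + 3x + 5 using only additions, as a difference engine would.
f = lambda x: 2 * x**2 + 3 * x + 5

value = f(0)              # first table entry
d1 = f(1) - f(0)          # first difference
d2 = (f(2) - f(1)) - d1   # second difference (constant for a quadratic)

table = [value]
for _ in range(6):
    value += d1           # next entry by addition alone
    d1 += d2              # update the running first difference
    table.append(value)

print(table)  # [5, 10, 19, 32, 49, 70, 95] -- matches f(0)..f(6)
```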
Babbage built a proof-of-concept prototype that demonstrated the feasibility of his design. A full-fledged version would have required thousands of precision gears, axles, and other parts that were far beyond his budget.
Babbage managed to secure funding from the British government: an unprecedented move in support of private enterprise at the time. But through his own lack of diligence, along with a petty, corrupt partner, Babbage blew his chance. In 1842, 19 years later, the venture was officially canceled.
If the Difference Engine had been built, it would have stood over 3 meters high and wide, 1.5 meters deep, and weighed 1.8 tonnes, filled with an intricate array of machinery connected to 7 main vertical axles. Its calculations would have been slow and cumbersome, but still preferable to pen reckoning.
In 1834, Babbage had a vision of a machine that could solve any mathematical problem. He produced the first workable design of his Analytical Engine by mid-1836, then tinkered with the idea off and on for the rest of his life, producing 6,500 pages of notes, 300 engineering drawings, and 650 charts. It was a dream Babbage never realized.
In its supposed heyday, the Difference Engine received profuse publicity. It was only a matter of time before such a machine was manufactured.
Inspired by Babbage, Swedish lawyer and inventor Pehr Georg Scheutz and his son Edvard managed to make their own mechanical calculator, which they called the Tabulating Machine. Several difference engines were constructed in Sweden, Austria, the US, and England. The British Registrar General, responsible for vital statistics, had a copy of Scheutz’s tabulator built in the late 1850s. While the machine made the work somewhat easier, it often malfunctioned and required constant maintenance.
In 1884, American inventor Herman Hollerith filed patents for an electromechanical system that stored and tallied the perforations on punched cards. The cards were punched to represent particular statistics of any sort, whether census or inventory data.
Hollerith powered his device with batteries. His equipment was smaller, faster, simpler, and more reliable than the mechanical machines that preceded it.
The US Census Bureau was impressed with Hollerith’s work, but it first demanded a test against 2 other manual systems designed by Bureau employees. Hollerith’s system emerged triumphant, completing the job 8 to 10 times faster. The agency ordered 56 tabulators and sorters for the 1890 census, paying $750,000 in rental fees (the machines were rented, not purchased).
The 1880 census took 9 years to tally, at a cost of $5.8 million. The 1890 census took less than 7 years, but the tab was $11.5 million.
The benefits of automation seemed uncertain. But the comparison between censuses was somewhat apples-and-oranges.
The 1890 census was far more comprehensive. Indeed, the Census Bureau estimated that it had actually saved $5 million in labor costs.
Hollerith’s equipment did have hidden costs. With the machines used to the hilt in the 1890 census, the cost of the electric power to run them was considerable.
Nonetheless, Hollerith’s system was welcomed all over the world. By the early 1900s, the company couldn’t keep up with demand. In 1911, Hollerith’s firm merged with 3 others to become the Computing-Tabulating-Recording Company, which evolved into International Business Machines (IBM).
To facilitate the tally of the 1890 census, Hollerith devised a typewriter-like counter. This was one of many mechanical aids to calculating and writing to be constructed.
In 1647, English economist William Petty patented a copying machine in the form of 2 pens for double writing. What followed in the next 2 centuries were innumerable clumsy attempts in a similar direction.
The typewriter was first patented by English inventor Henry Mill in 1714. Others soon followed in a variety of configurations.
The 1st semi-practical typewriter was conceived in 1866 by American inventor Christopher Sholes. Technical difficulties delayed its manufacture until 1871.
The typewriter’s initial principal defect was that the type-bars had no return spring, making for a tediously slow mechanism. Sholes fixed this with springs.
Initially, Sholes’ typewriter keys were arranged alphabetically. But then Sholes came upon the idea of arranging the keys according to which combinations of letters were used most frequently. After some study, the familiar QWERTY keyboard was born.
The Sholes typewriter wrote only capital letters. Because the letters were struck on top of the platen, they could not be read while typing.
The Remington company – manufacturer of guns, sewing machines, and agricultural equipment – took an interest in Sholes’ typewriter, despite skepticism by one of its directors who could see no reason for machines to replace people for work which they already did well.
Starting in 1873, the Remington Model I was the 1st typewriter marketed in the United States. It followed on the heels of a writing-ball typewriter by Danish inventor Rasmus Malling-Hansen, which had been sold in Europe beginning in 1870.
The 1878 Remington model offered lowercase letters. Finally, in 1887, the type-bars got mounted so that the text could be read while it was being typed. Hence the modern typewriter emerged.
American inventor David Parmalee constructed the 1st arithmetic calculator with a keyboard in 1849, receiving a patent for it the next year. It was an adding machine limited to single-digit numbers. Results had to be written down, and larger numbers added separately.
Parmalee’s invention deserved the skepticism about machines replacing work that people already did better without them, as did the ones that followed it. Numerous inadequate improvements appeared in succeeding decades. The mechanical adders continued to require much preliminary manipulation and conscientious attention on the part of the operator.
What was worse was that the machines offered no numerical capacity of consequence and were tediously slow. Further, they were fragile in use, and so gave wrong answers if not treated with delicacy and dexterity.
The earliest usable adding machine was the Comptometer, by American inventor and industrialist Dorr Felt. Short on funds, Felt constructed the prototype in a macaroni box in 1885.
The Comptometer was manufactured without interruption from 1887 to the mid-1970s, with continual improvements in speed and reliability. It went electromechanical in the 1930s. In 1961, the Comptometer became the 1st mechanical calculator to have an all-electric calculating engine, making it the evolutionary link between mechanical and electronic calculators.
Of course, Felt was not the only one with an adding machine on the market. As a bank clerk, William Burroughs knew too well the inadequacies of those then-available. After working in his father’s model-making and casting shop, where he met many inventors, Burroughs began tinkering.
In 1884, Burroughs developed an adding machine with a keyboard, driven by a handle. With backing from local merchants, he rushed his machine into production in 1887. Burroughs’ haste proved an expensive mistake: the machines did not stand up to everyday use. Furious with himself, he walked into the stockroom one day and pitched them out the window, 1 by 1.
In 1892, Burroughs patented another calculator, this one with a built-in printer. It proved a winner, far outselling all others on the market. But Burroughs did not live to enjoy his success. Suffering from poor health, he died in 1898, at age 41.
Early calculators found homes in banks, company accounting departments, and universities. By the 1920s, electric ones were available: just push the buttons and results were printed out on neat rolls of paper.
While decent at basic arithmetic, these calculators were no good at more complicated math. To up the mathematical ante, many Rube Goldberg contraptions were devised by engineers, but they were cumbersome and expensive.
Differential equations were widely used in scientific and engineering circles. Dealing with derivatives, these equations were well beyond the ken of the tallying machines to date.
Differential equations can be attacked numerically, by approximating derivatives with small finite steps, or graphically, with curves on paper substituting for numbers. Beginning in 1814, all manner of clever analog gadgets were devised to work these equations, each targeted to specific applications. Scales and slide rules are exemplary analog instruments, as contrasted to the digital devices exemplified by Napier’s bones, mechanical calculators, and electronic computers.
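As a modern illustration of the numerical route (not how the analog gadgets themselves worked), Euler’s method steps a differential equation forward by replacing the derivative with small finite increments; the equation below is an arbitrary example:

```python
import math

# Solve dy/dx = -2*y with y(0) = 1 by Euler's method; the exact answer is exp(-2x).
y, h = 1.0, 0.01          # initial value and step size
for _ in range(100):      # 100 steps of size 0.01 covers x = 0 .. 1
    y += h * (-2.0 * y)   # advance using the local slope

print(y)                  # ~0.1326, the numerical estimate at x = 1
print(math.exp(-2.0))     # ~0.1353, the exact value, for comparison
```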
Lord Kelvin realized that these special-purpose analog devices were the conceptual seeds of much more powerful machines. He outlined a generalized “differential analyzer” in an 1876 paper. But the technology of the day was not up to the job. It was not until 1930 that a differential analyzer was built, and then by American engineer and MIT professor Vannevar Bush, who claimed not to have been familiar with Kelvin’s paper.
Kelvin himself was not all theory. In 1873 he built a hand-cranked mechanical tide predictor that could calculate the time and height of ebb and flood tides for any day of the year, printing its predictions on rolls of paper; an especially handy device for an island nation like Britain.
Bush’s differential analyzer was inelegant but it worked: generating solutions off by no more than 2%, which is about the best that could be expected from an analog calculator. The machine became quite influential. Copies were made in several countries. Bush went on to build a larger, faster electromechanical version using vacuum tubes in the early 1940s.
But Bush and his MIT colleagues were barking up the wrong tree: analog devices were inherently ill-suited to accurate, versatile computing. Although special-purpose analog calculators continued to be built, the future was digital.
Spurred by the success of the differential analyzer, by the mid-1930s a handful of engineers and scientists in the US, England, and Germany gave serious thought to the mathematical potential of machines. Although they occasionally wrote articles or spoke at conferences, these men worked in relative isolation.
Babbage’s flexible Analytical Engine was all but forgotten except in Britain, and even there its underlying principles had to be rediscovered. The first to do so was young German civil engineer and inventor Konrad Zuse.
Zuse hated the mathematical drudgery of his chosen profession and did not relish the prospect of a career hunched over a desk twiddling equations. He also had the good sense to realize that another mechanical calculator with endless gears wasn’t the answer.
After carefully considering the problems with mechanical calculation, Zuse made 3 conceptual decisions that put him on the right track. 1st, he decided that the only effective solution was a universal calculator, which meant the same flexibility that Babbage had envisioned.
2nd, Zuse decided to use binary notation: an unobvious inspiration that ensured his success. The irreducible economy of binary meant that the calculator’s components could be as simple as on/off switches.
Larger-base number systems, such as decimal, could be had by ganging binary bits together to form words. An 8-bit word can represent 256 distinct values (0–255).
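A small sketch of how ganging bits into a word yields larger numbers (the 8-bit word below is an arbitrary example):

```python
# An 8-bit word: 2**8 = 256 distinct values, 0 through 255.
bits = [1, 0, 1, 1, 0, 1, 0, 1]    # most significant bit first

value = 0
for bit in bits:
    value = value * 2 + bit        # each additional bit doubles the range

print(value)                 # 181
print(format(value, '08b'))  # '10110101', the same word written back out
print(2 ** len(bits))        # 256 possible values for an 8-bit word
```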
Decimal had long been regarded as a God-given sine qua non until Zuse and a few other contemporaries independently questioned the axiom. Even Babbage had left the decimal assumption unquestioned; but then, the gears of the Analytical Engine were ideally suited to the decimal system.
3rd, Zuse devised a simple set of operating rules to govern his hypothetical machine’s internal operations. Although he did not realize it at the time, Zuse’s rule set was a restatement of Boolean algebra, named after English mathematician George Boole, who developed Boolean logic in papers published in 1847 and 1854.
Before Boole’s publications, formal logic was a sleepy discipline, with little to show for thousands of years of cogitation. Its most powerful analytical tool was the deductive syllogism, which Aristotle had sussed.
Most logicians criticized or simply ignored Boole’s work; but mathematicians were interested. Babbage called Boole’s 1854 paper “the work of a real thinker.” In 1910, English logicians Alfred North Whitehead and Bertrand Russell extended Boolean algebra into the formidable intellectual system of symbolic logic.
Binary logic is relatively simple, and easily realized in electrical circuits. Bits may be operated upon via and, or, and not, which are the most basic of many Boolean algebra operations.
The tables below show how these Boolean operators work, along with the contemporary symbols for the operations (∧ for and, ∨ for or, ¬ for not).

and (∧): 0 ∧ 0 = 0   0 ∧ 1 = 0   1 ∧ 0 = 0   1 ∧ 1 = 1
or (∨): 0 ∨ 0 = 0   0 ∨ 1 = 1   1 ∨ 0 = 1   1 ∨ 1 = 1
not (¬): ¬0 = 1   ¬1 = 0
Zuse graduated with a civil engineering degree in 1935. That same year he began working on the Z1, his 1st computer. Zuse finished the Z1 in 1938. It had ~30,000 metal parts, and never worked well owing to poor mechanical precision.
Zuse was given the resources by the German military to build the Z2, which he completed in 1940. The Z3 followed in May 1941. It was the first fully operational electromechanical computer.
Confident that the war would be won within 2–3 years, the military had refused to fund the Z3. Instead, an aerodynamics research institute indirectly funded the project.
No one was interested in a general-purpose computer. But the institute was interested in Zuse solving a problem regarding airplane wing flutter; thus Zuse got the resources he needed.
The Z3 could perform mathematics rather quickly, including finding square roots; but it could not execute conditional jumps, which are fundamental to logic processing. None of Zuse’s machines could jump, as the idea never occurred to him.
Zuse built a faster and more powerful Z4 toward the end of the war. It was discovered by the Allies when they invaded Germany, but Zuse and his computer were not deemed a security risk, so he was allowed to go his way.
In 1950, the Z4 was installed in a Zurich technical institute: the only mechanical calculator of consequence in continental Europe for many years.
The core component of Zuse’s calculators had been telephone relays. A relay is an on/off electromechanical switch, consisting of an electromagnet that closes an electric circuit when power is applied.
Zuse knew of vacuum tubes, which could switch on and off thousands of times a second; but they were hard to come by in Germany, and expensive, so Zuse stuck with relays.
A vacuum tube (aka electron tube or just tube) is a device that controls electric current between electrodes in an evacuated (airless) chamber. British electrical engineer and physicist John Ambrose Fleming invented the vacuum tube in 1904. American inventor Lee de Forest extended Fleming’s idea in 1906 to create a tube called a triode: 3 electrodes inside a tube. The triode’s ability to act as an amplification device was not discovered until 1912.
The triode revolutionized electrical technology, creating the field of electronics. It allowed transmission of sound via amplitude modulation (AM), replacing weak crystal radios, which had to be listened to with earphones.
Tubes made transcontinental telephone service possible (1915). Radio broadcasting began in 1920 in the wake of triode technology. Triodes were the key to public address systems, electric phonographs, talking motion pictures, and television.
Putting 2 vacuum tubes together, British physicists William Eccles and F.W. Jordan invented the flip-flop circuit in 1918. Coupled with a clock, flip-flops were the means to implement Boolean logic in a synchronous fashion. This was the essence of the computer hardware that evolved with tubes.
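A minimal software sketch of the idea, assuming a D-type (data) flip-flop rather than Eccles and Jordan’s original two-tube circuit: on each clock tick the stored bit latches the input, and it holds that value until the next tick.

```python
# A clocked flip-flop modeled in software: one bit of memory, updated per tick.
class DFlipFlop:
    def __init__(self):
        self.state = 0   # the single stored bit

    def tick(self, d):
        """On a clock tick, latch the input bit; the value holds until the next tick."""
        self.state = d & 1
        return self.state

ff = DFlipFlop()
ff.tick(1)
print(ff.state)  # 1 -- held between ticks
ff.tick(0)
print(ff.state)  # 0
```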
Contemporaneous with Zuse were 3 American computing projects. The first started in 1937, when American mathematical physicist George Stibitz, a Bell Labs researcher, tinkered with telephone relays and came up with an electromechanical adder in his kitchen.
In 1938, Claude Shannon, a student at MIT, published a thesis on implementing symbolic logic via relay circuits. Upon graduation, Shannon went to work at Bell Labs.
Bell was a telephone company, and its management saw no future in computation machines. Experiments in calculators by its researchers went nowhere.
American physicist Howard Aiken wrote a proposal for an electromechanical calculator in 1937. His motivations were similar to Zuse’s, but Aiken did not have as much savvy, notably in comprehending the advantage of using a binary system.
After being turned down by one calculating machine company, Aiken got the ear of IBM president Thomas Watson Sr. through a personal reference by a Harvard professor. Aiken taught math at Harvard at the time.
Watson was interested enough to take Aiken’s ideas and run with them, though he doubted there was much of a future for scientific computing equipment, which is how he viewed the project.
I think there is a world market for maybe 5 computers. ~ Thomas Watson Sr. in 1943
The result – the Mark I – was an electromechanical calculator working in decimal. A technological dead end, the Mark I was born a one-off dinosaur at the enormous cost of $500,000.
Once the 2nd World War started, the US military wanted ballistic firing tables so that gunners might properly aim their weapons. The work began with mostly women college graduates employed as human calculators.
In 1942, American physicist John Mauchly, a teacher at the University of Pennsylvania’s Moore School of Electrical Engineering, wrote a paper about using vacuum tubes to build an “electronic computer.” The paper was poorly organized, conceptually unsophisticated, and badly written. It consisted mostly of wild speculation about the hypothetical calculation speed of such a device.
Mauchly’s paper was sent to the US ordnance department as a proposal, where it was ignored. But the trajectory-calculation backlog became so bad in 1943 that the military decided to gamble on the Moore School to build an electronic calculator.
200% over budget and too late to help with the war effort – 3 months after the Japanese had surrendered, ending WW2 – the ENIAC was operational. The beast was nearly 3 meters high, 24 meters long, and weighed 27 tonnes.
Operating ENIAC was tedious in the extreme. Setting up a problem took many hours of configuring thousands of switches and cables. ENIAC could not store programs or remember more than 20 10-digit numbers, so problems had to be solved in stages. Further, the machine required constant maintenance, as vacuum tubes frequently blew.
Having experimented with nuclear death during WW2, at the cost of 2 Japanese cities, the US military became obsessed with atomic bombs as the cold war began heating up. Getting a gauge on how to destroy cities by crushing atoms required serious number-crunching.
The military called on the Moore School for the next generation of calculator. This go-round the team had the wits to go with binary rather than decimal, thanks to a new consultant: esteemed Hungarian American mathematician John von Neumann. The Moore School would never have gotten the contract had it not been for von Neumann, as the ENIAC had been a fiasco. Von Neumann lent a badly needed air of legitimacy.
As it was, a contract to build a new computer was inked in April 1946 at an initial budget of $100,000. The machine – EDVAC – was not completed until 1952, at a cost just under $500,000.
EDVAC weighed 7.85 tonnes and covered 45.5 m² of floor space. Staffed by 30 people at a time, the computer worked reliably.
Computers in the future may weigh no more than one-and-a-half tonnes. ~ Popular Mechanics in 1949
Though others contributed, von Neumann was the chief architect of how the machine was to be structured. Besides using binary, the design included a central processor operating serially using random-access memory. This came to be known as von Neumann architecture, and would dominate how computers were built from then on.
ENIAC had operated on all the digits in a word in parallel, which is faster but more difficult to build; hence, von Neumann suggested serial processing. Meanwhile, random-access memory allowed data and programs to be stored and retrieved directly (rather than sequentially). This considerably enhanced performance.
The Brits were not idle. English scientists were invited to see ENIAC as the war drew to a close.
The British government bungled its first attempt to build a computer, thanks to bureaucratic indecision and miscomprehension.
Victoria University of Manchester began work on a computer in August 1948 and had its first working version by April 1949. The Manchester Mark I was the 1st fully electronic computer that could store a program in memory.
Computers proliferated in the 1950s. Most projects ran over budget and came in late.
Unsurprisingly, given their cost, a handful of business machine corporations held the market for the behemoth computers at the time. One came to dominate them all: IBM. But it was the phone company that came up with the innovation that would revolutionize computer technology.
Besides being expensive, the tubes that comprised early computers were fragile, cumbersome, gluttons for electricity, and gave off a lot of heat that required dissipation. They were, in short, a severe constraint on computing capacity.
The rise of quantum mechanics in the 1920s gave scientists a theoretical tool for the tremendously tiny. Semiconductors were given a look, but the research did not get very far.
Semiconductors are highly susceptible to contamination, which alters their electrical characteristics. It was not until the early 1930s that pure semiconductor substrates – silicon and germanium – were available. Even then, scientists remained baffled by the behavior of semiconductors, especially their ability to convert light into electrical power.
In July 1945, Bell Labs decided to get serious about researching the potential of solid-state physics. The phone company was a huge consumer of tubes and mechanical relays, which were not entirely reliable and a maintenance headache.
Bell’s research team was headed by American physicist William Shockley Jr. The team made good progress. In 1947, they invented the transistor, which became the keystone in semiconductor-based computing.
The 1st commercial transistorized computer appeared a decade later, in 1957. Compared to tubed predecessors, the new generation was superior in every way: smaller, faster, more reliable, powerful, and economical.
The next step was miniaturization. English electronics engineer Geoffrey Dummer was first to conceive of integrated circuits.
With the advent of the transistor and the work on semiconductors generally, it now seems possible to envisage electronic equipment in a solid block with no connecting wires. The block may consist of layers of insulating, conducting, rectifying, and amplifying materials, the electronic functions being connected directly by cutting out areas of the various layers. ~ Geoffrey Dummer in May 1952
Dummer could not drum up support for his vision from the British government; but the US Defense Department was very interested. It had spent gobs of money in that direction but came up empty-handed.
Working for private corporations, American electrical engineer Jack Kilby and American physicist Robert Noyce independently created the first integrated circuits (ICs) in 1958 and 1959, respectively. These early ICs contained only a few transistors.
Yeah, microchips,… but what are they good for? ~ IBM senior engineer in 1968
From there the race was on to pack more circuits in and get the cost down. It was not until the mid-1960s that ICs became reasonably priced; only then did they begin to appear in commercial computers. By 2007 an IC might contain over a billion transistors.
IBM faced a dilemma about the same time as the IC was making its debut. The company had a burgeoning product line, filled with incompatible machines: a program written on one IBM computer would not work on another. Nor were hardware peripherals, such as printers, compatible.
After extensive internal debate, the decision was to construct a comprehensive family of compatible computers. The plan was sound from a strategic perspective, but risky in every way. It rendered IBM’s extant product line obsolete. It involved betting on ICs at a time when they were unproven. It meant an enormous effort at tremendous expense. And it would let other companies introduce compatible equipment and software, thus cutting into IBM’s sales.
It was roughly as though General Motors had decided to scrap its existing makes and models and offer in their place one new line of cars, covering the entire spectrum of demand, with a radically redesigned engine and exotic fuel. ~ Fortune magazine in September 1966
4 years and over $5 billion later, IBM introduced the System/360. The venture was an enormous success, propelling IBM to even greater dominance. So much so that the US Department of Justice (DOJ) sued IBM in 1969 for being an illegal monopoly. The suit dragged on for 13 years before the DOJ dropped it, concluding that its accusation was “without merit.”
From the 1950s, innumerable computer companies went in and out of business. A quarter century later, the identical dynamic happened to software companies.