





7
The Revolution?



The Revolution is over.
-Sarah Turing, quoting her son Alan M. Turing, c. 1950

DIRECTIONS FOR USING:
For THINKING:-Wind the Clock-work Man under his left arm, (marked No. 1.)
For SPEAKING:-Wind the Clock-work Man under his right arm, (marked No. 2.)
For WALKING and ACTION:-Wind Clock-work in the middle of his back, (marked No. 3.)

-from L. Frank Baum, Ozma of Oz (1907)

Someone could have built a working computer long before 1935; Babbage almost did. The components (relays, vacuum tubes, teletypes, etc.) had all been around for quite a while. But the burst of activity after 1935 shows that only then were social and economic conditions really favorable for the computer's invention. The demands of business, government, and science had built up to the point where it is not surprising that the idea of an automatic computer occurred simultaneously to a number of inventors at that time. The Second World War was the catalyst that brought those demands together, while providing the necessary technical and (at least in America) financial support. There would have been computers had the war never taken place, but those computers would not look the way they do today, nor is it likely that their impact would have become as revolutionary as it is. From the writings of the computer pioneers in the late 1930's one sees a modest vision: a few computers in the universities, some in government, modest-sized computers for industrial use, but not much more. They hardly foresaw the giant scale of the first computers, nor did they see the impact of their creation on everyday life. In America today, large and complex computer systems are an important part of industrial production, payroll and accounting, taxation, telephone and television service, transportation, and military defense.

Yet the computer's impact, like that of any pervasive technology, cannot be easily judged. Certainly life has changed for the roomsful of clerks who labored over adding machines, as it has for the scientist who no longer has to struggle with matrix calculations done by hand. But that is really just a small segment of society that has felt the computer's impact. One easily forgets that the pioneers of computing did not envision their machines doing much more.

Nowhere is that better illustrated than by the frequent statements made in the early days that only four or five computers of the speed and power of the ENIAC would be needed to satisfy all the computing requirements of the United States. (It was felt that three would satisfy England's requirements-with perhaps one more for the Scots!) It is hard to pin down just who made those estimates, but such a vision of the future was common in the early days.1 Obviously they were wrong: they looked at the computer only in terms of the human beings it was replacing, just as the automobile was first called the "horseless carriage," and the radio the "wireless." In terms of what the computer replaced they were correct: five computers like the ENIAC could do as much work as 400,000 human beings equipped with desk calculators, considering their respective multiplication speeds and the fact that a computer can work around the clock.
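
That equivalence is easy to check with a back-of-envelope calculation. The Python sketch below uses illustrative figures that are assumptions, not measurements from the text: roughly 357 multiplications per second for the ENIAC, and about one multiplication every 75 seconds (keying, cranking, and copying the result included) for a clerk working an eight-hour shift.

    # Back-of-envelope check of the "five ENIACs = 400,000 clerks" claim.
    # All figures below are assumptions for illustration only.

    ENIAC_MULTS_PER_SEC = 357        # assumed machine speed
    ENIAC_HOURS_PER_DAY = 24         # the machine can run around the clock
    CLERK_SECONDS_PER_MULT = 75      # assumed human pace, incl. transcription
    CLERK_HOURS_PER_DAY = 8          # one working shift

    machine_daily = 5 * ENIAC_MULTS_PER_SEC * ENIAC_HOURS_PER_DAY * 3600
    clerk_daily = CLERK_HOURS_PER_DAY * 3600 / CLERK_SECONDS_PER_MULT

    print(f"five ENIACs:       {machine_daily:,.0f} multiplications/day")
    print(f"one clerk:         {clerk_daily:,.0f} multiplications/day")
    print(f"equivalent clerks: {machine_daily / clerk_daily:,.0f}")

With those assumed figures the ratio comes out to about 400,000, which is the order of magnitude the early estimates had in mind.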

What they did not foresee was that the computer would soon be doing work of a very different kind, work that no human beings could do. With the adoption of the stored-program principle, it became possible to program computers to sort and otherwise keep track of vast quantities of information, a task that human beings simply cannot manage above a small threshold, and for which manual aids like file cards are likewise inadequate. The other unforeseen development was the adoption of high-level languages in lieu of arcane binary codes to get a computer to do its work. That innovation, also a consequence of the stored-program principle, meant that a trained mathematician does not have to accompany every computer to its every job. And finally there has been the revolution in electronics technology that reduced the size and electric power requirements of the computer; who could have foreseen that?

Some of the skewed perception of the computer's potential was due to the military's influence. The American armed forces supported computing projects because they faced problems so complex (for instance, firing tables) that existing means for their solution were felt to be totally inadequate. The immediate onset of the Cold War in 1945, especially with its focus on nuclear weapons, continued that atmosphere. As late as 1950 only one large-scale computing machine (the IBM SSEC) was not under military control in America.

Among other things, that meant that computers would be geared toward number processing, and that they would tend to be as large and fast as technically feasible. If anyone had an idea of building a modest-sized computer for small businesses, for libraries, or even for game-playing, he or she would not have found much support. But today the most exciting part of the computer business is at the small end-in computers for small businesses and the home. (Konrad Zuse was alone among the computer pioneers in wanting to sell small computers at first, slowly graduating to bigger machines as one gained more experience. It is no coincidence that he was working in postwar Germany, where he did not have the financial support of the military that John von Neumann had in America.)2

But has there really been a computer "revolution" after all? Some have argued that its invention has not fundamentally altered social or economic patterns of American life; if anything, the computer has made it easier for the patterns existing before 1945 to persist.3 But "revolution" is an appropriate word quite apart from the question of the computer's impact on society. Its invention has triggered a revolution in thinking, in our understanding of our place in the universe, much as Copernicus's On the Revolution of the Heavenly Orbs did after A.D. 1600.

For one thing, it changed the whole concept of a "machine." By the classical definition, a machine is an assemblage of stiff elements, so arranged as to perform one specific motion. Any other motions are defined as "play," of which a good machine has little.4 By that definition a computer is hardly a machine, since it is certainly not restricted to one specific "motion." A computer is a universal machine; it can do "anything"-that is, whatever we can program it to do. And we continue to be startled at the ingenuity of human beings in dreaming up new applications for the computer. Once programmed, it begins to look more like a classical machine, doing one specific thing; but whatever that thing is can neither be specified nor foreseen by the computer's manufacturer as it is sent out the factory door. That quality is a natural consequence of the fact that even when programmed, the actions of a computer cannot be determined in advance.

Computer pioneers used a biological analogy in sketching out their thoughts on building their machinery-for example, they called the storage unit a "memory," the programming code a "language." Von Neumann even based his description of the EDVAC on a neural notation, referring to the computer's various "organs." Those terms have probably caused more confusion than anything else. But they have won out over more prosaic terms like store or programming plan.

But the analogy works the other way, too: the structure and function of the computer provide human beings with a powerful model to aid our understanding of human thought. Mechanical analogies are certainly nothing new (Thomas Hobbes's introduction to his Leviathan being a famous example), but the analogy of the computer to the brain is much more powerful precisely because the computer is not a simple machine in the classical sense. The rich and complex behavior of a computer, set in motion by its executing an internally stored program of modest length, is an appropriate model for biological systems, in which an organism's characteristics represent an "expression" of information coded in its DNA. Modern computers contain read-only memories (ROMs) which, like an organism's DNA, cannot be altered. They also contain random-access memories (RAMs) whose contents can be changed at will; these information stores also have their biological counterparts in those parts of the brain that can remember experiences and alter their future behavior based on what they have learned.

All that reasoning by analogy has its limits, of course: the brain is not a computer, nor is human or animal behavior the expression of "programs." But they have this much in common: both are complex systems, and mathematical theories of such systems have proven useful to our understanding of both.5

Some of those ideas were on the minds of the builders of the first computing machines as they struggled with their arrangements of levers, pins, relays, tubes, wires, and paper tapes. The gulf between "reckoning" and "thinking" could at times appear narrow enough to suggest that one day an arithmetic machine might become a thinking machine. But on the whole their story has been one of perseverance, attention to detail, and just plain hard work. The pioneers had a vision of creating something new and perhaps revolutionary. Precisely because they were facing unknown areas of inquiry and experience, what they said and did in response can be useful to us today, for it was remarkably free from the narrowness of vision that today's "mature" science of computing suffers from. Yes, we should understand computers and how they affect our lives today. If for no other reason, it might be a good way to help us understand ourselves.

  1. John Wells, "The Origins of the Computer Industry: A Case Study in Radical Technological Change," Dissertation, Yale University, 1978, p. 96; Robert Noyce, "Microelectronics," Scientific American, 237 (September, 1977), 64; Edmund Berkeley, "Sense and Nonsense about Computers and their Applications," World Computer Pioneer Conference, Proceedings (Llandudno, Wales, 1970), p. 2/5; and most recently Philip J. Davis and Reuben Hersh, The Mathematical Experience (New York: Houghton Mifflin, 1981), p. 14. The estimate for England and Scotland is quoted in Simon Lavington, Early British Computers (Bedford, Mass.: Digital Press, 1980), p. 104.
  2. "The Thinking Machine," Time, January 23, 1950, cover and pp. 54-60.
  3. Joseph Weizenbaum, Computer Power and Human Reason (San Francisco: Freeman, 1976).
  4. Abbott Payson Usher, A History of Mechanical Inventions, revised edition (Cambridge, Mass.: Harvard, 1954), pp. 116-118.
  5. See, for example, Douglas R. Hofstadter, Gödel, Escher, Bach: An Eternal Golden Braid (New York: Random House, 1979); also John Holland, Adaptation in Natural and Artificial Systems: An Introductory Analysis with Application to Biology, Control, and Artificial Intelligence (Ann Arbor: University of Michigan Press, 1975).






GLOSSARY A


Translations and Equivalents of German Terms


The terminology used throughout computer science today comes from the English language. Those terms gradually assumed their meanings after 1950, when commercial installations of computers began. Inasmuch as Konrad Zuse was working independently and without much contact with Anglo-American computing activities before and after the Second World War, he developed an independent terminology, which is nonetheless equivalent to the more familiar words in use today. The following table lists first Zuse's term plus any related German terms, followed by a literal English translation, and finally the modern English term I feel comes closest to describing what Zuse intended. I caution the reader, however, that the modern terms are not exact equivalents, but are intended only to suggest a similar concept.


Zuse's Term                                   Literal Translation             Modern Equivalent

Bedingungskombinatorik (Aussagenkalkül)       conditional combinatoric        predicate calculus, symbolic logic
"Ergibt"; =>                                  results in, produces            assignment, "let"
halblogarithmische Form (gleitendes Komma)    semi-logarithmic form           floating point, scientific notation
Ja-Nein-Wert (Dual-Ziffer)                    yes-no value                    binary digit, bit
Kalkül                                        calculus                        language
Leitwerk                                      control unit                    control
Logistik (Informatik)                         logistic                        computer science
Plan (Rechenplan)                             plan                            program, algorithm
Planfertigung                                 plan preparation                compiler, interpreter
Plankalkül                                    plan calculus                   programming language
Rechenanlage                                  calculating installation        computer
Rechenmaschine                                calculating machine             calculator
Rechenwerk                                    calculating unit                central processor
Rechnen                                       reckon, figure                  compute
Sekundal (Dualsystem)                         secondary                       binary system
Speicherwerk                                  storage unit                    memory unit
Vorschrift                                    prescription                    algorithm
Wahlwerke, Wahlpyramide                       choice units, choice pyramid    decision tree, memory addressing
Zelle                                         cell                            word, address, register




GLOSSARY B


Technical Terms Used in the Text

The following glossary gives a brief description of some of the technical terms discussed in the text. During the early history of computing, few of these terms had a fixed meaning. Some, like computer itself, changed their meanings several times from 1935 to the present. Others, like accumulator, had well-defined meanings, but various individuals interpreted them differently. In all cases I have attempted to define each term as it might have been defined in the period from 1935 to 1945. Where that definition is at variance with the modern one, I give the modern one too.


Accumulator. A mechanical or electrical device which is capable of both storing a number and adding another number to it. Contrast with register (q.v.), which can store but not add. The main difference between the two is that an accumulator's digit elements are linked to one another so that a carry may take place. See also counter.

Addressing. The method by which a computer stores and retrieves data from its memory cells. Typically each cell is given a numerical address, and appropriate circuits direct the computer to connect a cell with a given address to its central arithmetic unit.
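
As an illustration only (no particular machine's addressing scheme is implied), here is a minimal Python sketch of numerical addressing: memory as an array of cells, with the cell index playing the role of the address.

    memory = [0] * 64               # 64 cells, addressed 0 through 63

    def store(address, value):
        """Route a value into the cell bearing this address."""
        memory[address] = value

    def load(address):
        """Retrieve the contents of the cell bearing this address."""
        return memory[address]

    store(12, 354)                  # place a number in cell 12
    store(13, 67)                   # and another in cell 13
    print(load(12) + load(13))      # the arithmetic unit combines them: 421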

Algorithm. One of the most fundamental concepts in computing. "A finite set of rules which gives a sequence of operations for solving a specific type of problem," according to Donald E. Knuth (The Art of Computer Programming, vol. 1 [Reading, Mass., 1969], p. 4). A precise and unambiguous recipe which is sure to lead to an answer to a problem.
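
Knuth's own opening example is Euclid's algorithm for the greatest common divisor; here is a minimal Python rendering of it. The notation is modern, but the finite, unambiguous rules guaranteed to reach an answer are exactly what the definition demands.

    def gcd(m, n):
        """Euclid's algorithm: repeatedly replace (m, n) by (n, m mod n)."""
        while n != 0:            # the rules terminate when the remainder is 0
            m, n = n, m % n
        return m

    print(gcd(544, 119))         # 17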

Arithmetic. The four fundamental operations of addition, subtraction, multiplication, and division. Sometimes the operation of taking the square root is included. The arithmetic unit of a computer is usually fitted with the capability of performing only those operations. All other "computing" must be built up of combinations of them, plus storing and retrieving numbers from the memory.
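
A brief sketch of what "building up from the four operations" means in practice: the square root, where the hardware offers none, obtained by Newton's iteration using nothing but addition and division. The starting guess and iteration count here are arbitrary choices for illustration.

    def square_root(n, iterations=20):
        """Approximate the square root using only the four basic operations."""
        x = n if n > 1 else 1.0          # crude but serviceable starting guess
        for _ in range(iterations):
            x = (x + n / x) / 2          # one division, one addition, one halving
        return x

    print(square_root(2.0))              # 1.4142135623...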

Binary. Any representation of numbers or logical values by a system in which only two possible states are allowed. A simple switch, a two-position lever, a relay, or a flip-flop are all binary devices. Binary numbers are usually written as combinations of the two binary digits 1 and 0; binary logic is usually expressed as combinations of true and false values. Konrad Zuse used the letters L and O for both, and in a computer it is common to mix both logical and numerical values.
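
A small sketch of the two-state idea: the same routine writes a number in binary using either the familiar digits 1 and 0 or Zuse's letters L and O. The routine itself is illustrative, not taken from any of the machines described in the text.

    def to_binary(n, symbols="10"):
        """Render a nonnegative integer with any pair of two-state symbols."""
        one, zero = symbols[0], symbols[1]
        digits = ""
        while n > 0:
            digits = (one if n % 2 else zero) + digits
            n //= 2
        return digits or zero

    print(to_binary(13))                 # 1101
    print(to_binary(13, "LO"))           # LLOL, in Zuse's notation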



Calculator. A machine which manipulates primarily numerical data. Before 1935: usually a desk-sized machine that performed the four operations of arithmetic. 1935-1945: sometimes used synonymously with computer. Today: a computing device which operates in the (binary coded) decimal system and which is optimized for numerical calculations. If such a device is programmable, the program is usually stored in a memory kept separate from the data memory.

Calculus. Latin word for pebble-a stone used as an aid to counting. Modern colloquial meaning refers to the method of differentiating and integrating functions, as simultaneously discovered by Isaac Newton and G. W. Leibniz in the 17th century. Its general meaning (as, say, Konrad Zuse used it) is the expression of mathematical concepts by a system of well-defined symbols.

Compiler. A computer program whose output is another program-one which can then be directly executed by the circuits of the machine. Compilers allow one to write programs using a richer and more familiar notation than long strings of binary digits.

Complement. The complement of a number is the difference between that number and a one followed by as many zeros as the number has digits. In the decimal system, the complement of 354 is 1,000 - 354, or 646. (Binary numbers have corresponding complements as well.) Computers usually subtract by adding the complement of a number; for example, 421 - 354 is computed as 421 + 646, or 1,067. If the machine's registers have only three places, the "1" at the left drops out, giving the true difference, 67. A variation of this scheme is to take the difference between each digit and 9, giving the so-called nines complement instead of the "tens complement" just described. That has the advantage of not requiring any borrowing of digits to form the complement, but a 1 must be added (the so-called end-around carry) to give the correct answer for subtraction.
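
The entry's worked example, rendered as a short Python sketch with three-digit registers assumed:

    DIGITS = 3
    MOD = 10 ** DIGITS                       # 1,000 for a three-place register

    def subtract_tens_complement(a, b):
        total = a + (MOD - b)                # 421 + 646 = 1,067
        return total % MOD                   # the leading "1" drops out: 67

    def subtract_nines_complement(a, b):
        nines = (MOD - 1) - b                # 999 - 354 = 645; no borrowing needed
        total = a + nines                    # 421 + 645 = 1,066
        return total % MOD + total // MOD    # end-around carry: 66 + 1 = 67

    print(subtract_tens_complement(421, 354))    # 67
    print(subtract_nines_complement(421, 354))   # 67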

Computer. Before 1945: a person who did calculations. After 1945: a machine capable of the four operations of arithmetic, automatic storage and retrieval of intermediate results, and automatic input and output, all directed by a control unit. The modern definition is a machine which can manipulate symbolic information in any combination or way one desires, and which contains an internally stored program, which the machine may also manipulate if desired.

Control. That part of a computer that routes numbers to and from other sections of the machine: memory, input/output, arithmetic. Usually directed by a program.

Counter. A physical device that represents a number and which can also add quantities, but only by one unit at a time. If an accumulator contains the number 354, one can change its contents to 454 simply by adding the digit "1" to the third decimal place. (Old, full-keyboard adding machines had this ability.) But for a counter to go from representing 354 to 454, it would have to go through all the states in between: 354, 355, 356, . . . , 452, 453, 454. An automobile odometer is a counter: it counts the miles one at a time (unless someone tampers with it). Modern computers have at least one counter that steps through the program stored in successive memory locations. It normally increments itself by one unit at each program cycle, but there are provisions to have it make longer jumps as well. A much older definition is a table where business transactions were carried out with the help of pebbles, an abacus, or a counting board. Hence the phrases "over the counter transactions," and the like.
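
The contrast with the accumulator (q.v.) can be put in a few lines of Python. This is a conceptual sketch only: the accumulator absorbs a quantity in one carry-linked addition, while the counter must pass through every intermediate state.

    class Accumulator:
        def __init__(self):
            self.value = 0
        def add(self, n):              # one step: 354 + 100 -> 454 directly
            self.value += n

    class Counter:
        def __init__(self):
            self.value = 0
        def count_up(self, n):         # n steps: 354, 355, 356, ..., 454
            for _ in range(n):
                self.value += 1

    acc, ctr = Accumulator(), Counter()
    acc.add(354); acc.add(100)
    ctr.count_up(354); ctr.count_up(100)
    print(acc.value, ctr.value)        # 454 454: same result, very different work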

Data. Latin "things which are given" (singular datum). Generally, the material put into a computer in coded form and then acted upon by the machine. It usually refers to numerical inputs as opposed to the program which is also fed in.



Electromechanical. Electrical circuits in which mechanical parts do the actual switching, as in a relay, a household light switch, push buttons, etc. Electromechanical calculators work just like mechanical ones, except that the power is supplied by an electric motor instead of by human muscles pulling a lever or turning a crank.

Electronic. Circuits in which all switching is done by the electrons themselves, as in a vacuum tube or transistor. Because the mass of an electron is so small, electronic switches are much faster than electromechanical ones.

Floating point. Also called "scientific notation." Representing a quantity by two numbers: one showing the digits of the number, the other its magnitude. The second number tells where the decimal point is to be placed in the first number (binary point if using base two).
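
A minimal Python sketch of the two-number representation (the four-significant-digit precision is an arbitrary choice for illustration): one number carries the digits, the other tells where the point belongs.

    def to_floating(x, digits=4):
        """Split a positive number into (significand, exponent), base ten."""
        exponent = 0
        while x >= 10:                   # normalize to one digit before the point
            x /= 10
            exponent += 1
        while 0 < x < 1:
            x *= 10
            exponent -= 1
        return round(x, digits - 1), exponent

    print(to_floating(299790000.0))      # (2.998, 8),  i.e. 2.998 x 10^8
    print(to_floating(0.00054))          # (5.4, -4),   i.e. 5.4 x 10^-4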

Large-scale. Does not refer to the physical size of the machinery, even though nearly all the early large-scale computers were also physically very big. It refers rather to the fact that it can solve problems whose scale would severely tax human computers. Solving a system of forty simultaneous equations would be such a problem; it would be almost impossible to solve without an automatic computer.

Memory. Any physical device which is capable of holding the representation of a number for later retrieval and use. A memory is said to be random-access (RAM) if the time it takes to get one number from it is the same as the time to get any other number. Cyclic or serial memories (for example, magnetic tape units) require more time to get some contents than others. Some memories cannot be altered once their contents are fixed; they are called read-only memories (ROM).

Parallel. Doing more than one thing at a time. In computing, there are two contexts. A parallel data structure means that when a number is manipulated in the computer, all of its digits are handled at once. (Once again, mechanical adding machines work this way.) A parallel program structure means that more than one program step is carried out at once (cf. the ENIAC). Contrast with serial or sequential.

Program. A list of instructions which is supplied to a computer in coded form, and which causes the computer to carry out an algorithm (q.v.) that will solve a problem. Strictly speaking, a program must be of a form that a machine can "understand"; an algorithm need not be. Neither may contain any ambiguities in its expression.

Programming language. The set of symbols, plus the associated rules for combining those symbols, with which a computer program is written. Only in that abstract sense do programming languages resemble human languages. A computer "understands" a programming language when it acts the way we want it to after being programmed in that language.

Register. A physical device which accepts and stores data. In contrast to an accumulator (q.v.), when a new number is sent to a register it overwrites whatever number is already there. The word register also loosely applies to units in the central processor of a computer in which arithmetic and temporary storage are both performed.

Sequential. Doing one arithmetic operation at a time. The classic von Neumann architecture specifies a computer that transfers numbers in parallel but which executes program steps sequentially.

Serial. Handling numbers one digit at a time. Examples are the way most humans do arithmetic using a pencil and paper, or the way a seven-digit telephone number is dialed one digit at a time.
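
A sketch of that digit-at-a-time procedure in Python, with decimal digits supplied low-order first. The representation is illustrative; serial machines differed in detail.

    def serial_add(a_digits, b_digits):
        """Add two numbers one digit per step; digits given low-order first."""
        result, carry = [], 0
        for a, b in zip(a_digits, b_digits):
            total = a + b + carry
            result.append(total % 10)    # emit one digit position per time step
            carry = total // 10
        if carry:
            result.append(carry)
        return result

    print(serial_add([4, 5, 3], [1, 2, 4]))   # 354 + 421 -> [5, 7, 7], i.e. 775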



Symbolic logic. A set of symbols and rules for their manipulation, which treat logical rather than numeric values. The most common form of symbolic logic handles only the two values "true" and "false," thereby enabling it to mesh nicely with binary arithmetic, which is also two-valued. The "logic" of a computer refers informally to its internal electronic circuits. Also called Boolean algebra.

Synchronous. Operation of the computer is controlled by a central clock. A synchronous computer works like a symphony orchestra, in which the conductor coordinates all the activities of the players. All actions of the computer are done with reference to its clock cycle. By contrast, a human being does each step in solving a problem as soon as he has finished the preceding step; he does not have to wait for a specific clock cycle. The Bell Labs Model I was an asynchronous computer, but most others were (and are) synchronous.

Word. A block of physical elements in which numbers or other data are manipulated as a group. A "word-length" of ten decimal digits means that numbers of up to that many digits (to a maximum value of 9,999,999,999) can be handled as a block. Some computers have the ability to split a word in half, thereby doubling the number of storage cells in the computer, while halving their individual digit capacity. Where it is necessary to handle numbers of greater digit length than the word length, two adjacent cells may be joined together, for so-called double-precision.
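
Both tricks mentioned in the entry, splitting a word in half and joining two words for double precision, can be sketched in a few lines of Python. The ten-decimal-digit word is taken from the entry's example; everything else is illustrative.

    WORD = 10 ** 10                          # ten decimal digits per word

    def split_halves(word_value):
        """Treat one ten-digit word as two independent five-digit cells."""
        return word_value // 10 ** 5, word_value % 10 ** 5

    def join_double(high_word, low_word):
        """Treat two adjacent words as one twenty-digit double-precision number."""
        return high_word * WORD + low_word

    print(split_halves(1234567890))              # (12345, 67890)
    print(join_double(9999999999, 9999999999))   # twenty nines: the maximum value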



APPENDIX


Program Listings

Following are listings of original program codes written for the Z3 and ASCC. Each is then translated into a program for an HP 41C pocket calculator. While this calculator employs a logic quite different from that of the Zuse and Aiken machines, I have written its programs to conform as closely as possible to the logical flow of the originals. I also include a 41C program that "emulates" the operation of the Bell Labs Model I as it performed complex division. This program requires a printer connected to the 41C.
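
For readers without a 41C, the arithmetic behind that last emulation can be sketched in Python. This is the standard textbook formula for complex division in terms of real operations only, not a transcription of the Model I's (or the 41C program's) actual procedure.

    def complex_divide(a, b, c, d):
        """Compute (a + bi) / (c + di) using real arithmetic only:
        ((ac + bd) + (bc - ad)i) / (c^2 + d^2)."""
        denom = c * c + d * d
        return (a * c + b * d) / denom, (b * c - a * d) / denom

    re, im = complex_divide(3.0, 4.0, 1.0, 2.0)
    print(re, im)                          # 2.2 -0.4
    print(complex(3, 4) / complex(1, 2))   # check against Python: (2.2-0.4j)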


[The program listings themselves occupied pages 0160-0165 of the printed edition and are not reproduced in this transcription.]


