RECKONERS
THE PREHISTORY OF THE DIGITAL
COMPUTER, FROM RELAYS TO THE
STORED PROGRAM CONCEPT,
1935-1945
Paul E. Ceruzzi
Library of Congress Cataloging in Publication Data
Ceruzzi, Paul E.
Reckoners.
(Contributions to the study of computer science,
ISSN 0734-757X ; no. 1)
Bibliography: p.
Includes index.
1. Electronic digital computers-History. I. Title.
II. Series.
QA76.5.C4164 1983 621.3819'58'09 82-20980
ISBN 0-313-23382-9 (lib. bdg.)
Copyright © 1983 by Paul E. Ceruzzi
All rights reserved. No portion of this book may be
reproduced, by any process or technique, without the express
written consent of the publisher.
Library of Congress Catalog Card Number: 82-20980
ISBN: 0-313-23382-9
ISSN: 0734-757X
First published in 1983
Greenwood Press
A division of Congressional Information Service, Inc.
88 Post Road West
Westport, Connecticut 06881
Printed in the United States of America
10 9 8 7 6 5 4 3 2 1
Preface
Human agents will be referred to as "operators" to distinguish them from
"computers" (machines).
-George Stibitz, 1945
The modern digital computer was invented between 1935 and 1945. That was the
decade when the first machines that could be called true digital computers
were put together. This book tells the story of that invention by looking at
specific events of the 1930's and 1940's that show the computer taking its modern
form.
Before 1935 there were machines that could perform calculations or otherwise
manipulate information, but they were neither automatic nor general in
capabilities. They were not computers. In the 1930's the word computer meant a human
being who calculated with the aid of a calculating machine. After 1945 the
word meant a machine which did that. From that time on computers have continued
to evolve and improve, becoming dramatically cheaper and smaller, but their design
has not really changed. So the story of what happened in that ten-year
period will reveal quite a bit of the entire history of the computer as it is
known today.
I have chosen four projects from that era that best illustrate how the
computer was invented. These are by no means all that happened, but they are
representative of the kinds of activities going on.
The first is the set of electromechanical computers built in Germany by
Konrad Zuse, who because of the war had no knowledge of similar activities in
America and England. His independent line of work makes for an interesting and self-
contained case study of just how one goes about building a computer from
scratch.
The second is the Harvard Mark I, built by Professor Howard Aiken and first
shown to the public in 1944. This machine was one of the first truly
large-scale projects, and because it was well publicized it served notice to the world
that the computer age had dawned.
The third project is the series of relay computers built by George Stibitz
of the Bell Telephone Laboratories between 1939 and 1946. These machines represented
the best that could be done with electromechanical devices (telephone relays),
and as such mark the end of that phase of invention and the beginning of another.
The final project is the ENIAC, the world's first working electronic
numerical computer, which used vacuum tubes for its computing elements
and operated at electronic speeds. With its completion in late 1945 all of
the pieces of the modern computer were present: automatic control, internal
storage of information, and very high speed.
What remained to be done after 1945 was to put those pieces together in a
practical and coherent way. From the experience of building and using those machines
there came a notion of what a computer ought to look like.
The old definition of a computer gave way to the modern one:
a machine capable of manipulating and storing many types of information at
high speeds and in a general and flexible way. How this notion came about,
and especially why the notion of storing the computer's program of instructions in the
same internal memory as its data gained favor, are also examined.
This book has a dual purpose. The first is to recount the history of the
computer, emphasizing the crucial decade between 1935 and 1945 but including earlier
events and more recent trends as well. The second is to explain in simple terms the
fundamentals of how those computers worked. Computing has certainly changed since 1945,
but the basic concepts have not; I feel that it is easier to grasp these concepts as
they were present in earlier, slower, and much simpler computers. I have included brief
explanations of some of these concepts in the text of the book; a glossary at the end
gives short definitions of many terms of modern computing jargon.
That the computer is having a profound effect on modern life is hardly at
issue. Just how and why such a change is happening because of computers can be
better understood with a grasp of how this technology emerged.
I wish to thank the following persons and institutions for their help with
the researching and writing of this book: the Society for Mathematics and Data Processing,
Bonn; the Charles Babbage Institute, Minneapolis; the Linda Hall Library, Kansas City, Mo.;
the Baker Library, Dartmouth College; and Professors Jerry Stannard, Walter Sedelow, and
Forrest Berghom of the University of Kansas. Konrad Zuse, Helmut Schreyer,
and George Stibitz supplied me with personal archival materials and criticized
portions of the manuscript. I also wish to thank Bill Aspray, Gwen Bell, and Nancy Stern,
who likewise read portions of the manuscript and gave me helpful advice. Any errors or
statements of opinion are of course my own.
1
Background
All the other wonderful inventions of the human brain sink pretty nearly
into commonplaces contrasted with this awful mechanical miracle. Telephones,
locomotives, cotton-gins, sewing-machines, Babbage calculators, Jacquard
looms, perfecting presses, all mere toys, simplicities!
-Mark Twain, 1889
Computers have appeared so rapidly on the modern scene that it is hard to
imagine that they have any history at all. Yet they do, and it is a most
interesting history indeed. The computer has been around now for at least thirty years,
maybe more. But there was a much longer period of activity before its
invention when important preliminary steps were taken-a period of the computer's "pre-
history." It is much more than just the story of specific machines and what
they did, although that is at the center of this study. It is also a part of the
history of mathematics and science-of the story of how mankind acquired a perception of
quantities, and of the long, slow acquisition of the ability to code and
manipulate quantities in symbolic form.
The history of computing, if not the history of the digital computer, could
begin at the dawn of civilization, when people first sought to measure and keep
track of surpluses of food, the management of which freed them from the anxiety of
daily survival. The first mechanical aid to calculation was probably a device
like the modern abacus, on which numerical quantities were represented by the
positions of pebbles or beads on a slab. (The modern version, with its beads strung on
wires, has been around at least a thousand years and is still in common use
in many parts of the world.)
Or one could begin the history of the computer with Blaise Pascal, who in
1642 built a mechanical adding machine that performed addition automatically.
(Like the abacus, it too survives-the cheap plastic totalizer sold in
supermarkets to keep track of a grocery bill works the same way.)
But the computer is something more than just a sophisticated adding machine,
even though at the heart of every computer is something like Pascal's automatic
adder. The computer is a system of interconnected machines which operate
together in a coherent way. Only one of those pieces actually does
arithmetic. A computer not only calculates: it also remembers what it has just
calculated, and it automatically knows what to do with the results of those calculations.
And numbers are not the only kind of information a computer can handle.
Letters and words can be coded into numbers and thus made grist for the computer's mill;
the same holds true for photographs, drawings, maps, graphs:
any information which can be symbolically represented. A computer is a machine
which is capable of physically representing coded information (numbers,
words, or whatever) and which is further capable of manipulating those values in
any combination or sequence one desires. Numerical calculation is of course one
form of symbol manipulation, but it is by no means the only one.
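A tiny sketch may make this concrete. The fragment below is my own illustration in Python, not anything from the period; the word and the letter code are invented for the example. It codes the letters of a word as numbers, and once that is done a "non-numeric" task such as alphabetizing the letters becomes ordinary arithmetic on those codes.

    # Code each letter as a number: A=1, B=2, ..., Z=26 (an arbitrary code
    # chosen for this example; any consistent code would do).
    message = "RECKONER"
    codes = [ord(ch) - ord("A") + 1 for ch in message]
    print(codes)                  # [18, 5, 3, 11, 15, 14, 5, 18]

    # Sorting the letters is now just sorting their numerical codes.
    alphabetized = "".join(chr(c + ord("A") - 1) for c in sorted(codes))
    print(alphabetized)           # CEEKNORR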
By this definition, the history of the computer really begins sometime after
1935, when machines with such general and automatic capabilities first
appeared. It is that emergent period of their history-beginning around 1935 and ending
about ten years later-that this book examines.
The invention of the computer was the result of a convergence of a number of
different social, technical, and mathematical traditions.1 Some of those
traditions are external: the increasing need by governments for statistical
information, for example, brought on in the United States by political developments such as
the Progressive Era and the New Deal. Other traditions are more internal to the
computer: they are literally present as components of the machine itself.
The tradition of mechanizing arithmetic has already been mentioned: its
legacy may be found in the "central processor" of any modern computer. The
tradition of building devices that control a sequence of mechanical motions can be
traced back to medieval cathedral clocks, where mechanical cams directed an
elaborate sequence of movements at the striking of every hour. (The cuckoo clock is a
much simplified version; one of the most elaborate examples is the cathedral
clock at Strasbourg, France.) An electronic version of that mechanism, too,
is present in a modern computer.
A computer directs a sequence of activities, senses what it has already
done, and modifies its future course of action accordingly. That kind of
intelligence has its roots in medieval devices such as windmills which automatically
adjusted the pitch of their vanes to adapt to the prevailing winds.2 Another
example of such a device is the fly-ball speed governor that James Watt attached to his steam
engine, turning that machine into a reliable source of steady power.
Finally, there is a long tradition of techniques that record information;
certainly the invention of writing itself, and of printing with movable type
in the 15th century in Europe are part of it. The idea of using holes punched in
pieces of cardboard to represent and manipulate data was first successfully employed
by Herman Hollerith for the 1890 United States Census. (Punched cards had
earlier been used in the silk weaving industry to control the pattern made by a
loom, but that belongs more to the tradition of automatic control than to that of data
storage.) Punched cards are still the backbone of modern data storage, although
much faster electronic devices are used in a computer's main store. More
modern storage techniques have appeared which may displace the punched card entirely; for
example, the use of bar-codes to identify items in a supermarket.
Each of those traditions forms a separate block that goes to make up a
computer. They are revealed whenever a computer is described by a block diagram that
groups its functions together. But each tradition developed separately from the
others, with a few exceptions-until the 1930's, when they merged and continued on as one.
Despite the general capabilities of computers, their immediate ancestors
were machines that performed arithmetic. But their enormous impact on modern life
is due less to their numerical abilities than to their ability to sort and
handle large amounts of data, and to their ability to direct their actions
intelligently. Nevertheless they bear the legacy of their "calculating" ancestry.
The word itself reveals that legacy: before 1935 a "computer" often meant a human being who
evaluated algebraic expressions with the aid of a calculating machine. That person
(who was often a woman-the job of "computing" was thought of much like typing)
would be given a mathematical expression-say a formula that evaluated the
solution of a differential equation-plus the numerical values that were
plugged into that formula. Depending on the "computer's" prior mathematical
training, the instructions given to him or her to evaluate the expression would be
more or less detailed.
These workers had the use of mechanical calculators which performed
addition, subtraction, multiplication, and division to an accuracy of eight to ten
decimal digits. (These machines are still in common use, as are adding machines that only
add or subtract.) The calculators were driven by turning a crank or by a small
electric motor, but all arithmetic done in them was mechanical. The person had one other
important aid to computation: a pencil and sheets of paper, on which intermediate results
were written down for use later on. With only a few exceptions, the mechanical
calculators could not store intermediate results.
Taken together, the person, the calculator, the pencil and "scratch" paper,
and the list of instructions formed a system which could solve a wide range of numerical
problems, as long as the solutions to those problems could be specified as a series of
elementary steps. That human and mechanical system was precisely what the first digital
computers replaced.3
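That system of person, calculator, instructions, and scratch paper can be sketched in a few lines of modern code. The Python fragment below is my own illustration, not anything from the period; the instruction format and the sample formula are invented for the example. A "calculator" works on only two numbers at a time, a list holds the elementary steps, and a dictionary stands in for the sheet of scratch paper.

    def calculator(op, a, b):
        """The desk calculator: one arithmetic operation on two numbers at a time."""
        return {"add": a + b, "sub": a - b, "mul": a * b, "div": a / b}[op]

    # Evaluate x*x + 3*x + 2 for x = 7, one elementary step at a time.
    scratch = {"x": 7.0}                      # the sheet of scratch paper
    instructions = [                          # the list handed to the "computer"
        ("mul", "x", "x", "t1"),              # t1 = x * x
        ("mul", "x", 3.0, "t2"),              # t2 = 3 * x
        ("add", "t1", "t2", "t3"),            # t3 = t1 + t2
        ("add", "t3", 2.0, "result"),         # result = t3 + 2
    ]

    for op, a, b, dest in instructions:
        a_val = scratch[a] if isinstance(a, str) else a   # read an operand
        b_val = scratch[b] if isinstance(b, str) else b
        scratch[dest] = calculator(op, a_val, b_val)      # write the result down

    print(scratch["result"])                  # 72.0

The first digital computers replaced exactly this loop: the list of instructions became the program, the scratch paper became the memory, and the desk calculator became the arithmetic unit.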
By 1945 the definition of the word computer had changed to reflect this
invention: a computer was no longer a human being but a machine that solved computing
problems. The definition of calculator remained unchanged: a device which could
perform the four ordinary arithmetic operations, working with no more than two numbers at a
time. (The modern definitions are in the same spirit but a little different; see
Glossary B.)
After 1945 the evolution of computing technology followed a single line to
the present. With the end of the Second World War the many computer projects that were
in progress around the world became known and were publicized to varying degrees.
Conference reports and other written descriptions of the first computers
became templates for computer designs thereafter.4 The ideas and writings
of one man, John von Neumann, were especially influential, so much so that even today
computers are said to have a "von Neumann type" architecture.
So after 1945 there was a more common understanding of the nature of this
new invention, what it could do, and what overall structure ("architecture") it
should have. There was still little agreement on what its components should be,
especially for its memory, but by 1950 it was clear that a computer had to be made of high-speed
electronic components (such as vacuum tubes), and that it should be organized into
functional units that performed the operations of storage, arithmetic, input, and output. Since
1950 those decisions have not changed, even though the components that make up a
computer have. So the analogy of a person working with a desk calculator and some scratch
paper still holds true, although everything has gotten much more complicated at every stage.
In the following chapters I have chosen four case studies of computer
projects undertaken between 1935 and 1945 which I feel best illustrate the emergence of the new
technology. These case studies by no means exhaust the subject, but the details
encountered represent the major issues well.
The first project examined was also among the first to be completed. It was,
surprisingly, not in England, where Charles Babbage had proposed building an Analytical Engine a
century before, nor was it in the United States, where sophisticated calculating
and punched card equipment was being extensively used, but rather in Germany, where the
story may be said to have really begun. Konrad Zuse, an engineering student at the Technical
College in Berlin, began looking for ways to ease the drudgery of calculations required for
his coursework--drudgery which the slide rule and the desk calculator could not relieve.
Zuse was not well versed in the calculating-machine technology of his day (which may have
been a blessing), but by the end of the war in 1945 he had not only designed and built
several working automatic computing machines, but he had also laid the foundations for a
theory of computing that would independently emerge in America and Britain a decade
later. Zuse's electromechanical devices were the first that could be programmed to do
sequences of calculations, so they will be examined first.
In America, a similar idea occurred to a Harvard physics instructor named
Howard Aiken, who had faced long and tedious calculations for his graduate thesis. The
result was a computing machine that used the components of standard punched card
equipment that the IBM Corporation had developed. Aiken's "Mark I" was the first large
computing device to be made public (in 1944), and as such its name is appropriate-it marks the
beginning of the computer age, despite the fact that it used mechanical components and a
design that soon would become obsolete.
The third example is the work of Dr. George Stibitz and his colleagues at
the Bell Telephone Laboratories in New York, where a series of computers was built
out of telephone relays, as were Zuse's machines. These machines, too, would soon
become obsolete, but their design and the programming done on them contributed
much to the mainstream that followed.
Finally I look at the ENIAC: the first computer that could carry out,
automatically, sequences of arithmetic operations at electronic speeds. It was completed
in late 1945, and from that time onward the age not only of computing but also of high-speed
electronic computing had begun.
For each of the machines examined in the next four chapters I hope to
establish the following data:
- how the machine was conceived, designed, and constructed;
- how it was programmed and operated; and
- to what practical use, if any, it was put.
(Actually, for all the experimental computers of that day the design specifications
were never really "frozen" for long. Someone was always improving and modifying them. I have
tried to establish those vital statistics for the machines when they first began
solving mathematical problems.)
Where it has been feasible, and in the Appendix, I have included sample
programs. The reader who is unfamiliar with modern computer programming may follow these
samples; the terms and details of each one are defined and explained as they are
introduced. Several early programs have also been rewritten for a pocket programmable calculator, for the reader who wishes to get a better feel for just what the prehistoric computers could
do.
The public dedication of the ENIAC in 1946 marked the dawn of the
electronic computer age; actually it was more like the herald of the dawn. The ten years from
1935 to 1945 saw the convergence of various traditions to make the computer; the ten years
following that saw both a continuation of the projects begun during the war, and an
intensive study of the theory of computing itself: not so much how to build a computer as how one
ought to build a computer. This activity was visible in the conferences, reports,
memorandums, lectures, and short courses in computing that were held throughout America and
Europe. John von Neumann was one central figure; others who contributed to this phase of
activity were D. R. Hartree, Alan Turing, and Maurice Wilkes in England; Howard Aiken and
George Stibitz in America; and Konrad Zuse, Eduard Stiefel, and Alwin Walther in
continental Europe, to mention only a few.
What they accomplished can be summed up in a few words: the computer, as
before, was seen as a device that did sequences of calculations automatically, but more
than that, it was seen as not being restricted to numerical operations. Problems such as
sorting and retrieving non-numeric information would be just as appropriate, and in fact, from a
theoretical standpoint, even more fundamental to computing than numerical problems.
Second, they realized that the instructions that told a computer what to do
at each step of a computation should be kept internally in its memory alongside the
data for that computation. Both would be kept in a memory that gave access to any
data or instructions at as high a speed as possible. This criterion allowed the
execution of steps at speeds that matched those of the arithmetic unit of the
machine, but it also allowed for more than that. The data and the
instructions were stored alongside one another because they were not really different
entities, and it would be artificial to keep them separate. An understanding of that startling
fact, once implemented, made the computer not just a machine that "computed"
but one which also "reckoned"--it made decisions and learned from its
previous experiences. It became a machine that could think, at least in the way that
many human beings had defined thinking.
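A toy example may help fix the idea. The sketch below is my own illustration; the three-instruction machine and its memory layout are invented and correspond to no particular historical design. Instructions and data sit side by side in one flat memory, so an instruction can be fetched, or rewritten, exactly as a number can.

    # One flat memory: cells 0-3 hold instructions, cells 10-12 hold data.
    memory = {
        0: ("LOAD", 10),     # copy the contents of cell 10 into the accumulator
        1: ("ADD", 11),      # add the contents of cell 11 to the accumulator
        2: ("STORE", 12),    # write the accumulator back into cell 12
        3: ("HALT", None),
        10: 5,
        11: 7,
        12: 0,
    }

    def run(memory):
        acc, pc = 0, 0                   # accumulator and program counter
        while True:
            op, addr = memory[pc]        # fetch the next instruction from memory
            pc += 1
            if op == "LOAD":
                acc = memory[addr]
            elif op == "ADD":
                acc += memory[addr]
            elif op == "STORE":
                memory[addr] = acc
            elif op == "HALT":
                return acc

    print(run(memory))                   # 5 + 7 = 12

    # Because an instruction is just another cell, the program is itself data:
    # overwrite one instruction and the same memory computes something new.
    memory[1] = ("ADD", 12)              # add cell 12 (which now holds 12) instead
    print(run(memory))                   # 5 + 12 = 17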
It was the adoption and recognition of this stored-program concept that would
make the computer's impact on society so strong in later years. That recognition was slow
in coming; the principle is even today not fully understood. How it emerged, and how it was
eventually incorporated into the design of computers after 1950, is the subject of
Chapter 6. As with the other chapters, I discuss the stored-program concept and its
implications in layman's terms.
In presenting the history of the digital computer in the years of its birth
I have several goals in mind. One goal concerns the state of the art today. The fact that
computers have such an impact on daily life should lead us to understand more of
their nature and what gives them their power. But for someone unfamiliar with the
engineering and mathematical concepts, that task can be difficult, if not impossible.
Looking at the history of the computer offers a way out of that bind. The men and women whom we shall encounter on the following pages knew nothing of "computer science" when they began
their work--such a science did not exist then. They created computer science, out of
their diverse backgrounds and from their experiences in trying to build machines which
later generations would call "milestones." By retracing some of their steps, we can learn
something of the foundations of modern computer science, for it was then that the foundation
was laid.
Another goal, closely related, is to try and get a fresh outlook on the
computing world today. It is a crazy world: a mixture of wild speculation, careful
theoretical research, technological breakthroughs every few months, fortunes made (or lost)
overnight. We frequently hear that these machines are smarter than we are, and sooner or
later they will "take over" (whatever that means). Can a computer really think? Indeed,
does it even make sense to ask a question like that? And with twenty-five dollar computer
games that speak, with computer programs that make accurate medical diagnoses (and with
others that only pretend to be psychoanalysts), it is hard to find a good vantage point from
which to survey the field and get an overall view.
By looking not at those computer programs that dazzle us today, but rather
at simpler ones that did more routine problems on the earliest machines, we can get
such a view. We shall be examining the computer before it got so complex that, in Alan
Turing's memorable words, "I suppose, when it gets to that stage, we shan't know how it does
it."5
Today we often hear the command that we must learn about computers if we
want to keep up with the pace of modern society. We hear further that computers are
bringing us a technological Utopia (at last!), but if we do not learn about them, all we
can do is forlornly press our noses against the window looking in; we may never enter. I have
always felt uncomfortable with that scenario--I do not like to be coerced into doing
something I otherwise might never have thought of doing. Nor do I feel that learning
about computers is absolutely necessary to
manage in the world today. Humans can get by without them, just as many
live comfortable lives without telephones or automobiles. Why not learn about computers
because they are inherently interesting, and because it is fun to see what makes them tick?
They are, after all, "only" creations of ordinary human beings. And learning about them can tell
us something about how we tick, as well. And that should not threaten or intimidate
anyone.
NOTES
1. Thomas M. Smith, "Some Perspectives on the Early History of Computers," in Perspectives on the Computer Revolution, ed. Zenon W. Pylyshyn (Englewood Cliffs, N.J.: Prentice-Hall, 1970), pp. 7-15.
2. Otto Mayr, The Origins of Feedback Control (Cambridge, Mass.: MIT Press, 1970).
3. George R. Stibitz, "Relay Computers," Report 171.18, U.S. National Defense Research Committee, Applied Mathematics Panel, Feb. 1945, p. 2.
4. See, for example, Arthur Burks, Herman Goldstine, and John von Neumann, "Preliminary Report on the Design of an Automatic Computing Machine," in John von Neumann, Collected Works, vol. 5 (New York: Macmillan, 1963), pp. 34-79; there were also several conferences held at Philadelphia, Harvard, and Cambridge, England, just after the dedication of the ENIAC.
5. Sara Turing, Alan Turing (Cambridge, England: W. Heffer and Sons, 1956), p. 98.