A discussion of the origins of PARC's Alto system
from Chapter 7 of "Fumbling the Future: ... "

Electron beams are not permanent. That's why the television screen goes blank when you switch it, and the electron beam, off. In fact, televised patterns must be illuminated at least thirty times a second to appear fixed; anything less produces a disturbing instability. When calligraphic technology portrayed a complex image like this page of text, the time required to trace each character prevented updating the screen often enough. Because raster technology controlled the computer image completely with switches (turning the beam on and off) instead of relying on brush strokes (drawing with the beam), it could re-light the screen much faster. Raster images appeared stable; calligraphic ones did not.

Consequently, only a bit mapped raster display, notwithstanding its voracious appetite for memory, could capture the richness of ink and paper while eliminating flicker. "Fortunately," Lampson and Thacker discovered, "surprisingly good images can be made with many fewer bits." They believed a grid of 500,000 picture elements instead of 4 million would "preserve the recognizable characteristics of paper and ink." Still, when finished, the Alto's bit map consumed nearly half of the computer's total memory. While Lampson, Thacker, and others could, and did, develop techniques to cut back on the display's demand for memory, the resources willingly dedicated to the Alto's user interface remained impressive.
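To get a feel for the arithmetic, here is a rough sketch in Python. The 500,000-pixel grid comes from the passage; the one-bit-per-pixel assumption and the 128-kilobyte main memory figure are illustrative assumptions, not figures given here.

    # Back-of-the-envelope estimate of the bit map's appetite for memory.
    pixels = 500_000                  # grid of picture elements cited by Lampson and Thacker
    bits_per_pixel = 1                # assume pure black-or-white: one switch per pixel
    bitmap_bytes = pixels * bits_per_pixel // 8     # about 62,500 bytes

    total_memory_bytes = 128 * 1024   # assumed main memory of 128 KB (not stated in the text)
    print(bitmap_bytes / total_memory_bytes)        # roughly 0.48 -- "nearly half" of total memory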

The improved mouse and bit map display augured well for PARC -- if Altos could be built inexpensively enough for everyone to have their own. Thacker accurately predicted the Alto's total memory would cost only $35 in the early 1980s. But in 1972, it went for $7,000. He and Lampson had to find other ways to save money.

In one respect, the choice of a raster display helped because large scale television manufacturing made raster monitors cheap to purchase. Lampson and Thacker also economized by stripping the Alto's arithmetic functions to a bare minimum. And the abandonment of timesharing yielded up some savings since protecting users against one another became unnecessary. Finally, Lampson and Thacker knew PARC would assemble its own machines, eliminating the expense of labor, administrative overhead, and profit included in the price of another company's products.

All these measures combined, however, were insufficient to meet the Alto's cost objective. Minicomputer performance specified more hardware than the Alto budget could tolerate. Moreover, in light of the history of computer engineering, sophisticated facilities like the bit mapped display should have added, not reduced, the circuitry in the system.

Computers consist of four major parts: input, central processor, memory, and output. Data and instructions are transmitted from an input device such as a keyboard to memory. The central processor fetches the data and instructions from memory, and it executes the required computation. The central processor then dispatches the results to an output mechanism like a screen or a printer.

This scheme has an added wrinkle. The central processor itself comprises two subunits. Its "arithmetic and logic unit" manipulates the data to produce computing results; its "control unit" keeps order throughout the system, much as an air traffic controller directs takeoffs and landings to prevent collisions.
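A minimal sketch of that round trip, in Python, may make the division of labor concrete. The instruction names and memory contents below are invented for illustration; this is the generic fetch-and-execute cycle described above, not the Alto's actual instruction set.

    # Toy model: the control unit fetches each instruction from memory, the
    # arithmetic and logic unit computes, and the result goes to an output device.
    memory = [("ADD", 2, 3), ("ADD", 10, 4), ("HALT",)]   # input already placed in memory

    def alu(op, a, b):
        # arithmetic and logic unit: manipulates data to produce results
        if op == "ADD":
            return a + b
        raise ValueError(op)

    pc = 0                              # the control unit's place-keeper
    while True:
        instruction = memory[pc]        # control unit: fetch from memory
        pc += 1
        if instruction[0] == "HALT":
            break
        print(alu(*instruction))        # output mechanism: display the result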

Control units in first generation computers were required to direct more traffic than is typical today. They did virtually all of the electronic processing necessary to operate each of the computer's other subsystems: arithmetic and logic, memory, input, and output. This slowed down processing.

Recall that the central processor carries out a discrete step with each separate clock beat, or cycle. More work means more steps; more steps mean more cycles; more cycles, or clock beats, mean more time to completion. Because the central processor was burdened with the jobs of input and output as well as arithmetic, logic, and memory, computing results took longer to obtain. Furthermore, to minimize the demand for cycles, input and output devices had to remain relatively primitive.

Engineers tackled this dilemma during the 1950s and 1960s by adding processing circuitry directly to input and output accessories. Successive layers of hardware assumed more and more of the workload until, in the most advanced systems, input and output devices incorporated their own processors and memory, freeing the central processor's control unit of all responsibilities other than general coordination. Since the central processor could devote more cycles to memory, arithmetic, and logic, and less to input and output, computer systems ran faster. In addition, the added circuitry paved the way for more advanced input and output technology including keyboards, disk drives, and displays.

Of course the extra circuitry also cost a lot of money. Chuck Thacker realized a return to the concept of sharing a processor's cycles with input and output would reduce the parts bill for an Alto. The dilemma was how to cut back on hardware without sacrificing features like the bit map display that required access to powerful electronics - in other words, how to subtract circuitry while adding capability. A neat trick.

Thacker says, "The solution just came to me. It was an 'ah ha' experience."

His innovation, called "multitasking," effectively turned one processor into many. He wired the control unit of the Alto's central processor to take its instructions from up to sixteen different sources, or "tasks," instead of the usual one. Among these tasks were the bit map display, the mouse, the disk drive, the communications subsystem, and the user's program. The tasks were assigned priorities: if two or more of them signaled a request, the one with the highest rank took possession of the processor. When the display was in control, it was the display's processor; when the disk drive had precedence, it was the disk drive's processor; when the mouse took charge, it was the mouse's processor; and so on.

The instructions themselves controlled traffic. Each instruction contained information about its successor. Say the user's program was in control of the processor. As each instruction was processed, the ensuing instruction was fetched automatically from memory and signaled a request to continue using the processor. If this next instruction still had top priority, it too got processed, and the pattern was repeated. But when an instruction for the user's program ran into competition, for example, from the higher ranking bit map display, the user's instruction moved to an electronic warehouse while the display's instruction took hold of the processor. The user's instruction stayed in the warehouse until its request for the processor once again held the highest priority. At that point, the computation of the user's program continued exactly where it had left off.

Instructions for all sixteen tasks were subject to the same rules. If they had priority, they got processed; if not, they were maintained in the warehouse. As a result, the instructions for the display, disk drives, mouse, user's program, and other tasks were executed in the proper order regardless of when, and for how many consecutive cycles, they had control of the processor.
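A rough sketch of the scheme, in Python, may help. The fixed priorities and the "electronic warehouse" of saved places come from the passage; the task names, numbers, and loop below are illustrative only, not the Alto's actual microcode.

    # Illustrative model of multitasking: one processor, several tasks, each with
    # a fixed priority and a saved place ("warehouse") recording where it left off.
    priorities = {"display": 3, "disk": 2, "user_program": 1}   # higher number wins
    saved_place = {name: 0 for name in priorities}              # the electronic warehouse
    requesting = {"display": False, "disk": False, "user_program": True}

    def run_one_cycle():
        ready = [name for name, wants in requesting.items() if wants]
        if not ready:
            return
        owner = max(ready, key=priorities.get)      # highest-ranking request takes the processor
        saved_place[owner] += 1                     # execute one instruction, resuming where it stopped
        print(f"cycle owned by {owner}, now at instruction {saved_place[owner]}")

    for cycle in range(6):
        requesting["display"] = (cycle % 3 != 2)    # display asks for two of every three cycles
        run_one_cycle()

In this toy run, the user's program advances only on the cycles the display leaves free, yet each time it picks up exactly where it stopped; the proper order is preserved no matter how the cycles are interleaved.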

Multitasking provided more functionality for less cost. Priority control of the Alto's powerful central processor meant the computer's input and output facilities could perform sophisticated feats without their own circuitry. Total system hardware requirements dropped by a factor of ten. The parts bill for an Alto ran just over $10,000, about 60 percent less than was spent on the components for a minicomputer.

Multitasking did slow down the Alto. The bit mapped display controlled the processor two-thirds of the time, leaving the rest of the system just one out of every three clock cycles to complete its work. Therefore, instructions and data took three times longer than normal to compute. The delays, however, were measured in microseconds. Furthermore, unlike timesharing, the speed of the Alto was thoroughly predictable. As one of PARC's researchers would later declare to the general applause of his colleagues, "The great thing about the Alto is that it doesn't run faster at night."
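In round numbers (the two-thirds share comes from the passage; the cycle time below is an assumed figure purely for illustration):

    # With the display owning two of every three cycles, the rest of the system
    # effectively runs at one-third speed: an instruction needing one cycle of
    # processor time now spans three cycles of wall-clock time.
    cycle_time_s = 1e-6                                # assumed cycle time, for illustration only
    time_per_instruction = 3 * cycle_time_s            # three clock beats per instruction executed
    extra_wait = time_per_instruction - cycle_time_s
    print(extra_wait)                                  # 2e-06 seconds: a delay measured in microseconds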

In November of 1972, with the Alto's design in hand, Thacker went to work on a prototype. He was joined by Ed McCreight and Larry Clark. Thacker says, "The hardware business is not like software. In software, you get immediate feedback. In hardware, there is a fairly long, unkind period when you have no idea if this pile of junk is going to work!"

By that measure, assembling the first Alto was exceedingly kind. "It worked just the way it was supposed to," recalls Thacker. "It was the most satisfying hardware system I had done, working essentially the first time. And that was because its design was so simple."

It took under four months to build the system. The first picture displayed on the new bit mapped screen was the Sesame Street Cookie Monster, which had been programmed as a test pattern by a member of Alan Kay's group. Says McCreight, "I hadn't really imagined what the bit map display would be like until I saw it running Alan's Cookie Monster. He had digitized two frames, and by switching back and forth, we got to see the Cookie Monster eating a cookie."

Thacker, too, was thrilled. "I remember checking out the display. It was late at night. There were only the three of us standing around and we were, of course, overjoyed. The display worked. You could see it! We all knew intellectually that this thing was going to be neat. But until it worked, we hadn't truly internalized what it would mean to be able to put pictures on the screen and change them on the fly. That made it all much more real. Later, people would walk by, and we'd just point, like the proud parents outside the maternity room in a hospital."