Even so, this was an authentic general-purpose digital computer, a device traditionally associated with air-conditioned sanctums and operation by a technical elite. The Altair's maker, counting on the curiosity of electronics hobbyists, hoped to sell a few hundred. Instead, orders poured in by the thousands, signaling an appetite that, by the end of the century, would put tens of millions of personal computers in homes, offices, and schools around the world. Once again, the greatest productivity tool ever invented would wildly outstrip all expectations.
When the programmable digital computer was born shortly before mid-century, there was little reason to expect that it would someday be used to write letters, keep track of supermarket inventories, run financial networks, make medical diagnoses, help design automobiles, play games, deliver e-mail and photographs across the Internet, orchestrate battles, guide humans to the moon, create special effects for movies, or teach a novice to type. In the dawn years its sole purpose was to reduce mathematical drudgery, and its value for even that role was less than compelling. One of the first of the breed was the Harvard Mark I, conceived in the late 1930s by Harvard mathematician Howard Aiken and built by IBM during World War II to solve difficult ballistics problems. The Mark I was 51 feet long and 8 feet high, had 750,000 parts and 500 miles of wiring, and was fed data in the form of punched cards—an input method used for tabulating equipment since the late 19th century. This enormous machine could do just three additions or subtractions a second.
A route to far greater speeds was at hand, however. It involved basing a computer's processes on the binary numbering system, which uses only zeros and ones instead of the 10 digits of the decimal system. In the mid-19th century the British mathematician George Boole devised a form of algebra that encoded logic in terms of two states—true or false, yes or no, one or zero. If expressed that way, practically any mathematical or logical problem could be solved by just three basic operations, dubbed "and," "or," and "not." During the late 1930s several researchers realized that Boole's operations could be given physical form as arrangements of switches—a switch being a two-state device, on or off. Claude Shannon, a mathematician and engineer at the Massachusetts Institute of Technology (MIT), spelled this out in a masterful paper in 1938. At about the time Shannon was working on his paper, George Stibitz of AT&T's Bell Laboratories built such a device, using strips of tin can, flashlight bulbs, and surplus relays. The Model K, as Stibitz called it (for kitchen table), could add two bits and display the result. In 1939, John Atanasoff, a physicist at Iowa State College, also constructed a rudimentary binary machine, and unknown to them all, a German engineer named Konrad Zuse created a fully functional general-purpose binary computer (the Z3) in 1941, only to see further progress thwarted by Hitler's lack of interest in long-term scientific research.
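Shannon's insight can be illustrated in modern terms. The sketch below (a present-day analogy, not period hardware) builds the kind of two-bit addition Stibitz's relay device performed out of nothing but Boole's three operations—"and," "or," and "not"—each standing in for a switch:

```python
# Boole's three basic operations, each modeling a two-state switch (0 = off, 1 = on).
def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def NOT(a):
    return 1 - a

def half_adder(a, b):
    """Add two bits using only AND, OR, and NOT; return (sum_bit, carry_bit)."""
    # The sum bit is exclusive-or, composed from the three primitives:
    # (a OR b) AND NOT (a AND b)
    sum_bit = AND(OR(a, b), NOT(AND(a, b)))
    # The carry bit is simply a AND b.
    carry_bit = AND(a, b)
    return sum_bit, carry_bit

# Exhaustively check all four input combinations, as a relay circuit would behave:
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} + {b} -> sum={half_adder(a, b)[0]}, carry={half_adder(a, b)[1]}")
```

Chaining such one-bit adders—whether as relays, vacuum tubes, or transistors—is all that is needed to add numbers of any size, which is why the switch-based binary approach opened the way to vastly faster machines.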