As I See It: The Legacy
April 9, 2007 Victor Rozek
An accomplished man, perhaps a great one, died recently. Certainly he was a man who exerted great influence on the computing industry and the people in it. He had one of those rare minds whose contributions helped shape the future, in his case by domesticating what was previously a primitive collection of magnetic drums, cathode-ray tubes, tape units, and card readers. By putting the power of a wizardly but cumbersome and elitist device into the hands of the many, he literally accelerated human progress. Yet most people don’t know his name.

As a boy, his destiny was far from evident. Born to wealthy parents, he had the advantage of attending a prestigious high school in Pennsylvania, but did poorly. Although his grades were modest and his attendance spotty, he managed to graduate, and in 1942 he enrolled at the University of Virginia. There, his educational woes continued. At his father’s insistence, he majored in chemistry, but his interest soon palled and his attendance became so sporadic that he was asked to leave and follow his bliss elsewhere.

Short on options and academically challenged, the young man then joined the Army, where he found modest success as a corporal in charge of an anti-aircraft crew at Fort Stewart in Georgia. The likelihood of someone attacking Georgia from the air, however, was slim, and he was spared from a life of protecting Georgian air space by taking an aptitude test. The Army decided he did indeed have a brain, at least by military standards, and sent him off to the University of Pittsburgh, where he was enrolled in a pre-engineering program.

Engineering, as things would turn out, would have been an appropriate step along his eventual path, but he was waylaid by, of all things, another aptitude test. This time the Army discovered he had medical skills and promptly plucked him from the U of P and dropped him across the state at Haverford College, where he was enrolled in the pre-med program.

But while he was working in a neighboring New Jersey hospital, his medical focus abruptly changed from study to treatment when he was diagnosed with a brain tumor. A piece of his skull was cut out, the tumor removed, and the opening covered with a metal plate. He recovered well enough to enter medical school in New York, but lasted less than a year. Alas, aptitude or not, medicine was not his cup of saline, and consistent with his other intellectual pursuits, he quit. A murky future was the least of his problems, however, when it became clear that his metal plate didn’t fit properly and replacing it would require a second operation. By the time he left the military in 1946, he finally had a tight-fitting plate, but still no clear direction for his life.

Free of his father’s influence and the tyranny of military aptitude tests, for the first time he set his own course. He trained as a radio technician and discovered an aptitude for math. Once again he enrolled in school, this time at Columbia University, from which he emerged with a master’s degree in mathematics in 1949. While still in school, however, he had occasion to tour IBM’s Computer Center in Manhattan. There, he saw one of Big Blue’s early electronic computers, the monstrous Selective Sequence Electronic Calculator (SSEC). He was hooked. He wrangled an interview on the spot, and his life changed direction again.

The SSEC was the last of the large electro-mechanical computers ever built.
It was a hybrid of 12,500 vacuum tubes and 21,400 electro-mechanical relays, and the beast sat on the ground floor of IBM’s world headquarters at 590 Madison Avenue, where it could be admired by the passing throngs. Cumbersome and difficult to program, the SSEC was used for highly specialized and esoteric work, such as calculating the position of the planets, which it did exceedingly well. The moon-position tables it produced, a project the young man worked on for two years, were used almost two decades later to plot the course of the Apollo mission to the moon.

In all, he worked on the SSEC for three years, during which time he began to grapple with the issues of programming such a device, a process which in those days had few prescriptions and even fewer tools. In the early 1950s, programming was enormously tedious. Even simple instructions required rows of binary-coded numbers organized in precise order. Programmers had to tend their systems constantly, because any error in the code would simply halt the program, which then had to be corrected and restarted. By his own admission, the young man was lazy, and he became interested in ways the process could be sped up and simplified. But it took a war and a new generation of hardware to create the confluence of conditions that would allow his vision to manifest.

With the outbreak of the Korean conflict in 1950, IBM chairman Thomas J. Watson approached government authorities and volunteered his company’s technology in service of the war effort. “What did the government need?” he wanted to know. A scientific computer, he was told, bigger and faster than anything already in existence. The military wanted computing horsepower to facilitate nuclear development, missile design, and the building of high-speed aircraft. It was the impetus for IBM’s 701 series, known as the Defense Calculator, and the stage was nearly set.

It was while he was working on the 701, writing baroque programs for computing missile trajectories, that he started work on a programming system to simplify the process. The result was something called Speedcoding, which allowed large numbers to be easily stored and manipulated, and it was the precursor to his best-known achievement.

In 1954, IBM unveiled the 704, the first mass-produced computer with floating-point hardware and magnetic-core memory in place of cathode-ray tube storage, ushering in the modern computing era. The young man with the spotty academic career was one of its designers. But to his mind, mass production created an urgent problem: “The question became, what can we do for the poor programmer?” If more people were to use computer systems, programming needed to be faster, friendlier, and automated to the degree possible.

At age 27, he wrote a letter to his boss, Cuthbert Hurd, head of the applied science department, claiming that it was possible to develop an automatic programming system for the 704 and that he wanted to be the one to do it. Hurd was no lightweight. He was responsible for hiring the legendary John von Neumann as a consultant, and he was quick to grasp the implications. He gave his blessing and provided a budget.

The young man hired a team of mathematicians and programmers and set to work. Their challenge was twofold: creating a programming language was the first step, but it would be meaningless without a “translator,” or compiler, that would turn the language into something the machine could understand. The compiler proved the greater challenge and took two years to complete.
At long last, based on what the young man called “more faith than knowledge,” he issued a report claiming that the programming language his team had designed would enable the 704 “to accept a concise formulation of a problem in terms of mathematical notation and to produce automatically a high-speed 704 program” capable of solving it. Further, the report suggested that this miracle program would run as fast as one painstakingly written in machine code. Months of testing would verify the claim. He named the language Fortran, for FORmula TRANslator.

The young man’s name was John Backus.

Amazingly, in an industry that changes with the speed of innovation, Fortran has survived for an unprecedented half century. And although it was designed for scientific and engineering applications, its relative ease of use allowed it to expand into many other arenas. As the New York Times obituary noted, Backus “devised a programming language that resembled a combination of English shorthand and algebra. Fortran . . . was very similar to the algebraic formulas that scientists and engineers used in their daily work. With some training, they were no longer dependent on a programming priesthood to translate their science and engineering problems into a language a computer would understand.” It was the first widely used higher-level language, and it forever changed the way people interact with computers.

John Backus received numerous awards in recognition of his professional accomplishments, including the IEEE W. W. McDowell Award, the National Medal of Science, and the ACM Turing Award. But perhaps his most impressive achievement was being courageous enough to quit the pursuits for which he had no passion and to plot an uncharted course of his own. His journey into the uncharted continued on March 17, 2007, when John Backus died at age 82 in his Ashland, Oregon, home. His legacy continues to unfurl.
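A postscript for readers who never saw the language: the sketch below, written in modern free-form Fortran rather than the original 1957 dialect, suggests the algebra-like style the Times describes. The program name, variables, and values are illustrative assumptions, not historical code.

    ! A hypothetical example, not code from Backus's team: a formula is
    ! written almost as it appears on paper, and the compiler, not the
    ! programmer, produces the machine code.
    program circle_area
      implicit none
      real :: r, area
      r = 2.5
      ! The next line reads like the algebra it encodes: area = pi * r**2
      area = 3.14159 * r**2
      print *, 'area =', area
    end program circle_area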