After the Apollo 11 crew returned, President Nixon presented them with the Medal of Freedom for their work. He also gave an award to the young flight controller, Steve Bales, who had made the no-abort call on the program alarms. The citation commended Bales's ''decision to proceed with the lunar landing when computers failed.''24 While acknowledging that not every critical decision that day was made in the LM itself, the wording also blamed the machine.
Despite the subtle nature of the errors, public discourse framed the episode as fallible machinery versus skilled, heroic pilots. The press took to celebrating the human factor. Datamation magazine reported that Apollo 11 proved that ''mere mortals showed they can still put the computer to shame,'' and took the LGC to task for not being state-of-the-art in 1969.25 Electronic Design incorrectly reported that Armstrong ''seized the manual controls of the lunar module,'' and ran an article, ''The Indispensable Man,'' in which it interviewed engineering and scientific experts, including Isaac Asimov, on the importance of the human role in complex missions.26 The popular press proved even more gushing, hailing a victory of human performance over the impersonal forces of science and technology.
Was the culprit a software bug? Not in the sense that there was a particular error in the programming. In fact, the IL's antibug strategies—testing, restart protection, code inspections—prevented the computer from crashing at all. IL engineers felt that they were unfairly blamed for the program alarms. In a 1973 paper Eyles attributed the problem to ''excessive interface activity'' on the part of the astronauts (Aldrin's calls to monitor DELTAH).27 ''The software actually saved the program,'' Fred Martin recalled recently, ''because it, in the face of this mistake in the switch... was able to go on with the highest priority jobs and not tank the mission.''28 To this day it galls Dick Battin when people refer to the program alarms as computer errors.29
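Martin's point, that the computer kept running its highest-priority jobs and shed the rest, can be sketched in miniature. The following toy model is an illustration only, not the actual AGC Executive; all job names, priorities, and costs are invented for the example:

```python
# Toy model of priority-based job shedding under overload, loosely
# inspired by the AGC Executive's behavior during the Apollo 11
# program alarms. All names and numbers here are illustrative.

def run_cycle(jobs, capacity):
    """Run the highest-priority jobs that fit within `capacity`
    time units; shed the rest.

    jobs: list of (priority, cost, name); higher priority runs first.
    Returns (completed, shed) lists of job names.
    """
    completed, shed = [], []
    remaining = capacity
    for priority, cost, name in sorted(jobs, reverse=True):
        if cost <= remaining:
            remaining -= cost
            completed.append(name)
        else:
            shed.append(name)  # overload: drop lower-priority work

    return completed, shed

# Overload scenario: spurious radar-counter activity steals cycles,
# but the critical guidance and throttle jobs still complete.
jobs = [
    (10, 40, "guidance"),
    (9, 20, "throttle"),
    (5, 30, "display-update"),
    (1, 30, "radar-counters"),
]
done, dropped = run_cycle(jobs, capacity=85)
print(done)     # the highest-priority jobs that completed
print(dropped)  # the lower-priority work that was shed
```

In this sketch the overload degrades gracefully: low-priority display and counter work is dropped while the landing-critical jobs finish, which is the behavior Martin credits with saving the mission.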
Ironically, despite the attention that the program alarms brought to the computer, one potentially fatal problem did reside in the software that was not noticed until after Apollo 12. As with the program alarms, it was not a programming error but a problem of data exchange. Reviewing flight data, Grumman engineers noticed a ''castleing'' effect in the engine's thrust commands—dynamic variations, as large as 25 percent of total thrust, that made the plots of the thrust commands resemble the top of a castle's turret. Analysis showed that an incorrect set of parameters made the servo that controlled the automatic throttle only marginally stable, which could have caused it to oscillate wildly under certain descent conditions. Only a second error in programming a related constant prevented the unstable throttle from causing a catastrophe. The problem was fixed by Apollo 14, but behind the scenes the IL and NASA engineers recognized that hidden dangers could lurk in program code.30
From a systems viewpoint, it is no coincidence that the program alarms (and the castleing problem) arose at the interfaces between different pieces of hardware. Each was an interface problem between components of a system made by different organizations. Software in the LM's control computer, the human-built glue that held the system together, highlighted relationships (successful and buggy) between groups of people.
IL engineer Hugh Blair-Smith takes an even larger view. He allows that the Apollo 11 program alarms could be called a software problem, but only if one realizes that ''the crew procedures are part of the software, as are the ground procedures.''31 Apollo crewmen followed carefully written ''programs,'' in the form of their timelines, checklists, abort criteria, and mission rules. These programs governing people's behavior were as important as the programs controlling the computer, and similarly embodied assumptions and links between organizations (recall that Apollo overall was called a program). In the human-machine system of Apollo, it often was not possible to distinguish between instructions for machines and instructions for people.
Indeed, every problem on the Apollo 11 landing stemmed from miscommunication, incorrect documentation, or failure to pursue and track minute details in a complex, unforgiving system. Yet, in a pattern that began on Apollo 5, NASA and the press found it easier to blame ''the computer'' and to narrate the successful landing as a triumph of humanity over the machine. This simpler, less anxious story affirmed the political goals of the Apollo program and did not require acknowledging the bugs in the program's vast and impressive human-machine system.