The year 2001. Can we even think of it, the real beginning of the next millennium, without remembering the cinematic 2001 of the late Stanley Kubrick? Curiously persistent, this film continues to haunt our hopes for the future. This is especially so in the area of computer technology. Back in 1968, 2001: A Space Odyssey gave us some glimpses, both optimistic and sobering, of how computers might affect our lives. Central to that sprawling science fiction epic was a computer named HAL. "He" (and we realize our anthropomorphic tendencies here) closely monitored all systems, including astronauts, on a spaceship bound for Jupiter. He also ended up killing all but one of the crew in a chillingly methodical manner.

HAL continues to hold our imaginations in a tight grip. Despite subsequent films with more realistic forecasting of computer hardware (War Games, Tin Man, The Net, Tron, etc.), none of these has received nearly the attention given to HAL and 2001. A recently published book on the design of artificial intelligence pays homage with the title HAL's Legacy: 2001's Computer as Dream and Reality (Stork, 1997). In various articles, HAL's image is brought up as a sort of benchmark to gauge how far we have progressed in developing higher-order thinking in computers (Garfinkel, 1997; Midbon, 1990). The computer "incapable of error" was even featured in an Apple television commercial during the 1999 Super Bowl.

Why HAL? Why such ongoing interest in a film now over thirty years old? If the hardware of 2001 is by today's standards dated, then what accounts for both the widespread recognition of, and fascination with, HAL?