

CPSR Newsletter - Vol. 17, No. 1

Winter 1999
Vol. 17, No. 1


Marsha Woodbury
Y2K: The Broad View

CPSR-Y2K Working Group Web Pages

Arthur C. Clarke
The Century Syndrome, from The Ghost from the Grand Banks

Anthony Ralston
Y2K and Social Responsibility

Peter Neumann
A Perspective on Y2K

Gary Chapman
Now For Another Daunting Y2K Task: Educating America's Masses

Lenny Siegel
OOPs 2000: The Y2K Bug and the Threat of Catastrophic Chemical Releases

Norman Kurland
How Y2K Will Impact the New York Times

Y2K and Nuclear Weapons

  • Letters Seeking Help on Nuclear Weapons Issues from
    Michael Kraig
    Alan Phillips

  • Four Prominent Scientists on Nuclear Weapons Concerns:
    Khursh Ahmed
    David Parnas
    Barbara Simons
    Terry Winograd

  • Gary Chapman
    A Moral Project for the 21st Century: Stop Creating Better Weapons


    Y2K Humor from the Internet and Beyond

    Cartoon (may crash older browsers)

    CPSR News:

    Aki Namioka
    A Letter from CPSR's President

    Netiva Caftori
    Chapter News


  • Y2K and Nuclear Weapons
    Letters from Four Prominent Scientists


    Khursh Ahmed, McMaster University

    The year-2000 problem has major implications for computer-controlled systems: many of them were programmed with a two-digit year and, if not corrected, will behave unpredictably on January 1, 2000. The problem affects information systems in hospitals, banking, and the airline industry alike, and the operation of common devices such as elevators, telephones, and process-control machines may also be disrupted. While the computer industry is frantically trying to test for and fix these problems, many specialists feel that there is not enough time to test and fix everything, given the complexity and interdependence of computer systems. Some international airlines have decided not to fly on January 1, 2000, simply to avoid risking the safety of their passengers.
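
    The mechanism Ahmed describes can be shown in a minimal sketch (the function here is invented for illustration, not taken from any real system): date arithmetic that looks correct through 1999 produces nonsense once the stored year rolls over from 99 to 00.

```python
def age_two_digit(birth_yy, current_yy):
    """Compute an age from two-digit years -- the pre-Y2K storage idiom."""
    return current_yy - birth_yy

# In 1999 the shortcut looks harmless: a person born in 1934 is 65.
assert age_two_digit(34, 99) == 65

# On January 1, 2000 the current year is stored as 00,
# and the same arithmetic yields a negative age.
assert age_two_digit(34, 0) == -34
```

    Any downstream logic that branches on such a result (billing, eligibility, scheduling) inherits the error, which is why the bug's reach is so hard to bound.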

    It is highly likely that the computer systems controlling nuclear weapons, many of them programmed in the 1970s and 1980s at the height of the Cold War, also have the Y2K bug. One hopes that military programmers have recognized the possibilities and tested all their programs. The question remains, however, of what risks are involved if a program fails to operate correctly. We know that no major program is 100 percent bug-free.

    In my opinion, the risk imposed by even a very small probability of nuclear accident is too grave for humanity. The nations with nuclear weapons should strongly consider deactivating these devices to prevent a major disaster from happening. Scientists, engineers, and all responsible citizens of the world owe it to themselves to put pressure on the governments of the world and the United Nations to pass a resolution to deactivate nuclear weapons in the face of the Y2K problem.

    Khursh Ahmed, Manager
    Computer Services Unit
    Faculty of Health Sciences
    McMaster University
    Hamilton, Ontario L8N 3Z5


    David Parnas, McMaster University

    The newspapers are filled with simple explanations and illustrations of the set of computer program bugs known as the Y2K or year-2000 problem. These examples are kept simple so that nonprogrammers can understand them. Unfortunately, such examples make the problem sound as if it would be easy to find and fix. That is not necessarily the case. Many of the problems are much more subtle than the examples usually published.

    Computer programs are very complex constructions, and they are full of bugs because they are difficult to understand. When a problem is discovered, it often takes weeks to find it and additional weeks to fix it. Very often, the "fixed" program is still not right and requires further repair after the revised program is put into service. The Y2K problem is not easier to fix than other bugs. Au contraire!

    It is not always easy to determine whether or not a program is sensitive to dates. Sometimes programs that are not sensitive to dates exchange data with programs that are date-sensitive and therefore will fail when those partner programs fail. Many of these programs are poorly documented, their authors are no longer around, and there is nobody who understands them.
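
    Parnas's point about indirect date sensitivity can be illustrated with a hedged sketch (the record format is invented): a routine that merely sorts text, and never computes with dates at all, still misorders its output once a partner program supplies two-digit years.

```python
# Invoice IDs prefixed with a two-digit year, as a partner program might emit.
records = ["98-INV-1042", "99-INV-2210", "00-INV-0007"]

# The sort itself is date-blind, yet "00" (meaning 2000) now sorts
# before "98" and "99", so the newest record appears to be the oldest.
assert sorted(records) == ["00-INV-0007", "98-INV-1042", "99-INV-2210"]
```

    Nothing in the sorting code mentions a date, which is exactly why such programs are easy to miss in a Y2K audit.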

    Responsible organizations have been investing a great deal of time, effort, and money into reviewing their software to (a) find those programs that would fail around the first day of 2000, (b) find the parts of the program that must be revised, and (c) make and test the necessary revisions. However, because of the poor quality of most software and documentation, because it will be difficult to test complete systems under realistic conditions, and because much of the software is old and must be repaired by people who do not understand it, I expect that some errors will remain. It is common to find that when a software product is repaired, the "patch" introduces new problems. I see no reason why Y2K problems will be easier to repair than other bugs.

    The U.S. military establishment is heavily dependent on computers for communications, intelligence, and control of weapons. Failure of its systems could endanger all of us. The U.S. Department of Defense and other military organizations owe the public the assurance that they are doing what is needed to examine, repair, and test all their systems. Moreover, they must be realistic enough to create safeguards against the effects of residual errors in weapon systems. In some cases, the only way we can be confident that there will be no serious problems is to disconnect the systems until we can observe their behavior in 2000.

    Professor David Lorge Parnas, Principal Engineer
    NSERC/Bell Industrial Research Chair in Software Engineering
    Director of the Software Engineering Programme
    Department of Computing and Software
    Faculty of Engineering
    McMaster University,
    Hamilton, Ontario L8S 4K1


    Barbara Simons, ACM U.S. Public Policy Office

    10 November 1998

    While no one knows what problems will develop because of the Y2K problem, we can state with confidence that there will be problems. This is because our society is dependent on large complex computer programs. Even fixing mistakes in recently written programs that have good documentation can be very difficult. The problem expands enormously when the programs are old and poorly documented.

    It may be possible to state with certainty that a particular piece of software will have no problems on January 1, 2000. But it is mathematically impossible to guarantee that an arbitrary computer program will not make any mistakes.

    I don't worry about whether my VCR will become confused on January 1, 2000; the worst-case scenario is not especially bad. I do, however, worry about whether a computer that controls a major weapons system will become confused. Although the probability of a computer-generated accident might be very small, no one really knows just what that probability is. The worst-case scenario of accidental nuclear war is simply unacceptable.

    Moreover, the problem is not limited to the United States. We must be concerned not only about the weapons we control, but even more about those that are aimed at us. The economic situation in Russia is not reassuring: Russia has little money to pay people to guarantee that its weapons systems are free of Y2K problems.

    We don't need to gamble that none of the critical weapons systems will malfunction. Rather, we should actively push for an alternative approach. As a last resort, nuclear powers such as the United States and Russia can remove the warheads from their missiles during the critical time period. Then, even if some computers in the United States or in Russia malfunction, we will not face massive death and destruction. Other less drastic actions might be possible, but it is in no one's interest to risk a Y2K nuclear doomsday.

    Barbara Simons, Ph.D.
    ACM U.S. Public Policy Office
    666 Pennsylvania Avenue SE
    Suite 302b
    Washington, D.C. 20003


    Terry Winograd, Stanford University

    All the publicity surrounding the year-2000 (Y2K) problem should serve as a reminder of the dangers of assuming bug-free computer functioning in life-critical systems. Of course, all computer systems have some bugs, whether Y2K problems or others. We are accustomed to experiencing errors, then finding and fixing the bugs. But we can't afford to do that when the cost of a single error can be catastrophic.

    Nowhere is this danger greater than in the nuclear retaliatory launch systems of the United States, Russia, and an increasing number of other countries, where large, complex, computer-based systems stand one error away from massive consequences. When launch-on-warning systems are automated, a computer can unleash nuclear destruction so quickly that no human judgment can be applied, and the consequences are immeasurable.

    We know that computer systems can fail, and we need guarantees that those failures will not be catastrophic. All governments with nuclear-capable forces need to ensure that there is a sufficient set of checks and safety locks (and sufficient time for judgment) between a computer-based warning of a nuclear confrontation and any action that escalates the situation. During the Cold War, many of us feared that human civilization would not reach the next millennium. Even in this period of reduced tension, we cannot allow a technical error to push us into destruction. Let us enter the millennium in a safer world, in which governments take action on the basis of wisdom about the limitations of computer systems.

    Terry Winograd
    Professor of Computer Science
    Stanford University
    Stanford, California 94305

    CPSR Home Page
    Last modified: Sunday, 14 March 1999.

    Archived CPSR Information
    Created before October 2004
