The CPSR Newsletter, Winter 1999
Y2K and Nuclear Weapons
Letters from Four Prominent Scientists
Khursh Ahmed, McMaster University

The year-2000 problem has major implications for many computer-controlled systems. Because many of these systems were programmed with a two-digit year, they may behave unpredictably on January 1, 2000, if not corrected. The problem affects information systems in areas such as hospitals, banking, and the airline industry. Operation of common devices such as elevators, telephones, and process-control machines may also be affected. While the computer industry is frantically trying to test for and fix these problems, many specialists feel there is not enough time to find and fix every problem, given the complexity and interdependence of computer systems. Some international airlines have decided not to fly on January 1, 2000, simply to avoid risking the safety of passengers.

It is highly likely that the computer systems controlling nuclear weapons, many of them programmed in the 1970s and 1980s at the height of the Cold War, also have the Y2K bug. One hopes that military programmers have recognized the possibilities and tested all their programs. But the question remains of what risks are involved if a program fails to operate correctly. We know that no major program is 100 percent bug-free. In my opinion, the risk posed by even a very small probability of nuclear accident is too grave for humanity. The nations with nuclear weapons should strongly consider deactivating these devices to prevent a major disaster. Scientists, engineers, and all responsible citizens of the world owe it to themselves to press the world's governments and the United Nations to pass a resolution deactivating nuclear weapons in the face of the Y2K problem.

Khursh Ahmed, Manager
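The two-digit-year failure mode described above can be illustrated with a minimal sketch (hypothetical Python written for illustration, not drawn from any real banking or weapons code): interval arithmetic on a two-digit year works for decades, then silently produces nonsense at the century rollover.

```python
def years_elapsed(start_yy: int, now_yy: int) -> int:
    """Naive interval arithmetic, as written in many legacy systems
    that stored only the last two digits of the year."""
    return now_yy - start_yy

# A record opened in 1997 ("97"), checked in 1999 ("99"): correct.
print(years_elapsed(97, 99))  # 2

# The same check on January 1, 2000 ("00"): a nonsense interval.
print(years_elapsed(97, 0))   # -97
```

Nothing in the function is visibly "wrong" in isolation, which is why such code survived review for decades; the bug lives in the encoding of the data, not in the arithmetic.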
David Parnas, McMaster University

The newspapers are filled with simple explanations and illustrations of the set of computer program bugs known as the Y2K or year-2000 problem. These examples are kept simple so that nonprogrammers can understand them. Unfortunately, such examples make the problem sound as if it would be easy to find and fix. That is not necessarily the case. Many of the problems are much more subtle than the examples usually published.
Computer programs are very complex constructions, and they are full of bugs because they are difficult to understand. When a problem is discovered, it often takes weeks to find its cause and additional weeks to fix it. Very often the "fixed" program is still not right and requires further repair after the revised program is put into service.

The Y2K problem is not easier to fix than other bugs. Au contraire! It is not always easy to determine whether or not a program is sensitive to dates. Sometimes programs that are not themselves date-sensitive exchange data with programs that are, and will therefore fail when those partner programs fail. Many of these programs are poorly documented, their authors are no longer around, and there is nobody who understands them.

Responsible organizations have been investing a great deal of time, effort, and money in reviewing their software to (a) find those programs that would fail around the first day of 2000, (b) find the parts of each program that must be revised, and (c) make and test the necessary revisions. However, because of the poor quality of most software and documentation, because it is difficult to test complete systems under realistic conditions, and because much of the software is old and must be repaired by people who do not understand it, I expect that some errors will remain. It is common to find that when a software product is repaired, the "patch" introduces new problems. I see no reason why Y2K problems will be easier to repair than other bugs.

The U.S. military establishment is heavily dependent on computers for communications, intelligence, and control of weapons. Failure of its systems could endanger all of us. The U.S. Department of Defense and other military organizations owe the public the assurance that they are doing what is needed to examine, repair, and test all their systems. Moreover, they must be realistic enough to create safeguards against the effects of residual errors in weapon systems.
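The point that a program with no date logic of its own can still fail can be sketched as follows (a hypothetical Python illustration, not from any system discussed in the letter): a report program that merely sorts records it receives from a partner system breaks once that partner's two-digit "YYMMDD" timestamps roll over.

```python
# Records received from a partner system that stamps them "YYMMDD".
records = [
    ("991230", "backup completed"),
    ("991231", "backup completed"),
    ("000101", "backup completed"),  # January 1, 2000
]

# This program contains no date arithmetic at all; it only sorts.
# Lexicographic order matched chronological order for decades --
# until "00" sorts before "99" and the newest record becomes "oldest".
chronological = sorted(records)
print([stamp for stamp, _ in chronological])
# ['000101', '991230', '991231']
```

Auditing this program's own source for date handling would find nothing to fix; the defect only appears when the partner's data format is considered, which is exactly what makes such bugs hard to locate.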
In some cases, the only way we can be confident that there will be no serious problems is to disconnect the systems until we can observe their behavior in 2000.

Professor David Lorge Parnas, Principal Engineer
Barbara Simons, ACM U.S. Public Policy Office
10 November 1998

While no one knows exactly what problems will develop because of the Y2K bug, we can state with confidence that there will be problems, because our society depends on large, complex computer programs. Even fixing mistakes in recently written programs that have good documentation can be very difficult. The problem expands enormously when the programs are old and poorly documented. It may be possible to state with certainty that a particular piece of software will have no problems on January 1, 2000, but it is mathematically impossible to guarantee that an arbitrary computer program will not make any mistakes.

I don't worry about whether my VCR will become confused on January 1, 2000; the worst-case scenario is not especially bad. I do worry about whether a computer that controls a major weapons system will become confused. Although the probability of a computer-generated accident might be very small, no one really knows what that probability is, and the worst-case scenario of accidental nuclear war is simply unacceptable.

Moreover, the problem is not limited to the United States. Not only must we be concerned about weapons we control; we must be even more concerned about those weapons that are aimed at us. The economic situation in Russia is not reassuring: Russia does not have much money to pay people to guarantee that its weapons systems are free of Y2K problems.

We don't need to gamble that none of the critical weapons systems will malfunction. Rather, we should actively push for an alternative approach. As a last resort, nuclear powers such as the United States and Russia can remove the warheads from their missiles during the critical time period. Then, even if some computers in the United States or in Russia malfunction, we will not face massive death and destruction. Other, less drastic actions might be possible, but it is in no one's interest to risk a Y2K nuclear doomsday.
Barbara Simons, Ph.D.
Terry Winograd, Stanford University

All the publicity surrounding the year-2000 (Y2K) problem should serve as a reminder of the dangers of assuming bug-free computer functioning in life-critical systems. Of course, all computer systems have some bugs, whether Y2K problems or others. We are accustomed to experiencing errors, then finding and fixing the bugs. But we cannot afford that approach when the cost of a single error can be catastrophic.

Nowhere are the potential consequences of error in large, complex, computer-based systems more massive than in the nuclear retaliatory launch systems of the United States, Russia, and an increasing number of other countries. When launch-on-warning systems are automated, a computer can unleash nuclear destruction so quickly that no human judgment can be applied, and the consequences can be immeasurable. We know that computer systems can fail, and we need guarantees that those failures will not be catastrophic. All governments with nuclear-capable forces need to ensure that there is a sufficient set of checks and safety locks (and sufficient time for judgment) between a computer-based warning of a nuclear confrontation and any action that escalates the situation.

During the Cold War, many of us feared that human civilization would not reach the next millennium. Even in this period of reduced tension, we cannot allow a technical error to push us into destruction. Let us enter the millennium in a safer world, one in which governments take action on the basis of wisdom about the limitations of computer systems.

Terry Winograd
You can send comments or questions to newsletter@cpsr.org.