CPSR/Boston Hosts 1987 CPSR Annual Meeting and Banquet
The 1987 Annual Meeting and Banquet of CPSR were held on the MIT campus on
October 17 and 18. The two-day meeting presented speakers on issues relevant to
CPSR's program, a panel discussion on computers and ethics, and reports and open
discussion on the character and future of the organization. The 1987 Annual
Banquet was held on Saturday evening, October 17, and featured as the keynote
speaker Lester Thurow, Dean of the Sloan School of Management at MIT. The
Banquet program also honored the work of former CPSR Chairman Severo M. Ornstein
and former Board Secretary Laura Gould, who were two of the founding members of
the organization. And the CPSR Board of Directors presented the first Norbert
Wiener Award for Professional and Social Responsibility to Professor David L.
Parnas for his public opposition to the SDI program.
About 125 people attended both the Saturday meeting and the banquet, with
representatives from nearly all CPSR chapters across the country. On Sunday
about 60 people heard reports from the CPSR national staff and CPSR Chairman
Steve Zilles, and then participated in small discussions about chapter
development, work on reliability and risk issues, computers and education,
computers in the workplace, and privacy and civil liberties. Sunday morning the
CPSR slide show, Reliability and Risk: Computers and Nuclear War, was shown in
its six-projector version. Throughout the weekend a table of literature and
membership information was available, which included copies of the just released
CPSR book, Computers in Battle: Will They Work?
The CPSR Banquet was reported in the Monday, October 19, issue of the Boston
Globe, and the article was accompanied by a photograph of Professor Thurow, CPSR
Chairman Steve Zilles, and CPSR Executive Director Gary Chapman.
Privacy and Civil Liberties
The Saturday morning program featured three speakers each addressing a major
part of the CPSR program. They were Jerry J. Berman, chief legislative counsel
for the American Civil Liberties Union and director of the ACLU Privacy and
Technology Project; Dr. Clifford Johnson, plaintiff in the lawsuit Johnson v.
Weinberger; and Professor Nancy Leveson, Associate Professor of Computer Science
at the University of California at Irvine, and a specialist in software safety.
Jerry Berman addressed the topic of computers and privacy, a field he has been
involved in for many years. He started by noting that he had spent the previous
month in his capacity as chief legislative counsel of the ACLU opposing the
nomination to the Supreme Court of Judge Robert Bork. Berman said that the Bork
nomination gave him an opportunity to talk about technology and privacy with a
considerably more current frame of reference than the usual appeal to Orwell. He
said that the Bork nomination hearings were a unique opportunity for the public
to learn that the law on "privacy rights, First Amendment freedoms, the public's
right to know, access to information, freedom in scientific inquiry . . . is
largely in a shambles." Bork, said Berman, was the first nominee to the Supreme
Court who explicitly believes that there is no constitutional right to privacy
at all. Berman noted, however, that of the 54 senators who declared against
Bork, 41 of them cited as one of their objections Bork's position on privacy.
The question now becomes how far the defeat of Bork actually went toward
settling the issue of privacy. Berman said that even the "mainstream" of political and judicial
opinion has sacrificed privacy for the sake of governmental and corporate
control over information.
In the absence of a constitutional right of privacy, the Congress has passed
statutory laws protecting certain kinds of information and communication. The
major law in this field has been the Privacy Act of 1974. Congress has also
worked to prevent certain proposals, such as a centralized, national depository
of information. However, the "computer revolution" of the last fifteen years has
made much of the work of the congressional consensus on privacy irrelevant. For
example, Berman said, there is a loophole in the Privacy Act for sharing of data
between governmental agencies called the "routine use" provision; now this
loophole has resulted in 110 separate computer matching programs in which data
is "routinely" passed between agencies, subverting the original intent of the
law. The congressional intent of preventing a centralized depository of
information has been subverted by the progress of technology in providing for a
decentralized network of the same kind of information.
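The kind of inter-agency "matching" Berman described can be illustrated with a short sketch. The agencies, field names, and records below are invented for illustration; the point is only that two separately held files sharing a common identifier can be joined as easily as a single centralized depository could be queried.

```python
# Hypothetical example: two agency files, held separately, joined on a
# shared identifier. All data here is fabricated for illustration.

irs_records = [
    {"ssn": "123-45-6789", "income": 52000},
    {"ssn": "987-65-4321", "income": 18000},
]
benefits_records = [
    {"ssn": "987-65-4321", "benefit": "housing assistance"},
]

# The "match": index one file by identifier, then scan the other.
by_ssn = {rec["ssn"]: rec for rec in irs_records}
matches = [
    {**rec, **by_ssn[rec["ssn"]]}
    for rec in benefits_records
    if rec["ssn"] in by_ssn
]
print(matches)
# [{'ssn': '987-65-4321', 'benefit': 'housing assistance', 'income': 18000}]
```

A few lines of code suffice; the privacy protection once afforded by decentralization rests entirely on policy, not on any technical barrier.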
Berman went on to discuss the FBI's proposed upgrade to the National Crime
Information Center (NCIC), which has become a major focus of the CPSR Computing
and Civil Liberties Project (see accompanying article on page 1). The expert
panel convened by Congressman Don Edwards to examine the proposals of the FBI
and its contractor, the MITRE Corporation, includes both CPSR and the ACLU
Privacy and Technology Project. Berman also noted the participation of CPSR at
the founding meeting of another coalition of privacy advocates, whose activity
resulted in the Electronic Communication Privacy Act of 1986, a law which
protects all computer telecommunication as if it were private mail. Berman said
that there is a dramatic improvement in the effectiveness of such coalitions
when technical professionals such as those from CPSR are involved in the privacy
debate. He said that one of the most important contributions of CPSR members has
been to point out that proposed systems such as NCIC 2000 not only misapply
technology in ways that compromise privacy, but also fail to use technology in
ways that could protect it.
Another issue that is of paramount importance in the current climate of national
security is the restriction, and proposed restrictions, of access to
computerized databanks of technical and scientific information. Pentagon and
intelligence agency officials contend that hostile governments can access
publicly available information and through the use of computer technology,
assemble large and disparate sources of data to approximate classified
documents. The Reagan administration has declared a war on "high technology
leaks" to the Soviet Union and Eastern Europe, and has proposed a series of
measures to restrict access to information on high technology, and to monitor
the use of databases. Berman said that it is important to hold fast to
principles of scientific freedom and open access to information despite new
capabilities provided by computer technology.
Berman said that people concerned about information collection and dissemination
should think not only about restricting technology in order to protect privacy,
but also about how to use technology to empower citizens in the same way that
government agencies and corporate collectors of information are empowered by
computers. Berman pointed out that the government has access to vast collections
of information in electronic form, while citizens usually have access to the
same information, such as through the Freedom of Information Act, only in
hardcopy form. This creates an inequity in terms of the efficient use of
information.
"Launch on Warning" and the Constitution
The next speaker, Dr. Clifford Johnson, is the plaintiff in the lawsuit Johnson
v. Weinberger, which seeks a declaration from the Federal courts that "launch on
warning" policy, or automated release of strategic nuclear missiles, is
unconstitutional in that it usurps the power of the Congress and the President
in the declaration of war. Johnson also contends that the current "launch on
warning" policy is error-prone, and thus constitutes a grave risk to public
safety without due process. (For a more detailed description of this lawsuit see
the Spring 1987 issue of the CPSR Newsletter.)
Johnson described the details of his case and the central issues of contention
over "launch on warning" policy. He said that the definitions used in the case
may decide the issue. "Launch on warning" is defined as the release of missiles
upon sensor detection of an attack. But what "sensor detection of an attack"
might mean is often disputed. It might mean detection of an actual nuclear
burst, or it might mean only the detection of electronic signals of missile
launch from the Soviet Union. Johnson himself defines "launch on warning
capability" as a "set of procedures" that makes "launch on warning" possible.
This may be attached to any sensor detection.
Johnson said that the United States operates with a "launch on warning" policy
because of the concern of some military authorities, particularly in the Air
Force, that the Soviet Union has the capability to knock out American land-based
missiles in a "disarming" first strike. This results in the compulsion to "use
'em or lose 'em," in military parlance. The Air Force claims a "launch on
warning capability" is essential because to risk the loss of land-based missiles
undermines deterrence. Johnson characterized this as a "phobia" rather than a
rational fear of Soviet first strike capability.
Johnson also pointed out that there are really only three options available to
strategic planners contemplating action in response to a nuclear attack:
preemption, "launch on warning," and "riding out" an attack. The United States
does not have a public policy of preemptively striking the Soviet Union, and
"riding out" an attack is considered risky because of the devastation not only
to missiles in silos but to command and control apparatus. Thus, said Johnson
"launch on warning" becomes a de facto or "default" policy, even if it is not
explicit in war planning.
Johnson contends that "launch on warning" violates a number of important legal
principles. It violates the United Nations' Charter in that it jeopardizes the
peace with potentially inadvertent nuclear war. It violates common law under
which lethal "booby traps" are illegal because the taking of a human being's
life requires the judgment of another human being. "Launch on warning" subverts
the intention of a number of provisions regarding legally authorized
subdelegation of powers. The President cannot delegate the power of command of
the armed forces to a computer, and the Congress cannot delegate the power to
declare war to the President. The Atomic Energy Act also precludes the President
from delegating the launch authority for nuclear missiles to the military.
Johnson pointed out that during the process of passing the War Powers Act,
Senator Stennis offered an amendment to the bill that would have explicitly
allowed the President to operate a "launch on warning" policy, and the amendment
was rejected.
Johnson described the history of his case, filed for the first time three and a
half years ago. The first court to hear the case, the Federal District Court in
San Francisco, threw out the case because the judge considered it a "political
question," outside the domain of the judiciary. Johnson appealed this decision
to the Ninth Circuit Court of Appeals, also in San Francisco. The three judge
appellate panel also threw the case out, but on different grounds. The Appellate
Court ruled that Johnson had failed to contend that the United States government
operates a "launch on warning" policy at present, and the Court is prohibited
from enjoining a defendant from behavior the defendant has not yet engaged in.
Johnson took this as a signal that the Court of Appeals might reconsider the
case on its merits if this contention were included in a new suit, so the case
was revised and refiled. Once again a District Court judge ruled that the case
is a "political question," and the suit was again dismissed. Johnson has again
appealed the case to the same appellate division, and there seems to be
reasonable ground for reversal of the lower court opinion because of the ruling
of the first appellate panel. Attorneys from the Lawyers Alliance for Nuclear
Arms Control are now supervising the appeal. CPSR has endorsed the lawsuit from
the beginning, and has filed supportive memoranda as part of Johnson's filings.
The final speaker for the morning program was Professor Nancy Leveson of the
University of California at Irvine, where she specializes in software safety. It
was noted in Professor Leveson's introduction that CPSR has recently expanded
its area of concern about computer reliability from issues exclusive to military
applications to computer use in general. Leveson started out her talk by
explaining how she got involved in the issue of "software safety." She said her
training was in mainstream software engineering, and some years ago she would
have contended that there was no such discipline as "software safety." It was
only after exposure to the real-world problems of engineers that she began to
see the importance of studying "real-time safety-critical systems," which is now
her specialty.
Leveson defined "safety-critical" systems as those in which computers are
controlling real-time mechanical or electronic systems, the malfunction of which
"can result in death, injury, loss of property, or environmental harm." She
noted that there is a "growing trend in which manual intervention" in a system's
performance is no longer a feasible back-up measure for safety assurance. The
Space Shuttle, for example, is entirely dependent on its computers, even to
abort a mission, and new aircraft are often incapable of flying without the fine
control of onboard computer systems. This trend also includes systems in which
the computer does not provide direct control but feeds data to other
parts of the system, where the speed or volume of data transmission rules out
effective human intervention.
After reviewing some illustrative examples, Leveson said that the integration of
computers into total systems with some performance dependent on the computer
makes programmers responsible for understanding the purpose and the details of
the entire system, and makes the system's engineers responsible for
understanding the role of the computer. But this relationship is often
characterized by poor communication or oversight, with the result that no one
really understands all the potential states of the entire system. Some computer
scientists contend that computer professionals cannot be held responsible for
poor or incomplete specifications developed by system engineers. Leveson said
this may be defensible, but such an attitude offers no solution to a growing
social problem. She said that the evidence suggests that "inadequate design
foresight and specification errors are the greatest cause of software safety
problems."
Leveson contends that the safety requirements of current critical systems are
orders of magnitude away from what programming and system design can provide.
For example, the current air traffic control system demands an annual "down-
time" figure of only three seconds; a nuclear power plant in Australia has a
mean-time-between-failure (MTBF) specification of 40 years; and Chernobyl, the
Soviet nuclear power plant that experienced a core meltdown, had an MTBF of
10,000 years, which, as Leveson joked, "means we should be safe for a while."
The Federal Aviation Administration uses a failure-rate figure of 10⁻⁹, and
it is not even clear where this number came from. It is not understood
today how software engineers can measure software safety in such terms, and
often, because risk cannot be measured in such terms, its existence is not
acknowledged.
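For a feel for how demanding these figures are, here is a small illustrative calculation. The arithmetic and the constant-failure-rate (exponential) model are assumptions of this sketch, not claims from the talk.

```python
# Illustrative arithmetic only: converting the reliability figures cited
# above into comparable terms, assuming a constant failure rate
# (exponential model) -- an assumption of this sketch, not of the talk.
import math

HOURS_PER_YEAR = 365.25 * 24  # about 8766 hours

# Air traffic control: at most 3 seconds of downtime per year.
availability = 1 - 3 / (HOURS_PER_YEAR * 3600)
print(f"required ATC availability: {availability:.9f}")

# A 40-year mean time between failures, expressed as a per-hour rate.
mtbf_hours = 40 * HOURS_PER_YEAR
rate_per_hour = 1 / mtbf_hours
print(f"40-year MTBF = {rate_per_hour:.2e} failures/hour")

# Under the exponential model, the chance of at least one failure
# somewhere in a 40-year service life is still substantial:
p_fail = 1 - math.exp(-rate_per_hour * mtbf_hours)
print(f"P(at least one failure in 40 years) = {p_fail:.3f}")  # 1 - 1/e, ~0.632
```

The last figure is why a 40-year MTBF is weaker assurance than it sounds: over a lifetime equal to its own MTBF, a system is more likely than not to fail at least once.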
Leveson said that there is no such thing as a "truly safe" system; "everything
has some indispensable amount of risk in it." What we are after is "acceptable
risk," not "zero risk." She argued that we can best achieve "acceptable risk" by
separating out safety considerations and treating them as an independent goal
instead of as a subset of other goals in a system. This allows engineers to
confront conflicts between goals and to approach rational choices instead of
building an appliqué of safety checks for existing systems. Leveson took note of
the fact that systems engineering does this now, but for the most part computer
scientists do not. There is an independent specialty known as "safety
engineering," which has tended to take the responsibility for safety without
adequate knowledge of what to do about it, particularly with respect to the role
of software.
There may be nothing we can do to eliminate computer error or failure, Leveson
said, but we may be able to minimize the damage caused by such events; in other
words, she said, "to prepare for failure." This seems like a much more practical
and promising approach than the various ways that have been proposed to
eliminate failure or error.
Computer scientists will have to come to grips with safety issues, said Leveson,
if only because government and industry will be enforcing more and more
standards involving software safety. CPSR members Peter Neumann and David
Parnas, Leveson noted, have been leaders in addressing this issue within the
profession of computer science. She said that CPSR should take up the cause of
software safety in order to live up to its claim of "social responsibility," and
she outlined a number of things that the organization should do to promote
education about computer safety.
Panel Discussion on Computers and Ethics
After a lunch break, Annual Meeting participants returned to hear a panel
discussion entitled "Computer Ethics: Personal Responsibilities of Computer
Professionals." The panel members included Professor Joseph Weizenbaum of MIT,
Professor David L. Parnas of Queen's University in Canada, Professor John
Fielder of Villanova University, Professor Deborah Johnson of Rensselaer
Polytechnic Institute, and Harris Sussman of Digital Equipment Corporation. The
panel was moderated by Ronni Rosenberg, who is a Ph.D. candidate in the Program
in Science, Technology and Society at MIT, and a member of the Boston chapter of
CPSR.
The first panelist to speak was John Fielder, an associate professor of
philosophy. Fielder started by noting that professionals of all types bear a
special responsibility to society because of their expertise. Computer
professionals have a special responsibility because of the importance of
computers in the strategic nuclear arsenal. Fielder said that because of this
responsibility, computer professionals have as one of their first ethical
options the capability of not participating in projects they decide violate
the standards of the profession. However, there is a particular
difficulty in the ethical options of computer professionals because, unlike
doctors or some other professionals, computer professionals work in hierarchical
organizations, and their careers can be jeopardized by evaluations of superiors.
Thus the question becomes how much we can expect computer professionals to do
while risking their careers.
A computer professional's ethical decisionmaking may be complicated further by
the fact that the ethical character of a particular project is often
controversial, and the professional's non-participation may do very little to
stop the project. Many other professionals may decide to work on the project
without ethical qualms, and the project may go ahead in spite of the
objections.
The professional may decide to take the case to the public, through
"whistle-blowing" or political activism. Fielder said that organizations like
CPSR are
important for giving computer professionals an organized voice that intends to
affect the entire profession.
The next speaker was Deborah Johnson, also an associate professor of philosophy,
and author of the book Computer Ethics. Johnson noted that traditional
moral theory divides ethical behavior into two categories: behavior that is
required, and behavior that is "heroic." The latter includes behavior that is
admired in an individual, but for which the individual is not blamed if it is
absent. Johnson said many situations involve ethical
decisions for which most of us have a predisposition, all things being equal.
The question again is how much of a sacrifice we can expect from professionals.
Johnson also said that an important thing to remember is that we should not
limit our discussion to the individual decision maker, but we should think about
how to change the environment in which these individuals and their colleagues
work. It may be more fruitful, she said, to think about how corporations,
professional societies, government, and legal institutions can provide a more
supportive environment for social responsibility by all professionals. An
example is a corporate policy that would let professionals move within the
company when confronted by a project that gave them ethical difficulties.
Johnson offered four claims concerning computer professionals and social
responsibility. First, she said, computer professionals have an obligation to
protect those who are affected by their work. Second, this obligation is not
"special," but is instead a specific instance of a general moral maxim, which is
"all things being equal, one should avoid doing harm." Third, this particular
obligation of computer professionals arises out of the expertise that computer
professionals have. Finally, the strength of the obligation is dependent on how
much knowledge the professional has, and the power that he or she has over the
outcome.
David Parnas spoke next. He said he decided to approach the issue of
professional ethics by thinking about what he tells his graduate students about
good design in software engineering. Out of this he developed some principles
that might be applied to most cases.
First, Parnas said, professionals have to ask if what they are doing truly
contributes to society, and not just to the people who are writing their checks.
A professional might ask, "Will someone else find my product useful? If they do
use it, will it help them or hurt them?" Second is a standard of honesty.
Professionals should seek a correspondence between what they know they can do
and what is required by the customer. Parnas said that it was the application of
this principle that led him to resign from the SDIO panel of computer experts.
Another important principle is the acceptance of responsibility, not "passing the
buck." The very minimal standard of professionalism, said Parnas, is being the
decision maker on the quality of one's own work. It is also important to make
sure that professionals work on "real" problems, those relevant to the
requirements of the customer, instead of "toy" problems that try to get around
the difficulties of the real problem but which don't contribute to their
solution. In the case of the SDI, said Parnas, there are a lot of people who
will take money for research on various sub-problems of "making nuclear weapons
impotent and obsolete," when they don't believe that what they're doing will
solve that ultimate problem.
Parnas said that professionals have to be careful about participating in
projects with goals the professionals want to avoid. It can be self-serving,
said Parnas, to work on a project with the rationalization that it's more
effective to influence its course "from the inside." This may be offset by the
appearance of support for the goals of the project that a professional's
participation conveys.
But, Parnas said, one of the most troublesome problems in engineering is that
"anything can be misused." Automobiles, for example, can be used to commit
murder. Does this mean we shouldn't work on automobiles? Parnas says he can't go
so far as to consider auto engineers immoral because a car was once used to run
over someone. The professional has to ask what the most likely use of the
product will be. The professional has to ask the basic question: "Will the
development of this product have, over the long run, a beneficial or negative
contribution to society?"
Parnas recommended that professionals draw up a list of their own principles,
and then not enter into any agreement with an employer that violates those
principles. Parnas said he "wished" he had done this early in his career; he said
there are confidentiality agreements he wishes now he hadn't signed, or places
where he worked that he now wishes he hadn't worked.
"The New Priesthood"
Next to speak was Harris Sussman, head of the strategic management section of
the corporate personnel department at Digital Equipment Corporation. Sussman
called computer professionals the "new priesthood" of society, noting that in
the Middle Ages it was often necessary to employ a priest to write or read a
letter. This analogy with computer professionals leads to a consideration of the
responsibilities of this "new priesthood."
Sussman said that one of the difficulties with evaluating situations is that
contemporary work is usually so interdependent among workers that often an
ethical dilemma escapes the notice of even ethically sensitive professionals.
Even if professionals draw up a list of things they will not do, they can find
themselves on the other side of that line and viewing an ethical problem only
with hindsight. This is particularly true in the information industry because of
the often intangible material that "knowledge workers" deal with. It becomes
difficult to know when one is even producing a "product." Sussman called
attention to Buckminster Fuller's contention that we are witnessing "an
ephemeralization of work."
At the same time, said Sussman, we can see the irony in the fact that while
"knowledge work" becomes increasingly ephemeral in character, it is done on
machines which are the most unforgiving of ambiguity. This creates a friction
which makes clear ethical decisions very difficult to discern, let alone make.
Last to speak was Joseph Weizenbaum of MIT. Weizenbaum said he wanted to offer a
"fairy tale," an analogy between computer professionals and physicians.
Weizenbaum offered the example of a person going to a physician and asking to
have the smallest finger on his left hand amputated. The physician would quite
reasonably ask the patient why he wanted this done. The patient would only
respond by asking how long it would take, how much would it cost, etc. If the
physician was not given a reasonable explanation for conducting this surgery, he
or she would tell the patient that there was no justification for performing
such surgery, and perhaps the physician would recommend that the patient be seen
at a psychiatric clinic.
Engineers don't do that, said Weizenbaum. Engineers asked to build a bridge
don't usually ask the client, "Why do you want to go from this side to the
other?" The same applies to computer professionals. But Weizenbaum argued that
professionals should take every measure available to find out what the end use
of their product will be, and to act accordingly. Professionals have a greater
obligation to consider the ethics of their work because of their greater
capacity to transform the process as a whole than, say, an assembly line
worker who appears only at the final stage of producing a commodity. In Weizenbaum's
view, it is probably more "heroic" for an assembly line worker to avoid working
on unethical projects than it is for a professional to refuse to participate,
since the evaluation of ethical issues is presumed to be part of what it means
to be a professional.
In a final point, Weizenbaum said that in the consideration of ethical issues,
we cannot avoid "the insanity of our times." This is the case not only for
issues relating to nuclear weapons, but to the booming business of conventional
weapons as well. Weizenbaum said that professionals have to ask if their work
will contribute to this "insanity" or diminish it. Weizenbaum also responded to
a point made by Parnas earlier by saying that it is probably insufficient for
professionals to consider only whether their work will genuinely meet the
customer's requirements, if in fact the work wastes valuable social resources
and addresses only a trivial need.
During the question and answer period, most of the panelists agreed that one of
the problems of dealing with ethical issues in the computer field is the lack of
exposure among computer professionals to ethics as a legitimate field of study.
Parnas, however, said he doubted whether an ethics course would have helped him.
Weizenbaum criticized the contemporary curriculum of the field because it rarely
allows enough time for undergraduate students to read outside technical
subjects. Johnson said that because of the radical division between technical
subjects and philosophical subjects, many undergraduates view ethics as a field
only ancillary to their main program of study, even if a course in ethics is
required. Fielder said that at Villanova they are experimenting with team-
teaching conducted by both engineering and philosophy faculty.
Deborah Johnson suggested that one role of professional organizations might be
to provide analyses of corporate practices, so prospective employees would have
an opportunity to assess the environment before accepting a position. Fielder
supplemented this by saying that organizations are very important with respect
to individual ethical dilemmas, because no individual is powerful or omniscient
enough to discern all the information or principles necessary for making an
ethical decision, and individuals can be strengthened in significant ways when
they challenge a corporate bureaucracy.
The panel discussion and the question and answer period were videotaped, and the
videotape can be loaned out for viewing by the CPSR National Office. The
videotape, however, is unedited, and runs nearly three hours. The tapes are
available as a loan only, for $10 to cover shipping and handling.
Chapter Reports and the CPSR Newsletter
The Sunday morning program of the Annual Meeting was started by a showing of the
CPSR slide show, Reliability and Risk: Computers and Nuclear War, in its six-
projector version. Following the slide show, representatives from CPSR chapters
around the country reported on what their chapters were doing or, in some
cases, what they were planning to do.
Mary Karen Dahl, CPSR National Program Associate, then outlined the current and
planned work of the CPSR Computing and Civil Liberties Project, concentrating on
the NCIC program, which is described in detail in this issue of the Newsletter.
CPSR Executive Director Gary Chapman reviewed the financial status of the
organization, stressing the point that CPSR must move to a self-sustaining base
of funding because of gradually fading foundation support.
Most participants at the Sunday portion of the meeting shared ten pizzas in a
nearby classroom during the lunch break, surrounded by an intense discussion on
chapter development and local activism. Clearly one of the most pressing issues
facing chapter members is how to increase local participation. Chapters
characteristically share the problem of too much work handled by too few people,
and the core group of chapter activists eventually find themselves burned out or
discouraged or both. Chapter members also find it difficult to develop programs
that will draw CPSR members to meetings or other organization events. There also
appears to be some concern that the organization's activity is too focused on
the program of the National Office, making many members simple spectators.
Finally, it was agreed that the national program will not be able to sustain
itself without an increase in membership over time, and the best way to build
the membership is through local work. Thus, a focus for CPSR in the immediate
future should be how to integrate national programs with grassroots activity.
This discussion was continued along several different tracks after CPSR Chairman
Steve Zilles gave an address on the current state of the organization. Zilles'
main point was that CPSR is currently transforming itself into a multi-issue
organization, and there may not be the consensus within the organization on new
issues that there has been on avoiding nuclear war. Because of this, it is very
important that the CPSR Board become much more conscious of the process by which
new issues are taken up. When CPSR started, it was the product of a small study
group in Palo Alto which had educated itself about the role of computers in the
arms race. Now that process has to be disseminated throughout the entire
national organization. Very few CPSR members are experts on public policy
issues, so any new issues the organization decides to address will involve a
period of self-instruction. And it is important that this instruction be as
widely distributed as possible.
Another major issue discussed during the afternoon session on Sunday was the
content of the Newsletter. While everyone had high praise for the development of
the Newsletter, some members think that it does not provide a vehicle for
members to communicate with each other about important concerns. The Newsletter
has been viewed by the national leadership as the primary vehicle for the
message of CPSR. It is sent to members of Congress, to heads of corporations,
and to many other people in the public interest community. There has been some
hesitation to open the Newsletter up to subjects outside the official CPSR
program because this might portray the organization as too diffuse or possibly
riven by conflicting opinions. But some people believe that some kind of
publication is needed to communicate members' views in a fashion that is not as
demanding as the journal-quality articles that make up the content of the
Newsletter now. And some members feel that the constraints on what subjects can
be addressed in the Newsletter inhibit the creativity of the organization, and
preclude a national discussion on what social responsibility should entail.
The Board of Directors of CPSR had already recommended the exploration of a
vehicle to communicate to the membership as a whole what chapters are doing.
Starting with this Newsletter, a space will be reserved for reporting briefly on
chapter activities. The publications committee of the Board has also been
charged with finding some way to address the concerns expressed during the
Sunday afternoon discussion at the Annual Meeting.
Following the open discussion of all participants, the meeting broke up into
small groups for discussion on subjects of interest to those in attendance.
Small group discussions covered reliability and risk issues, computers in the
workplace, computers and education, and computers and privacy.
Although attendance at the 1987 Annual Meeting and Banquet was lower than
expected, participants agreed that the program of the two-day event was first
rate. Each Annual Meeting sets a standard higher than the previous one. The 1988
Annual Meeting and Banquet will be held in October in Palo Alto. CPSR members
are encouraged to participate in the planning of this event and to help make
sure that the outstanding program of the 1987 Annual Meeting is surpassed as
well.
All CPSR members owe thanks to the members of CPSR/Boston who helped arrange
and supervise the 1987 Annual Meeting and Banquet, particularly Ronni Rosenberg,
Steve Berlin, Karen Sollins, Tom Thornton, Adonica Gieger and Reid Simmons.
The National Crime Information Center
A Case Study in National Databases
Mary Karen Dahl
CPSR National Program Associate
"If the technology is available, why not use it?" Col. Carl R. Baker, Vice
Chairman, NCIC Advisory Policy Board
The Federal Bureau of Investigation (FBI) and criminal justice personnel from
across the country are currently engaged in the redesign of the nation's largest
centralized criminal justice information database, the National Crime
Information Center (NCIC). This is the system your local police officer
routinely checks when he or she stops someone for a broken tail light or for
failing to signal for a left-hand turn. It allows almost instantaneous access to
records on millions of subjects. Expansion of the NCIC has important
implications for the privacy and civil liberties of all citizens, and raises
critical questions about the potential benefits and dangers of federal
computerized record systems.
The NCIC System Today
The NCIC links together federal, state, and local criminal justice agencies by
means of a data telecommunications network. Most entries to the system are made
at the local level, transmitted to and stored in a host machine at FBI
headquarters in Washington, D.C., and redisseminated upon request to authorized
terminals throughout the fifty states, Canada, Puerto Rico, and the U.S. Virgin
Islands. The NCIC is accessed by 64,000 criminal justice agencies: local or
state police, prosecutors, parole officers, prison administrators, sheriffs, and
federal law enforcement agencies. In some areas, mobile terminals are in use.
The system operates around the clock, seven days a week, and handles as many as
600,000 transactions a day. Queries to the system are answered in two to five
seconds, depending on the file accessed. The FBI NCIC handbook advises that
"routine inquiries should be made on every person and all property encountered
by the criminal justice community."1
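The architecture described above — records entered locally, stored in a central host, and returned to any authorized terminal by a keyed inquiry — can be sketched in a few lines. This is purely illustrative; the NCIC's actual record formats, keys, and protocols are not described in this article, and all names and values below are invented.

```python
# Toy sketch of the pattern the article describes: a central keyed store
# (standing in for the FBI host machine) that local agencies write to and
# that authorized terminals nationwide query. A keyed lookup is what makes
# two-to-five-second responses plausible at 600,000 transactions a day.

central_hot_file = {}  # key (e.g., a license plate number) -> stored entry


def enter_record(key, record, entering_agency):
    """A local agency enters a record; the entering agency is recorded,
    since entries must be supported by documentation in its possession."""
    central_hot_file[key] = {"record": record, "source": entering_agency}


def inquiry(key):
    """An authorized terminal queries the central store by key.
    Returns the stored entry, or None for 'no hit'."""
    return central_hot_file.get(key)


# Hypothetical example: a stolen-vehicle entry and a later roadside inquiry.
enter_record("ABC123", "stolen vehicle, 1986 sedan", "Springfield PD")
hit = inquiry("ABC123")       # returns the Springfield PD entry
miss = inquiry("ZZZ999")      # returns None
```

The sketch also shows why data quality matters so much in such a design: whatever a local agency enters is what every terminal in the country gets back.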
As of September 1987, the NCIC contained over 19.4 million records. These
records are organized into twelve "information bases":
• "hot files" of wanted or missing persons and stolen articles of various kinds
(cars, guns, license plates, etc.);
• an index to the criminal history records on file in many states; and
• a list of individuals considered dangerous to those whom the Secret Service
protects, principally the President.
These 19 million-plus records and twelve files represent a substantial increase
in the amount and kinds of data available over the system since it was started
in 1967. Originally the system had five files. In 1967 it had just over 300,000
records. After ten years, it held 6 million records; by May 1984, it held 16
million; by August 1985, 17.4 million. According to the FBI, two years later the
count had reached 19,421,983.
Audits conducted by NCIC personnel have found that
(1) There is a wide disparity among some of the states with regard to the
quality of data entered into NCIC, and (2) some of the states and particularly
some local agencies do not consistently follow NCIC data quality procedures with
the result that entries from these jurisdictions have unacceptably high levels
of inaccuracy and invalidity.2
Bad data has serious consequences for real people. Sheila Jackson Stossier was
confused with a woman with a similar name on the basis of an NCIC record. As a
result, Ms. Stossier spent three days in jail and now has an arrest record. In
another well-known case, Terry Dean Rogan was arrested five times over a
two-year period because, despite his continuing efforts to have distinguishing
data added
to NCIC files, he was repeatedly confused by NCIC users with a murder suspect
who had used identification stolen from Rogan.
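The failure mode in these cases — acting on a name match alone — and the remedy Rogan sought — distinguishing data in the record — can be made concrete with a small sketch. This is a hypothetical illustration only; the names, fields, and matching rules below are invented, not NCIC practice.

```python
# Hypothetical illustration of name-only matching versus matching that
# uses distinguishing descriptors. All data here is invented.

wanted = [
    {"name": "T. Rogan", "height_cm": 178, "scar": "left forearm"},
]


def match_by_name(person, records):
    """Name-only matching: anyone sharing the name is flagged."""
    return [r for r in records if r["name"] == person["name"]]


def match_with_descriptors(person, records):
    """Matching that also checks distinguishing physical data, letting
    users rule out look-alike records instead of acting on a name hit."""
    return [r for r in records
            if r["name"] == person["name"]
            and abs(r["height_cm"] - person["height_cm"]) <= 3
            and r["scar"] == person.get("scar")]


# An innocent person who merely shares the wanted name:
innocent = {"name": "T. Rogan", "height_cm": 190, "scar": None}

name_hits = match_by_name(innocent, wanted)            # falsely flagged
descriptor_hits = match_with_descriptors(innocent, wanted)  # correctly empty
```

Under these assumptions, the name-only query flags the innocent person while the descriptor-aware query does not, which is precisely the distinction at stake in the Rogan case.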
After twenty years of operation, the NCIC is up for review. In the words of
William A. Bayse, Assistant Director of the Technical Services Division at the
FBI, the FBI is now engaged in a "comprehensive study to begin a new system life
cycle for NCIC to serve our nationwide user requirements through the year 2000."
In this, the first phase of the study, the FBI has contracted with MITRE
Corporation to define the requirements for the new system, known as NCIC 2000.
The review and planning process has been underway for nearly two years. Once the
new system configuration is agreed upon, the procurement process will begin,
sometime in 1988.
As part of the initial study, NCIC users were asked what features would be
desirable. Some of those proposed would have expanded and changed the system
significantly. Among the capabilities suggested were the following:
• linking the NCIC with existing computerized databases at the Internal Revenue
Service (IRS), the Social Security Administration (SSA), the Securities and
Exchange Commission (SEC), and the Immigration and Naturalization Service (INS);
• using the NCIC to track individuals, including those under investigation but
charged with no crime;
• adding records of misdemeanors and juvenile offenses to criminal history
files;
• establishing modus operandi (MO) files on crimes and perpetrators;
• storage and transmission of photographs, fingerprints, signatures, and
artists' composite drawings.
In meetings in June and December 1987, the Advisory Policy Board (APB) of
federal, state, and local criminal justice officials that advises the FBI
Director on the NCIC narrowed these proposals. The APB rejected most of the
proposed online linkages, but recommended inclusion of some tracking uses in
the new system.
Policy Questions Raised by NCIC 2000 Proposals
The addition of tracking and investigatory files would represent a notable
change in the kinds of records kept in the NCIC. (The NCIC has included one
tracking file since 1982, the Secret Service Protective file, by means of which
the movements of a handful of individuals are monitored. The number monitored
varies, rarely going above 100, and generally hovering between 60 and 70.)
Currently, records entered into the NCIC are "required to be supported by
documentation in the possession of the entering agency."3 Acceptable
documentation is an arrest warrant, a missing person report, or a theft
report, each of which reflects an objective occurrence of some kind. Arrest
warrants are issued after the development of probable cause and require the
determination of a neutral magistrate. Entries in tracking files for people who
are under investigation (for possible involvement in organized crime, narcotics
trafficking, or terrorism, for example) would originate in a necessarily
subjective evaluation of the individual's actions and associations based on a
standard far less stringent than probable cause.
Take the example of the terrorist investigatory file. Not only would entry of
the individual's record not be based on a specific documented event, but
establishing criteria for labelling an individual a possible terrorist would be
problematic. The definition of terrorist might well vary from state to state, a
fact of considerable importance when one remembers that the NCIC is a national
system, routinely accessed by state and local officials in the course of their
work.
If there is insufficient evidence to issue an arrest warrant, and if no
independent judicial review is involved, should individuals be subject to
nationwide tracking? Could we guarantee that the system would not be penetrated?
What happens when information collected through the tracking system is leaked?
Being known to be the subject of an investigation could irreparably damage a
subject's reputation, though he was never charged with a crime. On the other
hand, leakage of information could seriously compromise a criminal justice
investigation.
CPSR and the Review of NCIC 2000
Congressional oversight of the FBI is the responsibility of the House and Senate
Judiciary Committees. They exercise their influence primarily through the budget
authorization process. Presently, Congressman Don Edwards, Chairman of the House
Judiciary Committee's Subcommittee on Civil and Constitutional Rights, is taking
the lead in reviewing NCIC 2000 redesign.
In the fall of 1987, Congressman Edwards convened a panel of experts in law,
criminal justice, and relevant technology and charged it with examining plans
for NCIC 2000. The panel is to advise the Subcommittee as to the civil liberties
implications of specific proposals for NCIC 2000 design and the planning process
as a whole.
At the request of Congressman Edwards, the CPSR Computing and Civil Liberties
Project took an active part in forming the expert panel. A CPSR task force was
formed to study NCIC 2000, and members of the task force serve on the expert
panel advising the Subcommittee. In that capacity, CPSR/Palo Alto members David
Redell and Peter Neumann attended a Washington, D.C., briefing on NCIC 2000 by
FBI and MITRE personnel in September 1987. More recently, Dr. Neumann presented
the main points of our technical critique to the assembled NCIC Advisory Policy
Board at their December meeting in St. Petersburg, Florida. Our complete
critique was incorporated into the compiled comments of the entire expert panel
regarding the civil liberties implications of NCIC 2000. This compilation was
submitted to the Advisory Policy Board for their consideration as they voted on
which specific capabilities should be designed into NCIC 2000.
In substance, the CPSR task force maintained that, without an overall design
strategy ensuring system security, data integrity and user authentication and
accountability, it is impossible to ensure that the NCIC 2000 system can
guarantee the degree of record protection and accuracy central to civil
liberties interests. No such overall design strategy is evident in current
plans.
Moreover, the need to protect the civil liberties and privacy rights of citizens
requires, above all, critical evaluation of every single proposed system
capability in the light of a realistic assessment of the technology the system
uses. Dr. Neumann elaborated this point in his comments to the Advisory Policy
Board:
As soon as nonpublic information is contained in such a system, it is vulnerable
to exploitation under any of a wide variety of misuses.
Past experiences of developing and operating computer systems indicate that
there are innumerable sources of human misuse and system vulnerabilities. If the
acceptability of a concept must rely on assumptions of human infallibility and
perfect integrity, as well as on system correctness (e.g., no malicious users
and no security flaws), then that concept must be considered suspect. Either
technological limitations or human weaknesses (or both working together) can
undermine the confidentiality, integrity, and other desired properties of the
system.
Concepts whose implementation cannot be adequately guaranteed to maintain
privacy, human safety, and other relevant requirements should be omitted from
NCIC 2000. Furthermore, if there is a reasonable doubt as to the feasibility of
implementing a particular concept and enforcing its proper use, that concept
should be omitted. It seems wiser to err on the side of safety.
The NCIC Advisory Policy Board has been responsive to many of the civil
liberties concerns raised by the expert panel. Several of the most troubling
proposals have been rejected on civil liberties and other grounds. Others,
specifically the notion of expanding the tracking capabilities of the system,
have been recommended for inclusion in the system. The independent review
of plans for NCIC 2000 is expected to continue; Congressman Edwards plans
Subcommittee hearings in the spring of 1988 and has already requested continued
participation by the members of the CPSR Computing and Civil Liberties task
force on NCIC 2000.
The process of redesigning the NCIC offers an opportunity for rethinking the
system, not only for the law enforcement community, but for the citizenry as a
whole. Has the effectiveness of the proposed extensions for law enforcement
purposes been demonstrated? To what extent is there an inherent conflict between
law enforcement's stated need for efficient transfer of information and the
constitutional rights of individual citizens? Where there is such a conflict,
how do we balance the competing interests? Where there is not, how do we avoid
introducing one through inadequate design? And what kind of design would allow
the necessary degree of effectiveness for law enforcement at the same time that
it affords maximum protection for those most precious rights, our civil
liberties? As Col. Baker said, "If the technology exists, why not use it?" Why
not, indeed? And most important, who decides what technology is to be used and
then monitors that use?
Congressman Glenn English (D-OK) has said:
Americans are very concerned about threats to their privacy resulting from
increasing computerization and the growth of government. Yet we know from past
experience that privacy concerns tend to be completely discounted or ignored
altogether unless there is a dedicated and responsible spokesman capable of
representing the privacy interests of citizens.
Members of the CPSR NCIC 2000 task force now serving on the expert panel
convened by Congressman Edwards include Drs. James Horning, Peter Neumann, and
David Redell, all members of CPSR/Palo Alto.
1. National Crime Information Center, The Investigative Tool, A Guide to the
Use and Benefits of NCIC, U.S. Dept. of Justice, Federal Bureau of Investigation
(Washington, D.C.: Government Printing Office, 1984), p. 7.
2. Criminal Justice "Hot" Files, Bureau of Justice Statistics Criminal Justice
Information Policy Series (Washington, D.C.: U.S. Dept. of Justice, 1986), p.
3. Criminal Justice "Hot" Files, Bureau of Justice Statistics Criminal Justice
Information Policy Series (Washington, D.C.: U.S. Dept. of Justice, 1986), p.
The Computer Security Act of 1987 (H.R. 145)
The Computer Security Act of 1987, passed by the House in June and the Senate in
December, voids National Security Decision Directive 145 (known as NSDD-145),
which was issued by President Reagan in 1984 in an attempt to shift government
information policy from civilian to defense management.
The new law restores to a civilian agency, the National Bureau of Standards,
primary responsibility for developing standards and guidelines for the security
of federal computer systems (excepting those that contain regularly classified
national security information, which remain under the jurisdiction of the
National Security Agency). A Computer Systems Advisory Board made up of non-
governmental as well as governmental representatives will have a role in
identifying and reporting on computer security and privacy concerns. The
troublesome category of "sensitive, but unclassified information" contained in
NSDD-145 has been explicitly redefined so as not to create another category of
restricted government information. For more information, contact the CPSR
National Office.
CPSR President Winograd Presents Norbert Wiener Award to Parnas
The following remarks were made by CPSR President Terry Winograd, upon
presentation of the first Norbert Wiener Award for Professional and Social
Responsibility to Professor David Lorge Parnas.
Tonight we begin a new tradition for CPSR, presenting for the first time the
Norbert Wiener Award for Professional and Social Responsibility. It is
especially fitting that we initiate it here at MIT, which was the intellectual
home to Norbert Wiener for more than forty years.
Wiener arrived at MIT in 1919 as an instructor, and during his long and fruitful
years here he was the originator of the field of cybernetics and of many of the
ideas that grew into the development of the computer. His scientific
achievements were many and varied, but more relevant to us tonight, he also was
a pioneer in looking at the social and political implications of computing. In
his analysis and activities around social concerns, he anticipated almost the
entire program of CPSR by about forty years. If he were alive today, he would
certainly be an active and stimulating member of the organization.
As with CPSR, Wiener's concerns first led to action in the arena of nuclear
weapons and the dangers they posed to humanity. Shortly after Hiroshima he began
a long career of pointing out the dangers of nuclear war, and of the role of
scientists in developing ever more powerful weapons of destruction.
As he said in his book The Human Use of Human Beings:
. . . the new industrial revolution is a two-edged sword. It may be used for the
benefit of humanity, but only if humanity survives long enough to enter a period
in which such a benefit is possible. It may also be used to destroy humanity,
and if it is not used intelligently it can go very far in that direction.
As early as 1946, he announced that "I do not expect to publish any future work
of mine which may do damage in the hands of irresponsible militarists," and he
observed that ". . . the scientist ends by putting unlimited powers in the hands
of the people whom he is least inclined to trust with their use. It is perfectly
clear also that to disseminate information about a weapon in the present state
of our civilization is to make it practically certain that weapon will be used."
Also, like CPSR, he took a broader view of the social issues of computing. In a
variety of areas, including the problems of automation and employment, he
explored the implications of the new technologies. He recognized the subtleties
and difficulties of the issues in a way that still makes thought-provoking
reading. He saw that the scientist had a special, and difficult responsibility:
. . . even when the individual believes that science contributes to the human
ends which he has at heart, his belief needs a continual scanning and re-
evaluation which is only partly possible. For the individual scientist, even the
partial appraisal of the liaison between the man and the [historical] process
requires an imaginative forward glance at history which is difficult, exacting,
and only limitedly achievable . . . We must always exert the full strength of
our imagination . . .
Finally, like CPSR, he recognized the importance of an educated public. He
devoted much of his energy to writing articles and books that would make the
technology understandable to a wide audience. His books, The Human Use of Human
Beings and God and Golem, Inc., were among the earliest works that opened a
public discussion of computers and what they could do.
He was especially concerned that there not be a mystification of the
possibilities for computers, fed by unrealistic optimism:
Any machine constructed for the purpose of making decisions, if it does not
possess the power of learning, will be completely literal-minded. Woe to us if we
let it decide our conduct, unless we have previously examined the laws of its
action, and know fully that its conduct will be carried out on principles
acceptable to us! On the other hand, the machine like the djinnee which can learn
and can make decisions on the basis of its learning, will in no way be obliged
to make such decisions as we should have made, or will be acceptable to us. For
the man who is not aware of this, to throw the problem of his responsibility on
the machine, whether it can learn or not, is to cast his responsibility to the
winds, and to find it coming back seated on the whirlwind.
So we might think of Norbert Wiener as the patron saint of CPSR, although I
suspect he would be a bit uncomfortable with the religious metaphor.
Tonight we are honoring a man who, like Wiener, might not fit the model of
sainthood but who, like Wiener, has served as a visible and inspiring example of
social responsibility: David Lorge Parnas.
David Parnas is Professor of Computer Science at Queen's University in
Kingston, Ontario. He received his Bachelor's, Master's and Ph.D. degrees from
the Carnegie
Institute of Technology (now Carnegie-Mellon University) and has taught at a
number of prominent institutions in the United States, Germany and Canada. His
research has been extremely influential in the field of software engineering, of
which he can rightfully be called a founder. He was one of the pioneers in work
on structured programming, and his research still stands as a classic in that
field.
On the basis of his work on making programming more productive and reliable, he
was made head of the Software Engineering Research Section and director of the
project on Software Cost Reduction at the Naval Research Laboratory, beginning
in 1979. His expertise made him a natural choice to serve on the panel formed in
1985 to investigate the feasibility of the computing system required for the
Strategic Defense Initiative ("Star Wars") program proposed by President Reagan.
I do not need to rehearse for this group the subsequent story (which Professor
Parnas elaborated in his remarks in the panel discussion on ethics). To
summarize quickly, he attended one meeting of the panel (now known as the
Eastport Group) and recognized that the project was ill-conceived and
infeasible.
He raised his concerns with his colleagues on the panel, and although they could
not refute his arguments, they saw the program as an opportunity to develop
research funding for computer science, and did not want to hinder that bonanza
(in
which their own institutions would obviously share). After trying to take his
concerns to the relevant government officials and failing to get their
cooperation, he went public with a carefully written and cogent series of
articles (later published in the Communications of the ACM and American
Scientist) which still stand as the basic argument against the feasibility of
the SDI.
He was an instrumental participant in a series of public debates on the SDI,
the first and most significant of which was held here at MIT, sponsored by the
CPSR chapter. The debates led to the gradual admission by the program sponsors
that it would not, as Reagan had promised, "make nuclear weapons impotent and
obsolete," but was at best a conventional anti-ballistic-missile defense, with
all of the strategic difficulties and shortcomings such defenses raise.
Professor Parnas also testified for CPSR to a subcommittee of the Senate Armed
Services Committee examining the SDI program. His work was a major factor in the
gradual disillusionment with Star Wars among the public and policymakers.
Like Norbert Wiener, David Parnas has served as an example of social
responsibility in many ways: in his own personal example of ethical and
professional responsibility in refusing to go along with the work of the panel
and profit from the opportunity; in his concern with public education in his
writings and public appearances; and in his willingness to seek political action
for the public interest. There could not be a more suitable recipient for our
award.
In concluding, I would like to return to, and complete, an earlier quotation of
Wiener's:
. . . the new industrial revolution is a two-edged sword. It may be used for the
benefit of humanity ... It may also be used to destroy humanity.... There are,
however, hopeful signs on the horizon . . . There are many dangers still ahead,
but the roots of good will are there . . .
David Parnas stands as an example that the roots of good will are there, and
that on them we can grow lives of action and responsibility. It is an honor to
present him with the first Norbert Wiener Award for Professional and Social
Responsibility.
CPSR Chapter Activity
Starting with this issue of the CPSR Newsletter, the publication will include
brief notes on what CPSR chapters are up to around the country. It is hoped that
this information will give CPSR members a better idea of what CPSR is doing, and
give chapter members (and people who would like to be in a chapter) creative
suggestions for projects, meetings, and other activity.
CPSR/Austin is based primarily around the computer science department at the
University of Texas at Austin. The chapter has sponsored a symposium on the
campus about the SDI. . . CPSR/Austin has arranged for the CPSR slide show,
Reliability and Risk, to be shown on public access television in Austin. . . the
chapter's monthly meetings have addressed legal aspects of software production
and computers in the workplace.... CPSR/Boston had its hands full last fall with
the arrangements for the national meeting and banquet. The chapter has also
viewed the very popular videotape of the story of Roger Boisjoly, the Morton
Thiokol engineer who blew the whistle on the Space Shuttle Challenger's faulty
O-rings.... the Boston chapter puts out a multi-page newsletter every month ....
Boston chapter member and Board director-at-large Karen Sollins is working with
a team investigating the networking requirements of the SDI's National Test
Bed.... CPSR/Los Angeles meets monthly to hear speakers on such subjects as
Student Pugwash and the Great Peace March in the Soviet Union. . . Art Goldberg
of Los Angeles is also working on an analysis of the National Test Bed, and
recently described his work at a chapter meeting.... CPSR/Palo Alto meets
monthly to hear speakers on a variety of subjects. . . in November the chapter
hosted a talk by John Pike of the Federation of American Scientists. . . in
December Palo Alto members saw another popular videotape called Computers in
Context, which discusses the appropriate design of computers in the workplace;
that meeting also featured a talk by Kristen Nygaard of the University of Oslo,
whose work is featured in the video. . . several members of the chapter are
involved in the CPSR/Palo Alto Workplace Project, a study group which publishes
its own newsletter.... in January CPSR/Palo Alto heard a talk by Christiane
Floyd, the founder of FIFF, CPSR's West German counterpart organization....
members of CPSR/Portland have focused on problems of computerized voting, and
the chapter now publishes a newsletter. . . CPSR/Santa Cruz will also be viewing
the video Computers in Context, which is now available for loan from the
national office.... CPSR/Seattle recently heard from Professor Richard Ladner of
the University of Washington, who spoke on computing access for the
handicapped.... Seattle is providing computer assistance to local peace
organizations, and Ken Berkun is heading up a project to get computers donated
to peace groups in the area. . . CPSR/Seattle has also co-sponsored a benefit
called "Give Peace a Dance," a 24-hour dance marathon to raise money for local
peace organizations. The CPSR T-shirts, "Nerds Against Nukes," have been a
popular item at the events. The chapter is still looking for funding for its
"autonomous dancing machine.... " New chapters are just forming in San Diego and
Denver-Boulder as this issue goes to press. We're also hoping for chapters in
Minneapolis, Washington, D.C., and Atlanta.... Several chapters are not doing
much at all, so if there are any CPSR activists who need help, get in touch with
Mary Karen Dahl in the CPSR National Office.
Lester Thurow on Military Spending and the American Economy
Lester Thurow, Dean of the Sloan School of Management at MIT, was the keynote
speaker at the 1987 CPSR Annual Banquet. The subject of his address was "Does
Military Spending Hurt America's Economic Performance?" On Tuesday, October 20,
The Boston Globe featured the following opinion piece by Dean Thurow, which
happened to be a shorter version of his remarks at the CPSR dinner. This is
reprinted with permission.
With an agreement with the Soviet Union to scrap intermediate-range nuclear
forces, the United States can look forward to a reduction in its defense budget
of $295 billion, 6.6 per cent of the GNP. Or can it?
If history is any guide, it cannot. When previous presidents have reached
agreement with the Soviets they have come home and announced that since they
have agreed not to do X, America must spend more on Y since the Russians are not
to be trusted. Partly the extra spending was necessary to get the treaties
approved by the Senate (some conservative Republican senators are already
mumbling that they will not vote for the treaty unless the president promises to
spend more on conventional forces), and partly this was done because the
presidents really believed the Russians were not to be trusted.
In this case, even without offsetting increases in parts of the military budget,
the impact will be small. The United States is spending very little on the
intermediate missiles that are to be scrapped. These missiles were built in the
past and new ones aren't being built in any number. The only savings are on
maintenance and support. Even here the impact will be small since such missiles
represent only 3 per cent of our stock of warheads and strategic weapons account
for only a small per cent, less than 1 per cent, of total defense spending.
To have a noticeable impact on the defense budget, the coming agreement would
have to lead the United States to trust the Soviets so we would be willing to
voluntarily reduce other parts of the military budget. This may eventually
happen, but it won't happen fast enough to have an effect on the military budget
in the foreseeable future.
There is, however, another lesson from history. While cuts are unlikely, huge
increases in the defense budget are equally unlikely in the context of an
agreement with the Soviets. This means that for all practical purposes, the
Strategic Defense Initiative ("Star Wars" to some) has died, although the
funeral won't be held until the next president is inaugurated. The current
Congress and the president aren't going to add tens of billions of dollars to
the defense budget while signing treaties with the Soviets. Politically such
increases could only be gotten from the American political system in an
environment where most Americans believed the Soviet Union is really an "evil
empire."
Agreements with the devil are logically possible, but in the United States, they
are not politically possible. Americans have a simple logic. If they are dealing
with the devil, deals should not be struck. If deals are struck, then they are
not dealing with the devil.
Research spending may continue under the label of SDI, but it will merely be
general military R&D looking for opportunities to improve America's weapons
systems regardless of whether they are in space or on this earth. No one is
going to have a crash R&D program or deploy any expensive nuclear weapons
umbrella either for cities or missile sites when the United States is improving
its relations with the Soviet Union.
The economics of the defense budget spring paradoxically not from the actions of
the Soviets but from the actions of our allies. If America's main economic rival
were its military enemy, then spending less than 7 per cent of the GNP on
defense would present no great economic problem. The United States is a very wealthy country
and subtracting that sum from the goods and services available to be used to
support the American standard of living represents no great sacrifice. Just two
years of modest growth completely eliminates any drawdown on the standard of
living.
One can argue, however, that it is not current expenditures that are burdensome
but the diversion of treasure and talent (40 per cent of engineers work on
defense) that lowers a country's international competitiveness; that is where
the real burdens are to be found. But as long as economic rivals and military rivals are
the same, there is no such burden. One's economic rival is being forced to
divert the same talent into its defense efforts and, relatively speaking, a
country is no worse off.
The problem arises if one's economic rivals are one's military allies. Seven per
cent of the GNP may not be much when it comes to consumption, but it represents
a huge potential increase in competitiveness if it were to be added to plant and
equipment investment, civilian R&D or increasing the skills of the labor force.
American plant and equipment investment could rise by 60 per cent, research and
development could more than triple, and educational expenditures could more than
From the Secretary's Desk
Eric Roberts, National Secretary
The major CPSR event since the last issue of the Newsletter, of course, was the
October Annual Meeting and Banquet in Cambridge. A full report begins on page 1.
Besides this, the other big news for this past quarter at the National Office is
new foundation grants. It seems that several foundations decided it was indeed
the season of giving, and we did rather well. Since August 1987, we have
received the following grants:

Deer Creek Foundation    $25,000
C. S. Fund               $20,000
                          $5,000

Congratulations to both Gary and Mary Karen for their work in making all of this
happen.
In addition to the grants, we are running ahead of budget projections for both
membership contributions and sales, mostly from the videotape version of
Reliability and Risk and our new book Computers in Battle: Will They Work?,
edited by David Bellin and Gary Chapman. Computers in Battle is now out in
hardcover from Harcourt Brace Jovanovich and should be available from your local
bookstore for $14.95. If you can't find it there, you can also order it from the
CPSR National Office ($14.95, plus $2.50 for shipping and handling; California
residents add sales tax). A review of Computers in Battle appeared in the
November 30 issue of ComputerWorld and was quite complimentary.
On December 14, several members of the CPSR Board and the National Staff met
with Luca Simoncini, the CPSR contact from Italy and director of our counterpart
organization there, and Christiane Floyd, formerly the president of our
counterpart organization in West Germany. Christiane had suggested a meeting to
discuss whether an international "umbrella" organization to link the various
national groups would be a good idea. We decided that we would not take any
formal steps at this time, but we hope to work to improve communication between
the different international groups.
"Person-in-the-Loop" Amendment Signed Into Law
President Reagan recently signed into law an amendment, known as the
"Person-in-the-Loop" amendment and sponsored by Senator Dale Bumpers (D-AR), to
the 1988 Defense Authorization Bill. This amendment states:
"No agency of the Federal government may pay for, fund, or otherwise support the
development of command and control systems for strategic defense in the boost or
post-boost phase against ballistic missile threats that would permit such
strategic defenses to initiate the directing of damaging or lethal fire except
by affirmative human discretion at an appropriate level of authority."
The full citation is: National Defense Authorization Act for FY 1988-89, H.R.
1748, Division A (Department of Defense Authorizations), Title II (Research,
Development, Test, and Evaluation), Part C (Strategic Defense Initiative),
Subpart I (SDI Funding and Program Limitations and Requirements), Section 224
(SDI Architecture to Require Human Decisionmaking).
Call for Contributions for Computers for Social Change
For a proposed book tentatively entitled Computers for Social Change,
prospective contributors are asked to contact CPSR Board member David Bellin.
Patterned after the style of Computers in Battle, the book will have two
sections. One will focus on the pro-active use of computers by people working
for social change. For example, computers have been used effectively by
developing nations, aid organizations, voter registration drives, and self-help
groups. Another section will be more analytical and theoretical, addressing
issues such as governmental use of computers, privacy, free speech, information
access, and information economics. Contributions should be approximately 10,000
words, and extensive content and editorial help can be provided. Please contact
David Bellin at 522 Tenth Street, Brooklyn, NY 11215, (718) 499-6443,
firstname.lastname@example.org.
Announcing a Major Conference
"Professionals and Social Responsibility: Conflict or Congruence?"
The University of Waterloo
Waterloo, Ontario, Canada
March 16-18, 1988
Beginning the evening of Wednesday, March 16, 1988, The University of Waterloo
Centre for Society, Technology and Values will host a major international
conference on professionals and social responsibility. The keynote speaker on
Wednesday evening will be Jack Stevenson, Professor of Philosophy at the
University of Toronto and author of the book Engineering Ethics: Practices and
Principles.
Thursday morning will feature discussions on "Ethical Codes for the Profession."
Thursday afternoon's sessions will address "Global Peace and Human Rights,"
which will include talks by Anatol Rapoport and Tom Perry, chairman of
Physicians for Social Responsibility in Vancouver, B.C.
On Friday, March 18, activists in organizations concerned with professional
social responsibility will participate in a panel discussion. Representing CPSR
will be Gary Chapman, executive director.
Registration for the full conference is $120, or $75 for any single day. For
more information, contact the University of Waterloo Centre for Society,
Technology and Values, PAS 2061, Waterloo, Ontario N2L 3G1, Canada. The
telephone number is