CPSR's New Executive Director
We are pleased to announce that we have hired Gary Chapman, who started as
CPSR's Executive Director on January 1, 1985. Gary is what we told the funders
we weren't likely to find: a person who has technical competence as well as
interest in and experience with organization building.
Gary's background includes military service, academic research and
organizational experience. He served in the United States Army's Special Forces
("Green Berets"), where he became familiar with many high-tech weapons systems.
He received his B.A. in political science, with honors, at Occidental College,
and went on to the doctoral program at Stanford University. His research work
there has focused on political philosophy, artificial intelligence and the
application of AI by the military. Gary also has experience with several
political organizations and campaigns. He has worked on peace issues with the
New American Movement, the United Nations Association, the California Seminar on
Arms Control and Foreign Policy, the Women's International League for Peace and
Freedom, the Citizens' Party, and other groups.
Gary has chosen as top priority items the production of both a chapter formation
packet to assist in the formation of new chapters, and a membership solicitation
packet to assist existing chapters in expanding their membership. Gary has
particular interest in and experience with chapter-structured organizations, and
hopes chapters will consult with him about their problems. He expects to visit
all the chapters sometime during the next six months.
Report of ACM Ad Hoc Committee
In August 1983, David Brandin, then President of the Association for Computing
Machinery (ACM), appointed the ACM Ad Hoc Committee on Systems Reliability and
Risks to the Public. The Committee was directed to "study and prepare a report
on the technical considerations of computer systems reliability and risks to the
pubes." The committee consisted of Aaron Finerman (Chair), James Homing, Anthony
Ralston, Richard Tanaka, Eric Weiss, Charles Williams, and Stephen Zilles. (Jim
Horning and Steve Zilles are long-time members of CPSR.) The final report of
this Committee was submitted to the ACM Council on October 8, 1984. By a vote of
22 For, 0 Opposed, and 2 Abstentions, the Council endorsed the following
statement and directed that it be published in the Communications of the ACM and
widely disseminated. We applaud this action heartily.
(The February issue of the Communications contains this statement together with
an excellent letter by ACM's President Adele Goldberg which describes the
history of the committee and comments on the issues raised by the report.)
ACM Statement on Systems Reliability and Risks to the Public
Contrary to the myth that computer systems are infallible, in fact computer
systems can fail and do fail. Consequently, the reliability of computer-based
systems cannot be taken for granted. This reality applies to all computer-based
systems, but it is especially critical for systems whose failure would result in
extreme risk to the public. Increasingly, human lives depend upon the reliable
operation of systems such as air traffic and high-speed ground transportation
control systems, military weapons delivery and defense systems, and health care
delivery and diagnostic systems.
While it is not possible to eliminate computer-based system failure entirely, we
believe it is possible to reduce risks to the public to reasonable levels. To do
so, systems developers must better recognize and address the issues of
reliability. The public has the right to require that systems are installed only
after proper steps have been taken to assure reasonable levels of reliability.
The issues and questions concerning reliability that must be addressed include:
1. What risks and consequences are involved when the computer system fails?
2. What is the reasonable and practical level of reliability to require of the
system and does the system meet this level?
3. What techniques were used to estimate and verify the level of reliability?
4. Were the estimators and verifiers independent of each other and of those with
vested interests in the system?
To emphasize the importance of this matter, ACM is releasing this statement and
is instituting a forum to illuminate the issues and questions raised. We believe
system developers will welcome public scrutiny of the answers to these questions.
Ethical Questions & Military Dominance In Next Generation Computing
Paul Smolensky, CPSR/Los Angeles
The following talk was presented at the Annual Conference of the Association for
Computing Machinery in San Francisco on October 9, 1984. The theme of the
conference was "The Fifth Generation Challenge." The talk was part of a panel
presentation on "Ethical issues in New Computing Technologies," which was part
of a series of sessions on "the Impact and Issues of the Fifth Generation."
Although this talk covered ideas familiar to our readers, we reproduce it here
because it states the case so concisely and so poignantly.
Nuclear weapons create the greatest moral and intellectual challenge humanity
has yet faced--possibly our last challenge. I would like to raise some
questions and make a few simple observations about our role as computer
professionals in meeting the challenge of nuclear weapons.
My first point is that we cannot hide from the fact that information technology
plays as great a role in nuclear forces as does nuclear physics technology and
rocket technology. Like the bomb builders and rocket engineers, we too design
the machines of nuclear war. This means we have moral responsibilities that can
hardly be exaggerated.
To appreciate the role of information processing in current nuclear forces, it
is important to realize that the structure of nuclear forces differs in a
crucial respect from the traditional structure of military systems: it is not
hierarchical. Most simply stated, the fact that a continent can be turned into
radioactive ash with only a few minutes warning means that there is not time for
decision-making to percolate through a multi-layered control structure involving
many people in many places. This has two sobering implications. First, high-
level leadership may well not have time to do much consultation, negotiation or
consideration in times of crisis. The decision that starts World War III will
probably not be the result of an extended military and political process.
The second implication of the short warning time is that leaders cannot be
counted upon to be in contact with the relevant forces, or even to exist. Thus
over the past 3 1/2 decades, the authority to launch nuclear weapons has
expanded from the President, through high-level military commanders, down to
submarine commanders, and possibly to commanders of tactical forces. And there
looms always the threat that authority to launch nuclear weapons will be
extended to machines as well.
The short time scales of nuclear decision-making put an unprecedented premium on
information processing power. Information gathering has exploded since World War
II. Rapid transmission, integration and analysis of large amounts of data,
complex decision-making, and dissemination of orders have all become a major part
of the operation of nuclear forces. Information processing is a critical part of
decision-making at all levels: from decisions about launching a major nuclear
attack, through decisions on the battlefield, to the internal decisions that
control an in-flight cruise missile.
The central role played by the information processing components of the military
machine places awesome responsibility on their designers--that is, on us. How
are we to face that responsibility?
One thing we can do is critically examine the existing system. Is the system
capable of carrying out its mission? Is it safe? Or is it a disaster waiting to happen?
In his book, The Command and Control of Nuclear Forces, Professor Paul Bracken
of Yale argues that the control structures of American and Soviet nuclear forces
create a system as dangerous as the interlocking defense treaties that fanned
the assassination of an Archduke into the War to end all wars. The very
structure that created a sense of complacent security in fact made a world war
possible.
Who are the "experts" who can sniff out the bugs in the enormous military
system? These may be the only people standing between this beautiful California
autumn and a nuclear winter. It seems clear that we in the computer profession
cannot leave the assessment of the soundness of our military establishment to
others. The necessary debugging cannot be done without our help--it is an open
question whether it can be done with our help. We cannot wait to perform a post-
mortem analysis of a "core dump" after the system crashes.
There are very general reasons to have serious concerns about the safety of our
current heavy reliance on information technology in nuclear forces. The issue
here is not the reliability of individual hardware and software components. It
is a fact well-known to people inside the computer profession-though perhaps not
to those outside--that integration of complex computing systems under field
conditions is a difficult business. As clever as we try to be, the interactions
among thousands of constituents in novel situations produce many unforeseen
problems. The period of adjustment in the integrated system requires
considerable experience and often involves numerous errors. Do we want
considerable experience with our nuclear forces? Can we afford numerous errors?
Anticipating interactions among software and hardware components that we design
is difficult enough. Now add thousands of people into the system. It has been a
hard-learned lesson--one that many designers are still learning--that when
humans interact with computers, all sorts of problems develop that were utterly
unanticipated.
To fully appreciate the dangers of the extremely complex integrated
human/machine system that comprises our nuclear forces, we must realize that the
environment in which it works is the environment of international politics. A
more complex and unpredictable environment is hard to imagine.
These considerations suggest we should be extremely cautious about integrating
computers into human systems in which unanticipated problems in novel situations
can produce errors with catastrophic consequences before the system can be
revised. Has this caution been sufficiently exercised in the past? Will it be in
the future?
Now let me express some personal concerns. I am worried that the existing
command and control system for nuclear forces does not fully respect the
limitations of complex human/machine systems. I am inclined to believe that the
trend to rush information technology into the military system has already
produced serious problems. But what I find most alarming is that this dangerous
trend is not being viewed critically. Quite the opposite: it is being
enthusiastically fueled to greater intensity.
The apparent interest of our profession in zealously spreading the faith in the
miracles of modern computing of course flies directly in the face of caution
about applying computing technology to military systems. This conflict very
quickly pits our ethical concerns against the bottom-line: big bucks.
A timely case in point is DARPA's Strategic Computing Initiative. Current
estimates for funding run $600M over the next 5 years--an enormous increase in
DARPA funding for computer R & D. This seems like a boon to our profession--it
has been termed "America's Fifth Generation Project." It supports development of
a "technology base" that includes computer vision, natural language processing,
speech processing, navigation, planning and reasoning, and of course, expert
systems.
Such notions produce a tremendous amount of excitement in many minds. But a
sober view suggests many questions. Are the promised goods deliverable? How will
the direction of research in our field be affected by such an enormous effort
directed at these ends? The question I want to focus on, however, is: Will
developing these applications of artificial intelligence to war fighting
increase or decrease our security?
I honestly don't know the answer to this question. I am an AI researcher with no
background in military or political science. But I am trying to educate myself
to make the best judgments I can about the research opportunities available to
me. I don't know the answer, but at least I know the question--a question that I
think we as individuals and as a profession should be trying our best to answer.
Will the Strategic Computing Initiative increase or decrease our security?
How are we to address this question? I can see no tight logical analysis proving
one way or the other how robot tanks, copilots, and battlefield consultants will
affect our security. Anyone who thinks such analyses are possible is naive about
the complexity of modern politics and warfare.
Since proofs are impossible, I respond to the SCI in terms of how it seems to
fit into the overall pattern of military development. I have already described
my general concern that we have rushed too far too fast in our dependence on
technology in our nuclear forces. The Strategic Computing Initiative promises to
propel us even further in this dangerous direction.
The initiative makes it perfectly clear that computing technology has now
matured into a major player in the arms race. The same insidious forces that
have fueled the arms race for 40 years have appeared newly incarnated in
artificial intelligence. We must use some of the wisdom painfully gathered by
looking back over those four decades.
Let me be a little more specific. The robot copilot and battlefield consultant
are supposed to be a response to the increasing rate at which information must
be processed during battle. The belief implicit in this proposed solution is that
advanced computer technology can let people cope effectively with shorter
decision time scales. If this assumption is false, we will be increasing the
chance of misjudgments that could contribute to uncontrolled escalation of
conflicts above the nuclear threshold. If it is true that technology can let us
deal on shorter time scales, the inevitable result will be to then push the pace
of conflict to still shorter time scales. It seems to be a general principle of
the arms race that a solution to a problem is sooner or later matched by the
other side, closing a feedback loop that restores the original problem with even
greater intensity.
There is another possible response to the increasing demand for information
processing during conflict. We can say that a technological solution will not
work. We can say that we should not develop technologies that attempt to cope
with unreasonable demands on extremely dangerous systems. We can say that steps
must be taken to reduce the strain on our human judgment. We can say that the
solution is not to include more machine processing in a fragile decision-making
structure, but rather to include more human processing. We can say that we
intend the computer technologies we develop to enhance the humanity of our
world, that we refuse to develop "autonomous vehicles"--that we refuse to
develop robots of death, that we refuse to allow anyone to push onto the
technology that you and I are developing the one responsibility that must
forever remain squarely in human hands: the responsibility for taking another's
life.
It is a tragic fact revealed by many studies that most young people expect their
lives to be terminated by nuclear war. This conference addresses the hopes for
the next generation of computers. As part of those helping to raise this new
generation, I don't want to see the fruits of my work make even more ominous the
terrible threat already facing the next generation of humanity.
"Market" for EMP
Ed. note: Thanks to Mark Hall for bringing this to our attention.
A news release from International Resource Development, Inc. claims: "A market
of more than $10 billion by the late 1980's is forecast for equipment systems
which will be able to resist Electro-Magnetic Pulse (EMP) ... This fast-growing
market has already attracted dozens of major electronics firms, and there will
soon be hundreds of suppliers, both of military and commercial EMP-resistant
equipment." The release defines EMP as "the sudden electromagnetic 'flash' which
accompanies the explosion of a nuclear warhead."
Though the news release is headlined "BILLIONS TO BE SPENT ON PROTECTION FROM
ELECTRO-MAGNETIC PULSE," it soon becomes unclear whether the research in question
is for offensive or defensive purposes, for EMP weapons or EMP-resistant
equipment. The report postulates that EMP "may itself emerge as a very
attractive strategic weapon with the ability to stop cars, knock out
communications, cause a power blackout ... cause aircraft to crash, wipe out
computer memories, shut down nuclear power plants, explode fused munitions, and
cut off life support electronics in hospitals. This can all occur without the
devastating effects of radiation, provided the bomb is exploded outside the
earth's atmosphere ..."
As a weapon, effects of EMP are "intriguing," since they "could be used on the
defensive in an attempt to paralyze the electronic capability of an advancing
army, or on the offensive, to destroy the command control communication and
intelligence capability of the defenders as a softening up blow prior to an
attack with men and machines." International Resource Development, Inc., in
Norwalk, Connecticut, is offering this 144-page report for $1,850.
Computer Unreliability and Nuclear War
This article is the third in a series from a paper prepared by CPSR/Madison
entitled Computer Unreliability and Nuclear War. This material was originally
prepared for a workshop at a PSR symposium held in Madison, Wisconsin, in October
1983. The final section will be printed in the next Newsletter. For a complete
copy, please send $1.00 to the CPSR national office.
3. Artificial Intelligence and the Military
A good idea of the kind of technological research the military is involved in
can be obtained by looking at the research program of DARPA, the Defense
Advanced Research Projects Agency. This agency fulfills the central research
function of the Department of Defense with an appropriation of 729.6 million dollars in 1983
(around 18% of DoD's total investment in science and technology). DARPA
currently has five main focuses of interest:
(1) Strategic Technology. This is mainly devoted to space based strategies,
working on the assumption that "existing technologies have already made space a ..."
(2) Tactical Technology. This involves the development of weapons used in the
air and on land and sea. For example, it includes the development of cruise
missiles which rely on computers for their ability to find targets and avoid
hazards, such as enemy radar.
(3) Directed Energy. This involves laser and particle beam technology for
defense and communication.
(4) Defense Sciences. This involves basic research that will be useful in other
projects, such as the development of new kinds of material and electronic
devices. It includes Systems Sciences, a project to improve man-machine systems
and monitoring technology. It focuses on such things as software for command and
control and computer-based training technology.
(5) Information Processing. This involves developing technologies that can
gather, receive and communicate information to human beings or other computers. A
large component of the research in this program involves Artificial Intelligence.
The term "Artificial Intelligence" (AI) refers to techniques for using the
representation of knowledge in computers in a sophisticated way in order to
perform tasks which involve fairly complex reasoning--reasoning of such a kind
that the machine is successfully imitating some aspect of the behavior of a
human being. The imitated behavior may be very limited. Much successful current
research is devoted to the writing of programs that the non-computer scientist
would not consider intelligent at all--not in the normal sense of the word as
applied to human beings.
AI has applications in a number of areas that are of
direct interest to the military. For example:
(1) Natural language -- This would enable military personnel (or even
presidents) to consult the computer directly without any training in computer
techniques, perhaps even by voice so that they would not have to type. Pilots
would be able to give oral commands to weapons systems while using both hands to fly.
(2) Vision -- An intelligent vision system would be able to interpret the light
or radar signal it receives and make inferences (like perspective) based on this
information. Vision systems could be used for robots traversing strange terrain
(such as the moon) or for surveillance satellites or missiles.
The reliance on AI is one of the scarier implications of current military
research. "The goal is to make it possible for computers to assist and/or
relieve military personnel in complex as well as routine decision-making tasks
which are information or personnel intensive, tedious, dangerous, or in
situations which are rapidly changing." This goal is scary because of the
particular vulnerability of AI programs to the sorts of software errors
described in Section 2 (see CPSR Newsletter Fall 1984, Vol. 2, No. 4).
One common AI technique for handling inference involves programming scripts into
the computer. An expert system for restaurants, for example, would have in its
knowledge base an account of the kinds of things that normally happen in
restaurants--a script for restaurants. Imagine the same technique applied to
nuclear war. Is it possible to imagine a script that anticipates every
eventuality? No matter how inventive the programmer, is it not likely that
something will fail to go according to plan? The effects of EMP were overlooked
for more than 20 years. What else have we failed to consider?
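To make the idea concrete, here is a minimal, purely hypothetical sketch of script-based inference (the script contents and function names are invented for illustration): a script lists the events the system expects, and any event the script does not anticipate leaves the program with no basis for a decision.

```python
# Hypothetical sketch of script-based inference. A "script" lists the
# events the system expects, in order; an observed event outside the
# script leaves the program with nothing to fall back on.

RESTAURANT_SCRIPT = ["enter", "be seated", "order", "eat", "pay", "leave"]

def interpret(events, script):
    """Match observed events against the script, step by step."""
    step = 0
    for event in events:
        if step < len(script) and event == script[step]:
            step += 1  # event fits the script: advance to the next expected step
        else:
            # nothing in the script covers this event
            return f"unanticipated event: {event!r}"
    return "all events understood"

print(interpret(["enter", "be seated", "order"], RESTAURANT_SCRIPT))
print(interpret(["enter", "fire alarm"], RESTAURANT_SCRIPT))
```

The point of the sketch is the failure mode: the program handles everything the script's author anticipated, and nothing else.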
Many researchers are deeply convinced that machines will never be able to make
really intelligent decisions in human terms unless we can build a machine that
is exactly like a human and will go through all the learning experiences that
humans go through. In the event of a nuclear attack, a large number of entirely
novel situations will be arising in rapid succession. It is meaningless to
expect that AI systems will ever be able to provide security in such an event.
Even if it were possible to design complete and accurate scripts, there would
still be the problem of implementing the design in the form of a working
program. AI systems are very large and complex. Moreover, many of the systems
the military wants to build will be distributed, that is, they will use several
different computers located in several different places and communicating with
each other. No one programmer can ever have a complete overview of such a system.
The programming task must be shared by a large number of people, each of whom
writes a single module. Changes to improve one module may prevent it from
interacting properly with other modules, despite the fact that every module does
exactly what its programmer thought it should do. The remaining errors may require
a large amount of practical use of the program before they surface and can be
corrected.
Military planners are faced with a dilemma: either they keep a system at the
state of the art, which means constantly making changes, or they keep it as it
is for several years, only fixing bugs. The first approach leads to an
unreliable system; the second approach leads to an outdated system.
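The module-interaction problem described above can be illustrated with a toy, entirely hypothetical sketch (the sensor, units, and threshold are invented for illustration): both modules below do exactly what their programmers intended, yet an "improvement" to one silently breaks an assumption the other depends on.

```python
# Hypothetical sketch: two individually correct modules whose implicit
# contract breaks when one is "improved." Module A originally reported
# distances in kilometers; module B was written assuming kilometers.

def sensor_range_v1(raw):
    # Module A, original version: raw units are kilometers
    return raw * 1.0

def sensor_range_v2(raw):
    # Module A, "improved" version: now converts to miles, exactly as
    # its author intended -- but nobody told module B
    return raw * 0.621

def within_safe_distance(raw, sensor_range):
    # Module B: assumes the returned value is in kilometers (the
    # implicit contract), and checks against a 100 km threshold
    return sensor_range(raw) > 100

print(within_safe_distance(120, sensor_range_v1))  # True: 120 km clears 100 km
print(within_safe_distance(120, sensor_range_v2))  # False: 74.5 "km" -- contract broken
```

Each function still does precisely what its own programmer thought it should do; the error lives in the interaction between them, and only surfaces when the combined system is exercised.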
With civilian computers, one can balance these considerations in a way that
leads to sufficiently up-to-date systems with a tolerable level of reliability.
It may be unpleasant to be stranded at O'Hare because the airline's computers
lost a reservation, but it is not a disaster. It is a disaster, perhaps, if the
computer on board a 747 malfunctions and causes the plane to crash, but even
then it is not the end of the world. It may indeed be the end of the world if a
computer error causes nuclear war.
Letters to the Editors
A More Positive Role for CPSR
After reading the last few issues of the CPSR Newsletter, it has occurred to me
that perhaps we have missed something in defining our direction. So much of the
discussion in the CPSR Newsletter seems to be directed toward the fact that
current and proposed military computer systems do not, and are not likely to,
increase the security of the American people, or for that matter the people of
any other country in the world. We could, however, take a more positive approach
to this problem and ask, "What kind of computer systems would be likely to
increase the security of the peoples of the world?" This is, I think, a very
difficult question, but one which deserves attention and one which CPSR is
probably in a unique position to address. I do not necessarily feel that we will
be able to come up with workable ideas immediately, but that it should be one of
our major goals to think up positive ideas whose implementation will lead to
increased levels of world security.
It is my opinion that proving the unreliability of current and proposed military
systems, although it may increase public awareness and prevent or delay the
development or deployment of some of them, will not do much to reverse the
general trend toward more powerful and less reliable military computer systems
and the concomitant decrease in levels of security for the people of the world.
On the other hand, even minor advances in answering the question of "What kinds
of computer systems would promote and increase world security," could have
lasting positive effects. I would, therefore, suggest that we view our role as
something more than that of a watchdog organization.
Should we succeed in coming up with some workable ideas, it may be possible to
receive funding for research and development of these ideas. It may even come to
pass that DoD would desire to fund some of our ideas. After all, the security of
the American people is, or at least should be, one of DoD's major goals. It is
possible that in certain situations, cooperation would serve the purposes of
both organizations better than confrontation.
Thomas J. Sager, Department of Computer Science, University of Missouri-Rolla
War Tax Resistance
... Besides CPSR, I am a member of the New England War Tax Resistance and the
National WTR Committee. After looking over all the facts and options, I have
concluded that the United States seems to be driving the arms race more than the
Soviets; or at least that it is at fault in not testing Soviet sincerity by
using the current parity as an opening for sincere negotiations. This is a fatal
and criminal omission.
I believe that the political will to adopt this course (of sincere negotiation,
not unilateral disarmament) must, must, be found by any means possible and
practical. The action most likely to effect this "change of heart" is a massive
withdrawal of financial support from the U.S. government: a tax revolt. The
bottom line in our nation, and especially in our Congress, is money. I would
like to see each Congressman (yes they're still 95% men) who votes for a freeze
resolution one day and the MX the next get 5000 letters on his desk saying "I'm
fed up with this and I'm not going to pay for it anymore." I can't imagine it
not having an effect.
There are already war tax resisters, about 30,000. Many of them, perhaps half,
are specifically pacifists. (I honestly don't know if I am or not, but it
doesn't take being a pacifist to be outraged at our current direction.) They are
struggling with the IRS and with tax forms and court cases over various
penalties. None are in jails. More people who are sympathetic with their
efforts are dissuaded by the level of hassles they fear would fall upon them.
But there seems to be a level of self-defeatism in this, when sacrifices are
undergone without the support of enough people to make them work. What is needed
is an organized effort to raise the interest level in a potential tax
refusal movement, thus lowering the risks of joining one. [...] I guess what I'm
trying to say is that if we think we can "save the world" from annihilation
without some kind of struggle and sacrifice, we are cruelly deluding ourselves.
Let us undertake it now, nonviolently, while the risks and costs are small, and
before others are driven to desperate acts.
In this case, computer professionals are invoked not by reason of their
professional credentials but because they are among the highest paid workers in
the country. They are also intelligent enough to figure out what is going on, if
not too cynical to do something about it. In many cases, they have taken the
step in conscience not to work on weapons of mass suicide. But they pay for
them nonetheless, through their taxes.
All the reasons for the necessity of some such catalyzing action are spelled out
in page after page of the CPSR Newsletter and other publications most of us
absorb to our limit every day. The questions are procedural: how effective would
a civil disobedience campaign of some type be, and how to organize it. My hope
for computer professionals is that our instincts will carry from job to social
action: we are problem solvers. We know there is no bug-free system on Earth,
but this current socio-military system is unreleasable in its current state.
There's no debugging this one after it blows up for the first time.
National War Tax Resistance Coordinating Committee
CPSR in the News
August & September - The New Zealand computer magazine interface published
CPSR's Assessment of DARPA's Strategic Computing Plan in its entirety in these
two issues.
September - A four-page article entitled War and Games, written by Earl Vickers
of CPSR/Palo Alto, was published in the September issue of Creative Computing.
This article contains some interesting historical background, and ends with
descriptions of Buckminster Fuller's World Game and information about
simulations. It mentions CPSR's Annotated Bibliography as well as one that Earl
has composed. Response to this publication came to the CPSR office from as far
away as Bangladesh.
October - The Australian magazine MicroComputer World published our Assessment
of Strategic Computing in its entirety, replete with a photo of the Pentagon.
October 25 - An article about Cliff Johnson's lawsuit against Caspar Weinberger,
charging that launch-on-warning is unconstitutional, appeared in the New
Scientist. It was written by staff writer Ian Anderson.
October 29 - The San Jose Mercury News published a front page story entitled
"The last bug" computer scientists fear. This long story included boxed quotes
by Peter Neumann (CPSR/Palo Alto), Alan Borning (CPSR/Seattle), and Severo
Ornstein (CPSR Chairman).
November 1 - The PEOPLE section of Datamation contained a picture of Cliff
Johnson of CPSR/Palo Alto and a story entitled Bucking the Tiger. Described
here are the two court cases regarding computing in which Cliff Johnson is
involved, one against the Bank of America and the other against Caspar
Weinberger (see October 25 entry above).
November 1 - Severo Ornstein was interviewed on film in the CPSR office in Palo
Alto by Belgian Television as part of a documentary they were preparing on
November 12 - Electronic Engineering Times published a long story about CPSR at
the beginning of Section 4 entitled Computer Scientists and Professionals
Express Concern Regarding President Reagan's Star Wars Strategies, with a
headline on the second page of Computer Group Uses Expertise to Challenge DoD.
This coverage brought in a great deal of mail from all parts of the United States.
December - The Bulletin of the Atomic Scientists published CPSR's Assessment of
DARPA's Strategic Computing Plan under the title Strategic Computing. This
version is very little edited from the version we published ourselves in June.
December 10 - The Technology section of the Boston Globe contained a story by
staff writer Edward Dolnick entitled Can Computers Cope with War? with a
subtitle In 18 months there were 3703 false alarms; Star Wars proposal stirs
debate. This story lists computer mishaps gathered by Peter Neumann of
CPSR/Palo Alto and also gives coverage to the Madison chapter's paper Computer
Unreliability and Nuclear War.
January - The Letters section of the Bulletin of the Atomic Scientists published
a rebuttal to CPSR's Assessment (see December above) written by Robert Cooper,
head of DARPA. The Bulletin asked if we would like to respond, and Severo
Ornstein wrote an answer, which was also published in the Letters section.
CPSR/Boston has prepared a proposal to develop and produce a professional
quality slide/tape presentation about computer reliability and nuclear war. The
presentation is intended for a general audience and will document how computer
errors in key military systems could lead to a nuclear war. The proposal was
conceived and written entirely by members of the Boston chapter with minor
assistance from the national office. It has already been submitted to one
foundation for funding and will be further circulated with the help of the
national office. Other chapters are encouraged to develop independent
projects and to request help from the national office in seeking funding.
Greg Nelson and Dave Redell are working on a paper about the role of computers in
Star Wars. Volume V of the government's Fletcher Report clearly indicates that
major breakthroughs in almost every area of computing would be required for such
a system. We expect to circulate their draft for comment and use the comments to
refine it into a CPSR position paper, as we did with our Assessment of DARPA's
Strategic Computing Plan.
Providing information for new members is an ongoing problem for all chapters.
CPSR/Palo Alto has begun a project dedicated to helping new members learn about
the chapter and become involved in its activities.
In August the CPSR office received a letter from a professor of psychology who
is writing a book on communications techniques for peace workers. He asked
whether we could contribute a chapter about the role of the computer. John James
and James Ganong of CPSR/Santa Cruz responded and have submitted a chapter
entitled Computers for Peace. The book is to be published by Impact Publishers.
CPSR/Santa Cruz is planning a benefit concert by a musical group called ANCIENT
FUTURE. It will be held Saturday, February 16 at 8 pm at Moraga Concert Hall.
From the Secretary's Desk
Laura Gould - CPSR National Secretary
We recently were visited by David Stutsman, a lawyer from Elkhart, Indiana, who
is involved in litigation regarding faulty electronic voting systems. Software
inadequacies of various kinds were revealed after a recent election in Indiana.
This is an important issue as it is estimated that approximately 50% of the
voters in this country use some variety of the system in question to cast their
ballots. Mr. Stutsman is looking for technical assistance in understanding the
software, in order to build a case for the need for more rigorous control over
the design of such systems. Please contact the CPSR national office if you would
be willing to volunteer time to help analyze and understand the programs.
We are indebted to Rilla Reynolds, a member of CPSR/Palo Alto and also of the
Employee Volunteer Action (EVA) program of Apple Computer. She, with the help of
Marc Rotenberg and John Cicarelli (also CPSR/Palo Alto members), has finally
gotten our Apple III working. We also want to thank the Apple Corporate Grants
program, which has recently given us a substantial amount of software and
peripheral hardware, including a modem. We hope soon to be communicating
electronically with the chapters from the national office.
Severo Ornstein has been selected for membership in the American Committee on
East-West Accord. Membership is by nomination only in this prestigious group,
chaired by George Kennan and John Kenneth Galbraith, and consisting of experts
in arms control. Sidney Drell, Richard Garwin, Wolfgang Panofsky, Carl Sagan,
Kosta Tsipis, Jerome Wiesner, and Herbert York are among those on its list of members.
In another kind of recognition, the Stern Fund has recently given us a $20,000
grant, and the W. Alton Jones Foundation has given us $15,000. Both grants are
for general support for the coming year.
In a lighter vein, we recently received a Dear Policyholder letter from the
insurance company which covers the CPSR office. It reads in part:
"It is important that you read your policy to acquaint yourself with the types
of losses not covered. For example, under the provisions of Section I - Losses
1. Under no circumstances does your policy provide coverage for loss involving a
2. You do not have protection against earthquake damage unless you added
Earthquake coverage to your policy ..."
By contrast an article in the January/February Nuclear Times describes a Nuclear
Holocaust Insurance T-shirt with a "Mutant Of Omaha" logo on the front and "When
the World's in Ashes, We'll Have You Covered" on the back.
NewsBase is an electronic magazine dealing with such issues as Central America,
ecology and peace. On line 24 hours a day at (415) 824-8767, operating at 1200
or 300 baud, it may be accessed by most personal computers. There is no access
charge at present.
The Council on International Affairs is administering the Speiser Essay Contest,
the topic of which is: "How can we, without adopting socialism or giving up our
treasured freedoms, modify American capitalism to make it more equitable and
reduce the level of ideological conflict with the Soviet Union so as to make
possible an end to the nuclear nightmare?" Essays may be comments and
suggestions related to Stuart M. Speiser's book, How to End the Nuclear
Nightmare, or they may offer entirely different plans for bettering relations
with the Soviet Union. Deadline for the contest is December 31, 1985. The prize
for the winning essay is $10,000. For more information, contact:
Speiser Essay Contest
Council on International and Public Affairs
777 United Nations Plaza
New York, NY 10017
A conference entitled Strategic Computing: History, Politics, Epistemology will
be held March 2-3 at U.C. Santa Cruz. The conference is organized by the Silicon
Valley Research Group, Project on High Technology and Social Change. Along with
many others, Terry Winograd, Lucy Suchman, and Severo Ornstein of CPSR will
present papers. For further information, contact Paul Edwards at (408) 425-7454.
In our Summer 1984 Newsletter we mentioned a possible workshop on alternative
funding and directions for computer research. After extensive discussions we
concluded that, while such a workshop would be extremely valuable, the topic is
so broad that it requires closer definition and substantial background research.
We are not currently in a position to do that research and therefore have
temporarily suspended further work on the project.
On the other hand there has been a groundswell of interest in arranging a
conference on the problem of accidental nuclear war. The Nuclear Age Peace
Foundation in Santa Barbara, and members of the Computer Science Department at
Carleton University in Ottawa are both planning conferences on this topic. We
are exploring the possibility of a cooperative effort with both groups and will
meet with the Nuclear Age Peace Foundation in February.
Gary Chapman - CPSR Executive Director
Nuclear Winter
"Nuclear Winter: Global Consequences of Multiple Nuclear Explosions," by R.P.
Turco, Science, December 23, 1983.
"Nuclear War and Climatic Catastrophe," by Carl Sagan, Foreign Affairs, Winter 1983/84.
"The Climatic Effects of Nuclear War," by R.P. Turco, Owen Toon, Thomas P.
Ackerman, James B. Pollack and Carl Sagan, Scientific American, August 1984.
"Nuclear Winter and Nuclear Strategy," by Thomas Powers, The Atlantic Monthly, November 1984.
"Nuclear Winter" section by various authors, in Issues in Science and
Technology, Winter 1985.
Strategic Defense Initiative
Directed Energy Missile Defense in Space, A Background Paper, Office of
Technology Assessment, U.S. Congress, Government Printing Office, April 1984.
"Space-based Ballistic Missile Defense," by Hans Bethe, Richard L. Garwin, Kurt
Gottfried and Henry W. Kendall, Scientific American, October 1984.
Ballistic Missile Defense, by Ashton Carter and David Schwartz, eds., The
Brookings Institution, 1984.
The Reagan Strategic Defense Initiative: A Technical, Political and Arms Control
Assessment by Sidney Drell, Philip J. Farley, and David Holloway, Stanford
Center for International Security and Arms Control, 1984.
Ballistic Missile Defense session by various authors in Issues in Science and
Technology, Fall 1984.
Command, Control, Communications and Intelligence
The Command and Control of Nuclear Forces, by Paul Bracken, Yale University Press, 1983.
"How Nuclear War Could Start, " (review of Bracken's book), by Thomas Powers,
New York Review of Books, January 17, 1985.
Strategic Computing Initiative
Background Paper on Strategic Computing, by Robert Aldridge, Pacific Life
Research Center, 631 Kiely Blvd., Santa Clara, CA 95051
FAYETTEVILLE, N.C. (AP) - A city building was supposed to be closed and empty at
night and on weekends, but switchboard computer records showed that hundreds of
calls, some within seconds of each other, were being placed from two telephone
extensions. It wasn't a burglar. It was just two computerized Coke machines
trying to phone home. City officials couldn't figure out how anyone was getting
into the building after hours to place the hundreds of local calls, all to the
same number, according to the computer printout. City police were called in, and
last week found that the number being dialed was the local Coca-Cola Bottling
Co. and that led to the two Coke dispensers. Bob Johnson, who runs the service
department at Coca Cola Bottling, said the machines were outfitted with a
computer system that automatically called another computer at the company each
day, reporting how many bottles of soda had been sold. This allowed the
distributor to know when the machines needed to be refilled without making
unnecessary stops. From midnight until early in the morning the two machines
tried to report their inventory, but if they were answered by a busy signal they
would disconnect and call again - and again and again. Johnson said the system
was discontinued last week after it was discovered that the machines had "a
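The machines' redial behavior is a classic retry storm: a client that calls back the instant it hears a busy signal will rack up hundreds of attempts overnight. A minimal sketch (in Python, purely illustrative; nothing is known about the actual vending-machine firmware) contrasts fixed-interval redialing with exponential backoff:

```python
def dial_attempts(busy_seconds, retry_delay):
    """Count dial attempts while the far end stays busy,
    redialing after a fixed delay (in seconds)."""
    t, attempts = 0.0, 0
    while t < busy_seconds:
        attempts += 1              # place a call, hear a busy signal
        t += retry_delay           # wait a fixed interval, then redial
    return attempts

def dial_attempts_backoff(busy_seconds, base=1.0, cap=300.0):
    """Same situation, but the wait doubles after every busy
    signal (exponential backoff, capped at five minutes)."""
    t, attempts, delay = 0.0, 0, base
    while t < busy_seconds:
        attempts += 1
        t += delay
        delay = min(delay * 2, cap)  # back off, up to the cap
    return attempts

# One hour of busy line:
print(dial_attempts(3600, retry_delay=5))   # 720 calls
print(dial_attempts_backoff(3600))          # 20 calls
```

With many machines sharing one dial-in number, backing off also spreads the calls out, so fewer of them find the line busy in the first place.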