A Visit to NORAD's Cheyenne Mountain
Recently, I visited the North American Aerospace Defense Command (NORAD)
facility under Cheyenne Mountain near Colorado Springs. I had been invited to
join the tour three weeks before by the Arapahoe County (Colorado) Nuclear
Freeze Organization. Apparently, someone who had been planning to go had
cancelled. I accepted the invitation without hesitation because, though NORAD
conducts several tours a day, the tours are booked several months, in some cases
even years, in advance (though a Freeze member told me that peace organizations
seem to be given high priority).
Colorado Springs is a one-hour drive south of Denver, just east of the Rockies,
which rise up very quickly behind it. Cheyenne Mountain ascends out of the
prairie at the eastern edge of the Rockies several miles south of the city. On
the plain below the mountain is Fort Carson, a tank training range in addition
to being the residence of many of those who work inside the mountain.
As we drove up the two-lane road from the highway to a point several hundred
feet above the plain where the mountain begins in earnest, we passed a highway
warning sign that read: "Road may be impassable under certain conditions." It no
doubt referred simply to weather conditions, but the sign was thought-provoking.
Visitors park in a lot well below the actual facility, just in front of a guard
kiosk, and take a shuttle van up the last mile or so. The van unloads at the
Visitor Center. As we entered, we passed another interesting sign: "NORAD
welcomes Arapahoe County Nuclear Freeze...."
As we waited in the lobby for the rest of our group to arrive, our guide, a
jovial public relations officer, directed our attention to various exhibits: a
cutaway model of the mountain, a collection of core samples of the various kinds
of rock (mostly granite) that make it up, a display by General Electric
describing the new early-warning radar system that they are building for NORAD,
and others. When all of our group had arrived, we were herded into a small
auditorium for, our guide informed us, a briefing on the "Threat" (he used this
word repeatedly) against which NORAD is defending us, and also on the
organizational structure of NORAD and related organizations, NORAD's operations,
and the history of the mountain facility.
The speaker for the "Threat" briefing, a captain from Army Intelligence
("Intel," as they call it), was fairly serious in comparison to our guide. He
gave a polished slide presentation that described the Soviet Union's nuclear
capability: ballistic missiles, submarines, bombers, and cruise missiles, and,
in somewhat more ominous tones, their defensive capability: the anti-ballistic
missile system around Moscow, anti-satellite weapons, and their ground and
space-based laser program. The clear message of the presentation was that we
should support the SDI. He asked for questions, and got several, mostly comments
on various things he had said, disguised as questions. For example, I asked my
favorite SDI question, a thinly veiled suggestion that if SDI is purely
defensive and would, as the Pentagon claims, buy us time to make a considered
response to a perceived attack, then we should be glad, rather than fearful,
that the Soviets are developing one. His response was that ours is defensive,
but theirs is offensive.
Our guide gave the remainder of the briefing, which lasted one and a half hours,
followed by a two-and-a-half-hour tour. Before I recount what I recall from the
briefing and tour, I'll mention that the guide's overview covered some of the
same ground, and even contained some of the same slides, as the "Threat"
briefing. This suggests to me that most tour groups see only the overview, the
"Threat" briefing being reserved for "peace" groups.
The NORAD Facility
The North American Air Defense Command was formed in 1957 by a mutual defense
treaty between the U.S. and Canada. Canada supplies about 10-15% of NORAD's
staff; command of NORAD passes back and forth between Canadian and U.S.
generals. NORAD cuts across the military branches (Army, Navy, Air Force) of
both countries, in a matrix: it has operational control of a small proportion of
each branch's resources.
The initial NORAD command and control center was located at Ent Air Force Base
(now the Olympic Training Center) in Colorado Springs. The then head of NORAD
decided that a "hardened" facility was needed and, after evaluating various
sites for fault-stability, denseness of rock, etc., chose Cheyenne Mountain.
Construction began in 1961 and was completed in late 1965.
The mountain actually shelters two other "commands" besides NORAD. One is the
Missile Warning Center, which keeps track of all known satellites and "space
junk," including a wrench that an astronaut left in orbit years ago, so that if
and when any of it reenters the atmosphere, we (including the Soviets) won't be
surprised. The other agency is the U.S. Space Command. It was formed in the
early eighties because the U.S. military lacked a branch whose domain of
authority included outer space. It monitors Soviet military activity in space,
and, though this wasn't stressed, has authority for our own. Technically, the
Missile Warning Center is now part of the U.S. Space Command, but it predates it.
Several years ago, NORAD changed the "Air" in its name to "Aerospace," space-
traversing missiles having replaced bombers as the nuclear warhead delivery
vehicle of choice. When asked how NORAD and the U.S. Space Command are related,
our guide said that the latter organization is so new that details of its
relationship to NORAD are still being worked out. In any case, both the U.S.
Space Command and the Missile Warning Center pass their information on to NORAD,
which looks at it all in its role as North America's "watch guard."
Access to the underground facility is through a mile-long, U-shaped tunnel two
lanes wide and two stories tall that runs from the north to the south end of the
eastern face. The north entrance is the primary one; the other is used mostly
for fuel trucks and air intake. Before we boarded the bus that was to drive us
into the mountain, our (picture) IDs were checked, our social security numbers
compared against those that we had provided weeks before, we passed through a
metal detector and a barbed-wire fence, and we were issued visitors' badges.
According to our guide, vehicles that enter the mountain are checked thoroughly,
including using mirrors to look underneath and having dogs sniff for explosives.
We drove a third of a mile into the tunnel, whereupon the bus turned around and
let us out. We were met by a security guard who, for the remainder of the tour,
made sure that everyone in the group was between him and the guide. We then
followed the guide through a pair of massive "blast doors" into the complex.
The U-shaped access tunnel has no doors at its ends. The doors that would in an
emergency seal NORAD off from the outside world are inside the tunnel, a third
of the way in from the north entrance. The two doors are in line, about 50 feet
apart, each large enough to allow a semi trailer truck through. They are steel,
six to eight feet thick, set in door frames of similar construction. Our guide
said that they can be closed in 42 seconds. The hope is that the shock wave
resulting from a nuclear blast outside would roar through the tunnel without
destroying the doors.
The complex behind the blast doors is different from what most people probably
imagine. It is neither a miniature, futuristic city in a huge, hollowed-out
cavern, nor an anthill-like three-dimensional lattice of twisty tunnels and
chambers. It is simply six long tunnels, three east-west, three north-south,
crossing each other like a "tic-tac-toe" board. Inside the tunnels, not touching
the cavern walls (apparently, that is important), are fifteen rectangular
buildings made out of thick plate steel, mounted on springs made from three-inch
steel rods, and connected together loosely, like railroad cars. The buildings in
the east-west tunnels are three stories tall and house the information-
processing centers and control rooms that are NORAD's reason for existence. The
buildings in the north-south tunnels contain the life-support system for
subterranean life: storage areas for water and fuel, air filters, diesel-powered
generators (which can, if necessary, provide all of the complex's electricity),
as well as amenities for the people who work, and, in the case of an attack,
would live there: a 400-seat cafeteria, a store, bunks for 400, etc. The fifteen
buildings provide a total of 200,000 square feet of floor space.
Depending upon whom you ask, between 700 and 1,500 people work inside Cheyenne
Mountain; the number reduces to 400 "essential" personnel at night. The
"essential" staff includes a four-star general (or the Canadian equivalent) upon
whom the ultimate responsibility for digesting the information and deciding
whether to notify authorities rests.
As our guide led us through the narrow corridors, inter-building passageways,
and stairwells (I doubt that the NORAD facility is wheelchair-accessible), he
joked about the decor: standard military tan and pale green,
painted straight onto the plate-steel walls, with weld seams every six feet or
so. Someone asked how large pieces of equipment are moved in to and out of the
buildings; the guide pointed to trap doors in the walls, ceilings, and floors.
There are no windows in the exterior walls of the buildings. The security guard
behind us had never heard, however, of anyone having a claustrophobic attack
inside the mountain.
The Command Center
Our destination was the NORAD "command center," the famous room with the big-
screen displays of the world. Before we entered the command center, our guide
used a phone outside to notify those inside that a "nonclassified" tour group
was coming in, so that they could remove all classified information from the
displays. After a few minutes, we were cleared to enter.
The command center's visitor booth is a dark room 30 feet long and 15 feet wide.
A window the length of the booth affords a view down into the command center and
across at the big screens. Inside the booth are a dozen or so monitors showing
the same thing as one of the large displays: a Mercator map of the world with
satellite orbits and aircraft trajectories superimposed on it. One satellite
track was the Soviet Mir space station, where two Soviet cosmonauts were in
residence; another satellite track was the Soyuz orbiting station, then
unoccupied. The aircraft represented were several East bloc commercial flights
near North America, as well as a few U.S. military planes. Normally (i.e., when
nonclassified tourists are not present), the displays show more, our guide told us.
As our guide explained the operation of the command center, he used a light-ray
pointer to point down through the window. He also used a remote control to take
control of one of the large displays to show slides. The half-dozen people in
the room went on about their business, seemingly oblivious to the slide show,
wandering light beam, and stares above them.
In contrast to the cavernous, high-tech command centers shown in movies and TV
shows, the real thing is small and relatively low-tech. It is only a little
larger than, and about twice as tall as, the aforementioned visitor booth: about
30 feet long, 15 feet wide, and about 1.5 stories tall. The primary furniture
consists of four consoles sitting side by side, facing the large displays. Each
console contains one large, round, green-on-black alphanumeric (i.e., not
graphic) computer display about 16 inches in diameter, a small TV display (none
were on), an array of buttons, knobs, and dials, a couple of telephones, and a
rack of metal-covered manuals marked "top secret." Directly below the
observation window (and hence difficult to see) was a general sitting in a chair
and, to his side, an ordinary TV set that was on. At one end of the room is a
display that indicates the current classification level.
According to our guide, the people at the consoles (three men and a woman) were
officers, a requirement of the post because of the responsibility of the job.
One of them is in charge and must be at least a colonel. The general in the room
is there in case the commander of NORAD, who supposedly does not leave the
complex and always carries a beeper, cannot be reached when a decision must be
made regarding whether North America is under attack. Our guide said that, in
addition to the command center we were looking at, there is another one in a
truck that drives around outside the mountain to serve as backup.
Someone said that the computer equipment looked old. Our guide responded that
(1) the current system is NORAD's second since the facility was built, (2) they
are about to upgrade it, and (3) they don't like to upgrade it too often
because old equipment is at least reliable.
Our guide took great pains to dispel the notion that NORAD personnel can
initiate a nuclear response. "People who have seen the movie War Games are
sometimes disappointed that there is no big red button in the command center
with which we launch nuclear missiles," he said. "Sometimes, people come here
and want to see the missiles. We have no missiles, and we have no big red
button." The most NORAD can do on its own is scramble fighter and reconnaissance
planes to take a look at suspicious aircraft. Otherwise, their sole function is
to inform the military authorities in Washington and Ottawa of an attack; these
authorities in turn inform the President and Prime Minister, who then mobilize
the armed forces if necessary.
Living Inside a Mountain
Our tour group had been split into two groups of twenty because of the
observation room's small capacity. While my half of the group saw the command
center, the other half had toured the "support" facilities. Then, it was our
turn to view the facilities that make it possible to live inside a mountain. Our
guide remained in the command center to replay his spiel for the other half of
the group. We were handed over to an enlisted man in green fatigues who seemed
to have stepped out of M.A.S.H. or "Beetle Bailey": a natural comedian whose
manner and somewhat rumpled appearance were humorous enough even without the
jokes that peppered his narrative, not all of which were complimentary about
life in the mountain or the Army.
In the smaller, north-south, tunnels are the diesel generators, air pumps and
filters, food, fuel, and supplies. Theoretically, there is enough food, water,
and fuel to allow NORAD to function "sealed off" from the outside for about
thirty days. In case the diesel fuel runs out, batteries can provide enough
power to run "essential equipment" for a few days. The words "sealed off" are
quoted because the underground facility would never actually be completely
sealed: the one thing that in time of attack would still come from outside is
air. Massive air filters stand ready to be put into service should the need
arise. Exactly what they are meant to filter is classified, but it presumably
includes fallout and various chemical and biological agents. Of course, the air
filters are useful only if the intake vents are not clogged. Some critics of the
Cheyenne Mountain facility have claimed all along that an actual nuclear attack
would close the air intakes, drastically shortening the period of "sealed" operation.
The air supply may, however, no longer be relevant. Though designed to withstand
a sixties-era direct hit, the facility would probably not withstand a hit by one
or more of today's larger and more accurate warheads, according to our guide,
who had rejoined us for a final question and answer period. Of course, that is
precisely what it would get. So, NORAD is essentially back to the situation it
was in in the fifties, in which it had better do its job of warning of an attack
quickly, before it ceases to exist.
The warning system is, naturally, based upon the assumption that the primary
threat is from the Soviet Union, and that an attack would consist of ICBMs
launched over the North Pole carrying several warheads each, SLBMs launched
from submarines off the East and West coasts, and cruise missiles launched from
submarines and bombers. Accordingly, the warning system consists of a line of radar
installations in northern Canada, Alaska, and Northern Europe, infra-red
detectors in satellites over the northern hemisphere, radar and submarine-
detection installations along the coasts, and mobile detection equipment on
ships and planes. The radar systems are soon to be augmented with "ionospheric
backscatter" radar, a new technology that uses reflection from the ionosphere to
look over the horizon, achieving ranges two to three times that of conventional
radar. NORAD is especially concerned about cruise missiles because its
detection systems probably cannot find them.
Someone asked how NORAD distinguishes "false" signals from the real thing.
According to our guide, the primary mechanism for preventing false alarms is the
"dual phenomenology" that is provided by having both a radar system and
satellite-based infra-red detectors. A false stimulus is unlikely to show up on
both detection systems. He said that most potential false alarms are caught in
this way. In addition to "dual phenomenology," NORAD also averts false alarms
using the rather mundane method of simply calling a station that is sending
"interesting" data to see if the operators there are actually sending it.
Supposedly, the famous "bad computer chip" alarm (mentioned in the CPSR
videotape Reliability and Risk) was exposed as false in this way within a few
minutes. The guide stressed that no decisions are left to computers; the
computers are there only to support human decisionmaking. That so much
importance is placed on the role of human judgment seems at odds with NORAD's
clear support for the SDI.
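The "dual phenomenology" rule described above can be pictured as a simple set
intersection. The sketch below is purely illustrative (not NORAD's actual logic,
and the track names are invented): an alert is raised only for tracks that both
independent sensor types, radar and satellite infra-red, report.

```python
def dual_phenomenology_alert(radar_contacts, infrared_contacts):
    """Return only the tracks reported by both independent sensor types.

    A stimulus that fools one system (a flock of birds on radar, sun
    glint in the infra-red) is unlikely to show up on both, so
    single-source reports are held back as potential false alarms.
    """
    return sorted(set(radar_contacts) & set(infrared_contacts))

radar = {"track-3", "track-7"}      # "track-7" is a radar-only ghost
infrared = {"track-3"}
print(dual_phenomenology_alert(radar, infrared))  # ['track-3']
```

The radar-only ghost is filtered out; only the track confirmed by both systems
survives, which is the essence of the guide's claim that a false stimulus is
unlikely to appear in both detection systems at once.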
Before letting us go, our guide made one last pitch for SDI. He said that most
Americans wrongly believe that the U.S. has some sort of defensive capability;
perhaps they are thinking of the Nike and Sprint missiles of the sixties, which
were mothballed over a decade ago. "Ladies and gentlemen," he said, "If the
Soviets fire a missile at us, even just one, it will hit its target. All NORAD
can do is tell the President that it is coming." Someone pointed out that if
that is the case, then NORAD is misnamed: it shouldn't be called the "Defense
Command" because it doesn't defend against anything, but only warns. The guide
responded that the name is left over from when NORAD did actually defend against
bomber attacks, and indicated that they might again be able to defend if the
U.S. proceeds with the SDI. Though his remarks did not jibe with his earlier
statement that cruise missiles are NORAD's greatest fear and the fact that, due
to their low altitude, they would be immune to SDI even if we could find them,
nobody said anything because the tour was already thirty minutes overtime: it
was 2:30 and we had had no lunch. And to recast a remark of Bertolt Brecht's
slightly: grub first, then strategic policy.
The SDI's National Test Bed
CPSR NTB Study Group
March 23 of this year marked the fifth anniversary of the Strategic Defense
Initiative, which was announced to the nation by President Reagan on national
television in 1983. Today, the fundamental problems in building a trustworthy
space-based defense against ballistic missiles are no closer to being solved
than they were then. The Strategic Defense Initiative Organization (SDIO) has
claimed rapid progress in a number of key technologies, and some technologies
under development for strategic defense are progressing more rapidly than
expected. But the technology that SDIO Director Lieutenant General James
Abrahamson has called "the long pole in the tent" is still the principal and
ultimately unsolvable technological obstacle to ballistic missile defense; that
is, the problem of developing reliable computer software that will work the
first time it is used by the entire SDI system under realistic conditions.
There is a single overriding feature of the SDI that precludes any confidence in
the reliability of the system's software. The SDI is meant to operate in an
environment with which we have no previous experience, i.e., that of a nuclear
war. The performance specifications of the SDI cannot be validated with checks
against real data derived from past experience. The overall design of the SDI
will be largely the result of guesswork. As a simple example, we do not know the
response of a computer controlled communications network whose nodes and links
have been partially damaged by a nuclear attack, and we can't perform a test to
find out. We have to guess at the data for such an event, and there is a large
chance that we will guess incorrectly.
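The damaged-network example can itself be "simulated," but only by guessing.
The following sketch (all parameters invented: a 12-node ring topology and an
assumed per-node survival probability) shows how completely the answer depends
on that guess.

```python
import random

def ring_connected(up):
    """Survivors of a ring network stay mutually connected only if the
    failed nodes form a single contiguous stretch of the ring."""
    if all(up):
        return True
    n = len(up)
    # Count separate stretches of failed nodes around the ring.
    failure_runs = sum(1 for i in range(n) if not up[i] and up[i - 1])
    return failure_runs <= 1

def estimate_connectivity(n_nodes, p_survive, trials, seed=0):
    """Monte Carlo estimate of the chance the ring stays connected when
    each node independently survives with probability p_survive."""
    rng = random.Random(seed)
    ok = sum(
        ring_connected([rng.random() < p_survive for _ in range(n_nodes)])
        for _ in range(trials)
    )
    return ok / trials

for p in (0.99, 0.90, 0.70):
    print(f"assumed survival p={p:.2f}: connected in "
          f"{estimate_connectivity(12, p, 5000):.0%} of trials")
```

The model runs fine for any assumed survival probability, and its output swings
dramatically as the assumption changes; nothing in the simulation itself can
tell us which assumption resembles an actual nuclear attack.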
Furthermore, the SDI cannot be tested in its conditions of use, nor even in an
environment accurately representing its conditions of use. We cannot have a
global nuclear war with thousands of warheads heading to targets in order to
"test" the SDI. And we cannot realistically simulate the conditions of such a
conflict because we do not know enough about what will happen in a nuclear war,
and because we are not privy to enough detailed information about what our
opponents have developed or will develop to make sure their weapons are
effective. Too many relevant variables of a nuclear war are unknown and will
remain unknown. Moreover, it is certain that many of the most critical variables
will continually change in ways that are difficult or impossible to detect.
Precisely because the SDI is meant to disrupt and foil the plans of an adversary,
the adversary will do everything possible to make sure that there is a way around
the capabilities of the system. As cadets at West Point are taught very early in
their military education, "no battle plan survives contact with the enemy."
On the rare occasions when press attention has focused on the problem of SDI
software, there has been a preoccupation with errors or "bugs" in the software
designed for the command and control of the system. By this time, all computer
scientists agree that the software for the SDI will have "bugs." The Eastport
Group, the panel of computer scientists assembled by the SDIO to evaluate the
computer requirements of the SDI, reported, "Simply because of its inevitable
large size, the software capable of performing the battle management task for
strategic defense will contain errors."
There are at least two stages of developing software, and there are different
sorts of errors found in each. Errors can be made early in the "design" phase as
well as in the stage where instructions are actually being coded. The press has
tended to focus on the latter type of errors, often giving the impression that a
"bug" would merely cause a small portion of the system to fail. However,
experience has shown that even a single, simple coding error can result in
complete failure of a complex system.
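A toy example may make this concrete. In the hypothetical message dispatcher
below (invented for illustration, not drawn from any real SDI code), the error
surface is a single line: one message kind that was never registered aborts the
entire batch, not just the handling of that one message.

```python
def dispatch_all(messages, handlers):
    """Route each (kind, payload) message to its registered handler."""
    results = []
    for kind, payload in messages:
        # If 'kind' was never registered, this one line raises KeyError
        # and the whole batch is abandoned, including messages that
        # would have been handled correctly.
        results.append(handlers[kind](payload))
    return results

handlers = {"status": lambda p: f"status:{p}"}
messages = [("status", "ok"), ("telemetry", 42), ("status", "ok")]
try:
    dispatch_all(messages, handlers)
except KeyError as exc:
    print("entire batch lost to one unhandled message kind:", exc)
```

The failure is total even though the "bug" is a single missing dictionary
entry, which is the pattern experience with complex systems has repeatedly
shown.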
This focus of the press tends to obscure the additional problem of design error,
in which software specifications are written incorrectly. Even perfect coding
will not fix a design error. The SDIO has reinforced this skewed view of the
software issue by suggesting that it is capable of properly designing the
system, even though the "debugging" process may be challenging. But the SDIO
has offered no credible method for writing software specifications when no one
knows what will be required for the SDI to protect us from a nuclear attack.
The National Test Bed
One of the largest single projects in the entire SDI budget is a vast network of
computer facilities and simulation centers. In January the Martin Marietta
Corporation was awarded the first half of an eventual $1 billion to begin
development of the National Test Bed (NTB) in Colorado Springs, Colorado.
The purpose of the National Test Bed is to use state-of-the-art computer
simulations to develop the design of the SDI and begin work on the command and
control software for ballistic missile defense. The NTB, if it is ever
completed, will be the world's largest simulation network, spread across the
entire United States and eventually overseas as well. It is intended to model
and simulate the environment in which the deployed SDI is meant to work, the
kinds of threats the SDI is expected to counter, and the weapons and overall
system design that will be most effective for ballistic missile defense. The
stated purpose of the NTB is to evaluate whether or not space-based ballistic
missile defense is feasible, and, if so, how it should be configured.
The Test Bed will initially involve eight remote facility sites, using the
computing resources of the National Test Facility (NTF) at Falcon Air Force
Station in Colorado Springs, Colorado. The Test Bed remote sites that have
already been selected are the SDIO's Pentagon offices; the U.S. Army Advanced
Research Center in Huntsville, Alabama; the Naval Research Laboratory in
Washington, D.C.; the Air Force Space Division in Los Angeles; the Air Force
Space Technology Center in Albuquerque, New Mexico; the Air Force Electronic
Systems Division at Hanscom Air Force Base in Massachusetts; Los Alamos
National Laboratory in New Mexico; and Lawrence Livermore National Laboratory in
California. Planned sites include the military missile ranges at White Sands,
New Mexico, and Kwajalein Atoll in the South Pacific, and major defense
contractor facilities where work is being done on the SDI.
The Martin Marietta Corporation was awarded a $508 million contract to begin
development of the National Test Bed along specifications designed by the Air
Force Electronic Systems Division and the MITRE Corporation. The Congress has
appropriated $35 million in the fiscal year 1988 budget for construction of the
National Test Facility, a center that will cost $100 million to complete, and
which will eventually employ about 2,300 in 360,000 square feet of office space.
The National Test Facility will use a Cray 2, a Cray XMP/48, two IBM 3090
mainframes, and several parallel machines, including an experimental IBM RP3
being developed by IBM and the Defense Advanced Research Projects Agency. Sun
and Digital Equipment Corporation workstations will be used.
The Request for Proposals (RFP) for the National Test Bed reports that "the
National Test Bed Program is a phased hardware, software and facility program
whose purpose is to provide a comprehensive capability to test, evaluate, and
compare alternative architectures for strategic defense against ballistic
missiles and Battle Management and Command, Control and Communications (BM/C3)
architectures.... The NTB will interconnect Army, Navy, Air Force, Department of
Energy and other national test and demonstration facilities, including existing
ranges, laboratories, and contractor facilities, into a single distributed SDI
resource for direction and control of the many SDI simulations, demonstrations
and experiments." The National Test Facility, the RFP notes, will be the central
hub of the NTB, and will eventually operate as a "Government Owned, Contractor
Operated" facility, similar to Los Alamos and Lawrence Livermore Laboratories.
The core program of the National Test Bed and Facility will be, according to the RFP:
• Rapid prototyping of hardware and software capabilities
• Software development
• Simulation framework development
• Simulation and experiment integration
• Simulation and experiment support
• Studies and analysis support
Overall, the Strategic Defense Initiative Organization told Congress in its
report on the NTB, "As an integrated set of resources, the NTB will be a single
national resource dedicated to the SDI for addressing the many critical issues
necessary to support an informed decision on future development and deployment
of strategic defense against ballistic missiles." The SDIO has marked 1992 as
the year in which a decision about deployment of a Phase 1 version of
space-based ballistic missile defense can be made.
The Value and Limitations of Computer Simulation
The claim that the National Test Bed will tell us whether or not the SDI will work
is based on a fundamental misunderstanding of the value of computer simulation.
Computer simulations are often very useful and powerful tools in the analysis of
proposed systems. Simulations, for example, can eliminate many unworkable
designs of a planned system before such designs are developed in prototypes, and
this saves a lot of time and money. Simulations can be used as a way of
evaluating "what if" scenarios in a fashion that is much cheaper and usually
faster than testing after a system is actually built and put into use.
Simulations can provide "close enough" experiments for situations that simply
cannot be reproduced in real testing or are too dangerous for real testing, such
as the performance of an aircraft design in extreme weather conditions.
Simulations can also be useful in training people; the value of aircraft
simulators is significant when pilots are being trained to fly very complex
planes. In general, simulations are useful in the analysis of very complex
systems because they provide a tool with which to break down the performance of
a hypothetical system into manageable parts and time frames.
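A minimal "what if" simulation in the spirit described above might look like
the following sketch. Every number here is invented for illustration: we assume
each interceptor independently destroys its target with probability `p_kill`,
and ask how many of 1,000 incoming warheads leak through under different
assumptions.

```python
import random

def leakers(n_warheads, p_kill, seed=0):
    """Count warheads that get through, assuming each interceptor
    independently destroys its target with probability p_kill."""
    rng = random.Random(seed)
    return sum(1 for _ in range(n_warheads) if rng.random() >= p_kill)

for p in (0.80, 0.90, 0.99):
    print(f"assumed p_kill={p:.2f}: "
          f"{leakers(1000, p)} of 1000 warheads leak through")
```

Varying one assumed parameter across a few runs is vastly cheaper than building
anything, which is exactly the legitimate value of simulation; the catch, taken
up below, is that nothing in the model tells us which value of `p_kill` is
real.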
The limitations of simulations are also well understood. The most basic of
these is that a simulation is really only as good as the assumptions used to
build the simulation model. The more accurately these assumptions reflect the
real world, the more accurate and useful the simulation will be. This of course
presumes that the real-world situation is in fact known, understood, and
manageable. "Simulations" of situations for which there are no accurate,
real-world data are not really simulations at all; they are instead hypothetical
presentations that may resemble some chain of events in the real world. But they
accurately represent only the fictional world created by the simulation
designers. There is a danger in computer simulation that assumptions used to
build a model may be close enough to produce what appears to be a plausible
simulation, when in fact the assumptions are just inaccurate enough to produce a
simulation that distorts what will or does happen in the real world. Developers
of simulations validate their models by comparing their predictions with what
happens in the real world.
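The validation step just described is mechanical once real observations exist.
In this sketch (the model, measurements, and tolerance are all invented), a
model is accepted only if its predictions match the measured data to within a
tolerance; absent any measurements, the step cannot be performed at all.

```python
def validate(model, observations, tolerance):
    """A model is trusted only if its predictions match real-world
    measurements to within the given tolerance."""
    errors = [abs(model(x) - y) for x, y in observations]
    return max(errors) <= tolerance

# A simple model can be checked against (invented) measurements...
observed = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]
double = lambda x: 2 * x
print(validate(double, observed, tolerance=0.3))   # True
print(validate(double, observed, tolerance=0.05))  # False

# ...but for a global nuclear war there is no 'observed' list to check
# against, so the validation step simply cannot be carried out.
```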
While simulations may be useful in eliminating unworkable models before those
models are actually built, typically, computer simulations cannot be used to
find a design that will be fully operational without further testing. This is
especially true of very complex systems. Today engineers do countless, very
expensive simulations of complex weapons and avionics systems, but then still
discover problems when the weapons or aircraft are built and used. The B-1
bomber, for example, the world's most expensive aircraft, has been the subject
of endless computer simulations, which presumably have demonstrated that the B-1
is capable of performing its principal mission of penetrating Soviet air
defenses and delivering nuclear bombs. Real testing of the B-1, on the other
hand, has revealed that it cannot survive a collision with a pelican, an event
no doubt excluded from its simulation models.
Even Pentagon officials have claimed that computer simulation is inadequate for
evaluating the reliability and effectiveness of nuclear warheads. Senior
officials have consistently argued that only real, underground nuclear warhead
testing is sufficient to assess the "quality" of the country's nuclear arsenal.
This is the principal argument of the Reagan administration against
participation in a nuclear testing moratorium or a comprehensive test ban
treaty. Ed Badolato, Deputy Assistant Secretary of Energy for Security Affairs,
speaking against a comprehensive nuclear test ban treaty, said, "No other piece
of military equipment is ever allowed into the field without extensive testing.
Tanks. Airplanes. Even boots. We've all read about the horror stories when a
conventional weapon gets into the inventory with inadequate testing and we've
all heard enough about weapons that don't work right after they're deployed. But
a nuclear warhead is the most complex weapon we've got. We have to test them as
they'll be used, before and after deployment.... Would you fly in an airplane
that had only been tested by a computer simulation?"
The technology of nuclear warheads has been well understood and accurately
simulated for many years. If testing is still needed to assess system
reliability, it will surely be even more important for designing, testing, and
evaluating a space-based strategic defense, which is poorly understood (and
which has little prospect for being well understood) and will be vastly more
complex and demanding than a nuclear warhead. However, complete "end-to-end"
testing of a strategic defense system in an environment accurately representing
its conditions of use cannot be performed.
Computer Simulation of Ballistic Missile Defense
Accurate computer simulation of a nuclear attack and a defensive response
adequate for designing such a system will never be possible, for a variety of reasons.
First, as noted above, there are too many significant variables in such an event
which are unknown and which will remain unknown. Without any experience with a
global nuclear war, any simulation of nuclear war will be a very expensive
guess, and therefore inherently uncertain. There will be no way to validate the
models that will be used by National Test Bed simulation computers because it
will be obvious that many of the crucial variables in the models are guesses.
This is not to say that National Test Bed researchers cannot build a model of
nuclear war. They can and will build dozens, all using different assumptions and
producing different results. There is every reason to suspect that the
researchers will be able to build a model in which every simulated Soviet
missile is destroyed by the simulated U.S. defense system before the missile
reaches its simulated target. What is impossible, however, is to verify that
this model accurately represents what will happen in a real nuclear attack. The
danger of this approach lies in the authority which can be attached to a
visually dazzling and systematically convincing simulation, regardless of
whether the simulation reflects what will actually occur at some future date.
The National Test Bed includes "demonstration sites," at which such simulations
can be displayed to demonstrate progress in the program. It may be difficult to
separate the use of simulations for public relations purposes from their use for
appraisal of the feasibility of the SDI.
Computer simulations using hypothetical data for the models that show the
alleged feasibility of the SDI may also have an unwarranted effect on the
character of component research and development. Small changes in the simulation
assumptions could mean the difference between emphasis on one type of defensive
weapon versus another. We have already seen a dispute develop over the utility
of kinetic energy weapons in an early deployment system because of conflicting
analyses of a computer model used by the pro-SDI Marshall Institute and
scientists at Lawrence Livermore National Laboratory. This illustrates how the
decision to spend hundreds of millions of dollars in research and development might
hinge on the product of a collection of assumptions in a computer model. To a
large extent, those who control the model will control the outcome.
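The point about control of the assumptions can be illustrated with a deliberately simple, hypothetical model. Nothing below corresponds to any real system or analysis; it only shows how a small shift in one assumed parameter reverses a simulation's verdict:

```python
# Hypothetical illustration: two analysts model the same layered defense
# but assume slightly different single-shot kill probabilities. The small
# difference in assumptions flips the simulated conclusion.

def fraction_intercepted(p_kill, shots_per_missile):
    """Chance a missile is stopped, assuming independent intercept
    attempts that each succeed with probability p_kill."""
    return 1 - (1 - p_kill) ** shots_per_missile

LEAKAGE_LIMIT = 0.15  # invented threshold for an "acceptable" defense

for p_kill in (0.50, 0.40):  # the two analysts' differing assumptions
    leakage = 1 - fraction_intercepted(p_kill, shots_per_missile=3)
    verdict = "acceptable" if leakage < LEAKAGE_LIMIT else "unacceptable"
    print(f"assumed p_kill={p_kill}: leakage={leakage:.3f} -> {verdict}")
```

With an assumed kill probability of 0.50 the leakage is 0.125 and the architecture looks "acceptable"; at 0.40 the leakage is 0.216 and the same architecture fails. Whoever sets the assumed parameter sets the conclusion.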
It is also likely that many of the assumptions used in the computer models of
nuclear war used by the National Test Bed will be highly classified and
available for analysis only to a handful of scientists, most of them already
predisposed to favor SDI goals. A "conventional wisdom" about how best to model
and design ballistic missile defense is likely to arise among a small, very
carefully screened and insulated group of scientists and researchers. There will
be no way to assess how accurately this "conventional wisdom" corresponds to actual
Soviet capabilities or to the chaos of nuclear war.
Finally, the issue of computer security will be critically important if the NTB
is actually built, while we are now discovering how insecure computer systems
really are. Methods for sabotaging or otherwise altering computer data are
developing faster than we can invent protection against them. Officials of the
National Test Facility are already paying attention to facility security, but
probably from the wrong threat. The New York Times reported last August:
At the main gate, armed soldiers watch over a maze of 20 soon-to-be-computerized
security booths . . . these will automatically weigh a person, scan a magnetic
identification card, photograph the person's retina and compare the results with
computer files as an identity check. If the test is passed, the portal door will
swing open to allow entry to the top-secret station.
The RFP for the National Test Bed requires the contractors to prepare "for
disruptions caused by demonstrations," as well as potential breaches of security
by foreign agents, terrorists, or saboteurs. But the security of such a widely
distributed system dependent on long communication links and using "off the
shelf" technology and software, as is planned, is certainly open to question.
Within the past few years we have seen computer "crackers" develop methods for
accessing even the most secret and allegedly secure computer systems. There is
also the potential for NTB software to be seeded with "Trojan horses," "trap
doors," and computer "viruses," most likely by people who have access to
software code before it is introduced into the NTB system. We can assume that
Soviet concern about the Strategic Defense Initiative will remain high in the
future, and consequently Soviet interest in disrupting or sabotaging the work of
the NTB could become a high priority. Unfortunately, as recent cases have
demonstrated, the Soviets are quite capable of enlisting the aid of American
citizens, even those with very high security clearances.
The National Test Bed is being built to answer the very important question,
"Will the SDI work?" But the Test Bed cannot answer this question. The models of
nuclear war that it will use to assess the feasibility and design of the SDI
cannot be validated. The software used for NTB simulations will be suspect
because of the potential for unanticipated errors, concealed computer "viruses,"
the large potential for security breaches in a nationally distributed
communications network, and the software problems endemic to any complex
program. Any "answer" provided by the NTB will be a very expensive guess, a guess
that will bear the seductive authority of scientific analysis and highly
sophisticated computer simulation.
The software problems of the Strategic Defense Initiative are intractable. This
is not a failure of technology, nor is it something that can be remedied by
technological progress. It is simply a result of the human inability to
accurately predict the future with the level of detail required by computers. If
the simulation researchers of the National Test Bed are honest, they will assert
that a reliable, trustworthy defense against ballistic missiles cannot be built.
At that point we should kick ourselves for spending a billion dollars to find
out something we already know.
Participants in the CPSR NTB Study Team have included David Parnas, Clifford
Johnson, Art Goldberg, James Horning, and Gary Chapman. Copies of the full CPSR
report on the National Test Bed are available from the CPSR National Office for
$3.00 to cover copying, postage and handling.
The NTB, the SIOP, and Arms Control
Gary Chapman and Clifford Johnson
In mid-February the Strategic Defense Initiative Organization announced a
command structure for the proposed ballistic missile defense system. The
announced command structure creates an "SDI commander-in-chief," based at the
Strategic Defense Systems (SDS) Command Center. As reported by Aviation Week and
Space Technology magazine on February 15, "Strategic defense weapons would be
linked with strategic offensive forces, warning and intelligence agencies
through the SDI operational structure. It would allow military commanders to
select various strategic defense options to support offensive force actions
taken during a crisis."
The simple and naive view of the National Test Bed mission is that it will
attempt to describe the response of a U.S. defensive system against a massive
Soviet first-strike attack. In fact, as the above passage reveals, the ballistic
missile defense system of the United States will be considered part of the U.S.
capability for "prevailing" in a nuclear war. The options of using offensive
strategic nuclear missiles are codified by Pentagon warplanners into the Single
Integrated Operational Plan, or SIOP. It is clear that the NTB's modelling of
the SDI's capabilities will be integrated with the various options for offensive
action spelled out in the SIOP. Instead of exclusively modelling a response to a
Soviet attack, the NTB will model two-sided exchanges of strategic nuclear
missiles, including the role of the SDI in the SIOP options of "launch-on-
warning" and preemption.
The RFP for the National Test Bed asserts that there is no requirement for
survivability of the Test Bed because the NTB and NTF are "non-operational,
peacetime support facilities based largely on commercial off-the-shelf hardware
operated within a standard commercial system data processing environment." It is
certainly possible, and perhaps even likely, however, that the NTB itself will
evolve into an early prototype for the distributed command and control of the
SDI, particularly since regional, distributed command and control is what the
NTB is supposed to both model and test. The Aviation Week and Space Technology
article says that for the future SDS command and control, "Space-based elements
. . . would be operated from the Consolidated Space Operations Center in Colorado
Springs, Colorado," which is the building where the National Test Facility is now
located, prior to the construction of a separate building. The AW&ST article
continues:
Ground-based components, such as the Exoatmospheric Reentry Interceptor Subsystem
(ERIS), would be controlled from Regional Operations Centers directed from the
SDI Operations Center. The regional centers would double as backup command
centers that can be activated "under attrition conditions."
It is precisely these sorts of activities and conditions that are to form the
basis of the NTB's experiments. The clear corollary to these features and trends
within the SDI program is that the NTB is not principally a research program,
the way everything in the SDI is billed by the President and the Pentagon, but
is instead a massive development program, one that could serve as the prototype
of an immense, national, command and control system for integrated defensive and
offensive "battle management." It is easy to grasp the significance of this
project if one simply imagines the reaction of the Pentagon and the White House
to the same kind of program in the Soviet Union.
The NTB and Arms Control
The U.S. Congress has firmly demonstrated its commitment to the 1972 Anti-
Ballistic Missile Treaty (the ABM Treaty) through recent votes cutting off
funding for the testing and development of technologies that would violate the
traditional, so-called "narrow," interpretation of the treaty. There has been
insufficient attention in Congress to the likely effect of the National Test Bed
on the ABM Treaty. The NTB even has strong supporters among congressional
leaders who are publicly committed to the ABM Treaty, apparently because the NTB
is seen as a way of evaluating the SDI without component testing that would
threaten a U.S. breakout from the constraints of the treaty.
It is possible that computer simulation will be of some value in sorting out
options for strategic defense, just as computer simulations are useful in the
development of most complex systems. The value of computer simulation for the
Strategic Defense Initiative is that it will help eliminate many unworkable
architectures. Computer simulation will not, however, tell us whether or not
strategic defense can work, for the reasons stated in the accompanying article.
The great expanse of unknown data that will need to be turned into known data in
order to make a simulation model even remotely accurate will in fact set the
stage for increased demands for computer testing to provide more specific data,
including testing that will violate the ABM Treaty. In effect, the NTB managers
will become a built-in constituency promoting more and more testing of SDI
subsystems, a constant chafe on the ABM Treaty. They will undoubtedly argue that
the simulations performed by the NTB and the assessments of SDI design will be
improved significantly by the availability of component testing data, and the
more accurately the components represent a real defensive system the more
accurate the simulations will be (no doubt with a predictable avoidance of the
concession that no computer simulation of global nuclear war can ever be
adequately accurate). As the simulations become more and more visually
promising, there will be increased pressure for abandonment of the ABM Treaty,
abetted by the political pressure that has carried the program along since its
inception. This will be reinforced by the expense of the NTB, both for its
development and its continued operation, because NTB managers will understand
that the Congress will have difficulty explaining to the public why it has spent
a billion dollars or more on a project that has a built-in ceiling on its
capabilities in the legal constraints of the ABM Treaty.
The Strategic Defense Initiative is already based on an assumption that the
United States might someday abandon the rationale that led to the ABM Treaty. As
one expert panel concluded, the U.S. is even now in a state known in legal
circles as "threatened breach of contract." Some critics of the SDI argue that
the NTB is a "non-lethal" method of evaluating ballistic missile defense that
will help protect the ABM Treaty. But for the reasons stated above, NTB
personnel could become well-integrated members of a powerful constituency that
will constantly pressure policymakers to abandon the treaty.
Arms and Artificial Intelligence
Chris Hables Gray, CPSR/Santa Cruz
Allan M. Din, ed., Arms and Artificial Intelligence: Weapons and Arms Control
Applications of Advanced Computing. SIPRI and Oxford University Press, 1987. 229 pages.
This recent book is the result of a workshop on "Arms and Artificial
Intelligence," organized by the Stockholm International Peace Research Institute
(SIPRI) in 1986, and it comprises twelve chapters drawn from papers presented at
the workshop. First are four solid introductory chapters on
artificial intelligence (AI) in general and on its military application. These
are followed by several excellent chapters on specific issues regarding the
military use of AI. The book ends with some interesting speculation on possible
uses of AI research to improve the arms control process and to increase our
understanding of conflict.
The perfect reader for this volume is likely to be someone quite interested in
AI, weapons, and arms control, but who knows little about the subjects. But even
for those who find the introductory chapters unnecessary, this is still a
valuable book, particularly because of the useful middle section entitled
"Military and Strategic Implications of AI."
The section begins with a good overview of the Strategic Computing Program by
Ingvar Akerston. This is followed by an exceptionally valuable contribution by
Randolph Nikutta, from West Germany, who traces in consummate detail the
implications of the increasing role of AI systems in NATO doctrines and near-
term plans for the European battlefield, developments often referred to as
AirLand Battle doctrine or "follow-on forces attack" (FOFA). His chapter, "Artificial
Intelligence and the Automated Tactical Battlefield," offers convincing proof
that not only are AI applications a crucial part of NATO plans to fight a major
war, but that their overall impact includes the lowering of the nuclear
threshold and decreased crisis stability in general.
Nikutta's fine work is followed by Herbert Lin's chapter, "Software and Systems
Issues in Strategic Defense," a straightforward look at most of the basic
software problems of ballistic missile defense and of the SDI in particular. Lin
discusses reliability and failure modes, describing a number of interesting
scenarios and contradictions in the SDI program. He also criticizes the use of
expert systems for tasks for which there are no experts (such as shooting down
ballistic missiles), and rejects the official, optimistic assessment that
"automatic programming" will make it possible to meet the software requirements
of the SDI.
One of the better contributions is by three scholars from the Soviet Union's
Institute of U.S. and Canada Studies in Moscow. "Artificial Intelligence and
Disarmament," by Gennady B. Kochetkov, Vladimir P. Averchev, and Viktor M.
Sergeev, points out that technology can often determine military doctrines, a
point also made by Nikutta. Kochetkov and his fellow authors go further by
warning of a possible "new round of the arms race based in large part on the
military application of new information technology." This is certainly a real
possibility and it has been raised at many different levels within the U.S.
defense establishment, most noticeably by the recent report of the President's
Commission on Long-Term Integrated Strategy, a group which included Henry
Kissinger, Zbigniew Brzezinski, Samuel Huntington, Fred Ikle, Albert
Wohlstetter, among others.1
Another important insight is that the new information technology has the
potential of greatly increasing the lethality of the battlefield. "Thus a
situation is created in which increasing the accuracy of conventional weapons
becomes equivalent to increasing their explosive power up to the level of that
of nuclear weapons." These two points of the Soviet contributors emphasize the
fact that for substantial arms control, it is important to go beyond reducing
the levels of nuclear weapons and prevent the development of conventional
weapons which are as destructive as nuclear warheads.
While the Soviet authors feel the banning of military AI applications is not
currently feasible, they advocate management of the "diffusion of computing
technology into every sphere of society," and also "efforts to elaborate
confidence building measures" as pursued recently by both NATO and the Warsaw Pact.
The rest of the book looks at the possible uses of AI to improve the processes
of arms control negotiations. This section includes a wildly divergent group of
chapters.
First is the chapter "Computer Applications in Monitoring and Verification
Technologies" by Torleiv Orhaug of Sweden. A relatively technical report on the
state-of-the-art in image processing and related AI verification techniques, the
chapter concludes that, while AI is not yet adequate for significant
implementation, certain applications will become important and even necessary
for "efficient image interpretation" on the scale required for new levels of
arms control.
Next in this section is "Knowledge-based Simulation for Studying Issues of
Nuclear Strategy," by Paul K. Davis, director of the Rand Strategy Assessment
System (RSAS) of the RAND Corporation (see The CPSR Newsletter, Spring 1987, for
a detailed description of RSAS --ed.).2 RSAS is the latest version of
sophisticated Pentagon war games used to plan and analyze strategic nuclear war.
Davis makes an interesting but unconvincing presentation of his claim that RSAS
will lead to a number of breakthroughs in strategic military thinking because of
its use of AI-based "agents" which can play one or both sides in politico-
military war games.
(It is revealing to compare the picture of RSAS that Davis paints to the one
portrayed in Thomas B. Allen's recent book War Games.3 I would definitely not
read one without the other. [See The CPSR Newsletter, Fall 1987, p. 16, for a
review of the Allen book --ed.])
The last two articles have more limited claims. "Verification and Stability: A
Game-Theoretic Analysis," by Steven J. Brams and D. Marc Kilgour, proposes a
"verification game" that roughly calculates the levels of technical efficiency
necessary to make various negotiators follow a strategy that is "both more
compliant and more willing to develop and allow the use of better detection
methods." Their interesting conclusion is that improvements in technical
verification are necessary because of the improved mobility of new weapons such
as cruise missiles and mobile land-based ICBMs.
"ARMCO-1: An Expert System for Nuclear Arms Control," by Allan Din, is the
editor's own contribution. He describes an attempt to set up an expert system
for nuclear arms control with the aim of revealing something about basic
problems and ideas of arms control negotiations. At its core it contains an
evaluation function that would assign a value to the weapons of the U.S. and the
Soviet Union. The difficulty of this task becomes apparent when one looks at the
history of the "Lanchester Laws of Combat," developed by one of the founders of
operations research, to perform similar calculations on conventional forces.
Despite years of research, numerous formulations, and numberless studies and
arguments, these "laws" are now more controversial than ever.4 Din's work
certainly may shed new light on the problems of nuclear arms reduction, even if
only to show how necessary much broader levels of analysis have become, but more
attention should be spent on criticisms of expert systems in which there is
little expertise, and of other programming difficulties in such open-ended
domains as war and peacemaking.
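For readers unfamiliar with the Lanchester laws cited in note 4: the "square law" models attrition between two forces as the coupled equations dA/dt = -b*B and dB/dt = -a*A, which imply that numerical concentration matters more than per-unit effectiveness. A rough numerical sketch, with wholly invented figures:

```python
# Sketch of Lanchester's "square law" of combat attrition: each side's
# loss rate is proportional to the size of the opposing force.
# All coefficients and force sizes here are invented for illustration.

def lanchester_square(a0, b0, a_effect, b_effect, dt=0.01, max_steps=100_000):
    """Euler integration of dA/dt = -b_effect*B, dB/dt = -a_effect*A.
    Returns (winner, survivors) once one side is annihilated."""
    A, B = float(a0), float(b0)
    for _ in range(max_steps):
        if A <= 0 or B <= 0:
            break
        # update both forces simultaneously
        A, B = A - b_effect * B * dt, B - a_effect * A * dt
    return ("A", A) if A > B else ("B", B)

# The square law's signature result: side A, with twice the numbers but
# half the per-unit effectiveness, still wins, because the outcome is
# governed by a_effect*A0^2 versus b_effect*B0^2.
winner, survivors = lanchester_square(1000, 500, a_effect=0.01, b_effect=0.02)
print(winner)  # → A
```

The long-running dispute over such formulations, noted in the review above, is partly over whether real combat data fit either of Lanchester's laws at all.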
Din certainly deserves congratulations for a very useful book that includes some
quite extraordinary work and nothing uninteresting. The book is also valuable
because of its very sharp attention to some very important details. Printing the
addresses of the authors is an especially welcome idea, and there are a number
of helpful charts, an adequate index, and copious references.
The Soviet authors in this volume end their piece by saying, "We consider it to
be a major responsibility of the scientific community to clarify the possible
impact of advanced artificial-intelligence systems on weapons capabilities and
on the world military-political situation." I can only agree and add that this
book is a real contribution to that task.
1. Discriminate Deterrence: Report of the President's Commission on Long-Term
Integrated Strategy, Washington, D.C.: U.S. Government Printing Office, January,
1988. See also Jon Stewart, "New Push for Useable Nuclear Weapons: Long-term
Strategy Report Focuses on 'Smart' Weapons and De-emphasizes NATO," San
Francisco Chronicle, February 17, 1988, p. B1.
2. RSAS is a new designation for the Rand Strategy Assessment System, applied
after the publication deadline for the Din book. Therefore, RSAS is known as
RSAC (the Rand Strategy Assessment Center) in references before 1988.
3. Thomas B. Allen, War Games, McGraw-Hill, 1987. See particularly Chapter 18,
"Across the Threshold: RSAC Goes to War."
4. John W. R. Lepingwell, "The Laws of Combat? Lanchester Reexamined,"
International Security, Summer 1987, Vol.12, No. 1, pp. 89-139.
Study Group on Computers in the Workplace
CALL FOR PARTICIPATION
Computer Professionals for Social Responsibility (CPSR) is an alliance of
computer professionals concerned about the impact of computer technology on
society. Decisions regarding the use of this technology have far-reaching
consequences and reflect basic values and priorities.
--from the CPSR General
In the past, much of CPSR's attention has been concentrated on the military uses
of computing and the dangers of overreliance on technology in critical systems.
But our concern for social responsibility in the use of computer technology must
not end there. In recent years, computers have become an increasingly central
aspect of our society and deeply affect all of our lives. For many, the most
profound impact comes in the workplace, as more and more jobs require extensive
interaction with computers.
For those of us who enjoy our work as computer professionals, this interaction
is largely positive. Computers increase our productivity and make it possible to
solve problems that would be infeasible in the absence of such highly
sophisticated tools. On the other hand, workers who have little computing
expertise sometimes find that the introduction of computers reduces their
ability to control their work environment and increases the level of job stress.
In May 1986, several members of the Palo Alto chapter of CPSR initiated the
Computers in the Workplace Project to discuss our shared concerns about the
problems associated with the increased use of computers in the workplace. After
studying the issue, we became convinced that workplace computerization has
significant social implications. We believe that we have a responsibility as
computer professionals to be aware of these issues and that our expertise
enables us to play an important role in this debate.
Examples of Issue Areas
The "Computers in the Workplace'' topic is a very general one and comprises a
wide range of individual issues, including:
Computer monitoring. In many work environments, computers are programmed to
monitor the activity of the workers, giving rise to what Time magazine has
called "the boss who never blinks." This issue has attracted significant public
attention and is the subject of a recent report by the Office of Technology
Assessment.
The effect on jobs. Often, the introduction of computers has the effect of
"deskilling" the work requirements, making it easier to replace the existing
workers.
Computerization and the international economy. The introduction of
computerization has made it considerably easier to move jobs away from the
industrial centers to areas (particularly in Asia and the Caribbean) where labor
costs are considerably reduced. Clearly, this has an important effect on the
domestic economy.
User interface and training. This topic area encompasses a wide range of
questions related to the user interface design and the manner by which new
technologies are introduced into the workplace. One important consideration is
whether easy-to-learn user interfaces are really "user-friendly" or "employer-
friendly." While ease of learning certainly seems an admirable goal, it is also
true that such systems may make workers more replaceable and may not provide
adequate opportunity for users to exercise and amplify their own skills and
abilities.
Health risks to users. This topic area includes several different health
concerns that are receiving increased attention as more occupational health and
safety cases are filed. These include eye-strain, the medical effects of CRT
radiation, tendinitis, wrist problems, backache, and more general forms of
stress.
How could computers improve the workplace? We believe it is important to
consider those ways in which computers have the capacity to improve conditions
in the workplace so as to offer a constructive vision for the future. In
particular, computers make it possible to increase the diversity of work, allow
for individualization, enhance communication, and democratize the workplace.
Since its formation, the Computers in the Workplace project has undertaken
several activities. Jeff Johnson, one of our founding members who is now living
in Colorado, prepared a review of an April 1986 white paper by the National
Association of Working Women (9to5) on computer monitoring. This review, which
appeared in the Summer 1986 CPSR Newsletter, offered CPSR members some insight
into why not everyone is as enthusiastic about computers as they are: the
systems that many people must use in their work are very different from those
that computer professionals use. It also illustrated the importance of expertise
to the debate, since the original 9to5 white paper contained several technical
inaccuracies which the review was able to correct.
In January 1987, the Computers in the Workplace Project organized a panel for
the monthly meeting of the Palo Alto chapter. The invited speakers were Judith
McCullough from the Los Angeles office of the National Association of Working
Women (9to5), Lenny Siegel, director of the Pacific Studies Center and coauthor
of The High Cost of High Tech: The Dark Side of the Chip, and Martin Carnoy,
Professor of Education at Stanford and one-time candidate for U.S. Congress. A
summary of the panel is available from the Workplace Project.
During 1987, the Workplace Project helped to promote the distribution of
Computers in Context, a 35-minute film produced by California Newsreel.
Computers in Context reviews the approach taken to computers in the workplace in
Scandinavia, where "co-determination laws" mandate that the workers themselves
be involved in the introduction of new technology. The film looks at three
different users of computer applications: bank tellers in Norway, graphic
designers for a Stockholm newspaper, and aircraft maintenance workers for SAS.
Computers in Context has been shown at the DIAC symposium in Seattle and at CPSR
chapter meetings in Palo Alto, Boston, Los Angeles, Santa Cruz, Denver/Boulder,
and Madison. At the Palo Alto meeting, Professor Kristen Nygaard, one of
Norway's leading computer scientists and a principal figure in the development
of the Scandinavian approach to worker participation in technical decisions, led
a discussion of the issues brought out in the film.
Over the last year, we have developed an annotated bibliography of readings on
computers in the workplace. That bibliography is now available to the CPSR
membership through the National Office. The National Office is also the repository for
a resource library assembled by the Workplace Project that includes collections
of readings on specific topics. We also publish a monthly report, entitled
Working Notes, which gives more information on our continuing projects.
The Need for a National Study Group
So far, most of the outreach work of the Computers in the Workplace Project has
been concentrated in the San Francisco Bay area. Judging from the response we
have received, however, there is considerable interest in this topic throughout
the CPSR membership. This indicates a need to provide a national focus for this
issue within CPSR. This topic, however, has proven to be controversial within
the organization, and it is important to consider the implications of broadening
CPSR's participation in this area.
At its October 1987 meeting, the CPSR National Board established a general
procedure for introducing new topics at the national level. That procedure
begins with the establishment of a "national study group" which has the
responsibility to study the issues involved and prepare a report for the Board
suggesting a national policy direction. CPSR is in the process of establishing
such a group on Computers in the Workplace, and is looking for knowledgeable
people interested in participating in that project.
If you are interested in participating in the national study group, or if you
are interested in receiving further information or our Working Notes newsletter,
please write to us at:
Computers in the Workplace Project
P.O. Box 390871
Mountain View, CA 94039
From the Secretary's Desk
CPSR National Secretary
In early March, the CPSR Board of Directors assembled in Palo Alto for two full
days of meetings and discussions. The Board meeting was divided into four
sessions: the national program on civil liberties, general business, the
direction of CPSR, and a fundraising workshop led by Hank Rosso of the Fund
Raising School in San Francisco. The meeting's focus on civil liberties issues
featured a tutorial session on the issues for Board members. Briefings were
given by CPSR/Palo Alto member Dave Redell, Board member Marc Rotenberg, and MIT
Professor of Sociology Gary Marx, who is an expert on computers and privacy. I
personally found each of the sessions to be quite productive, and I believe that
this was true for the other members of the Board.
At the meeting, the Board made several decisions that are of interest to the
membership. First, there is somewhat unwelcome (but hardly surprising) news on
the economic front. We decided that it was necessary to raise CPSR membership
dues (which have not changed since January 1985) to $40 for regular membership
and $15 for students/low-income. We believe that this is necessary if we are to
maintain CPSR's position as the premier organization concerned with the socially
responsible use of computers. For one thing, our expenses have increased. More
importantly, however, we recognize that, as our organization grows and matures,
it will become increasingly difficult to secure general operating funds from
foundations, and we must therefore depend on our membership to support a larger
fraction of our work.
In this same spirit, we are embarking on an ambitious national fundraising
campaign which we hope will help secure CPSR's future. Working with training
provided by Hank Rosso, the national leadership of CPSR will be contacting all
CPSR members to help raise the money necessary to continue through 1988. We hope
that all members will be able to contribute to this campaign and support the
work the organization is doing.
We have also taken steps to broaden the scope of the organization and to ensure
that The CPSR Newsletter properly reflects that breadth of focus. In the future,
the editorial committee for The CPSR Newsletter intends to publish quality
articles on issues in which the use of computers is central, even if those
issues may generate controversy or are not currently a priority of the National
Office. We also encourage greater participation by members in debates over
policy and direction and welcome "letters to the editor" which will be
considered for publication.
In other news, CPSR's book, Computers in Battle: Will They Work?, edited by
David Bellin and Gary Chapman, was the first runner-up in the competition for
"Best Nonfiction Computer Book of the Year" from the National Computer Press
Association. Congratulations to the editors and authors.
This is also election season for the board. At our March meeting, Steve Zilles
was reelected by the Board to a new three-year term as chairman. In addition,
two new Regional Directors and a new
Director-at-Large will join the National Board when the new fiscal year begins
on July 1. The candidates for Regional Directors (see insert) each ran unopposed
and we welcome Karen Sollins and Susan Suchman to the Board representing the New
England and Middle Atlantic regions respectively. There are three candidates for
the position of Director-at-Large, and I encourage each of you to vote by
returning the ballot card enclosed with this Newsletter. As CPSR members, this
is your organization, and its success depends upon your interest and
participation.
The next big CPSR event will be the symposium, "Directions and Implications of
Advanced Computing," the registration form for which is on the facing page. This
event is highly recommended.
CPSR Chapter Activity
CPSR/Seattle has been busy with meetings and projects. . . the April chapter
meeting featured a talk by John Sidles, who has developed a computer model to
evaluate school desegregation plans for the Seattle school district. . . Jon
Jacky, Northwest Regional Representative, spoke on the SDI and "people in the
loop" at Evergreen State College on April 11. . . CPSR/Seattle will once again
participate in the 24-hour "Give Peace a Dance" marathon on June 18 and 19. . .
the chapter also managed to get a computer donated to the event's organizers,
and the machine will also be used by Puget Sound SANE. . . . Bob Wilcox and Erik
Nilsson of CPSR/Portland travelled to San Francisco for an April 4-8 conference
on computers and voting. . . . the CPSR/Seattle chapter kicked in $250 to help
Bob and Erik with their expenses.
CPSR/Palo Alto's meeting times have changed for the first time in many years, to
the first Wednesday of every month. . . the April meeting of the chapter viewed
a videotape of a Bill Moyers special on developments in technology and the U.S.
Constitution. . . the chapter has working groups on civil liberties and
workplace issues (see the article on computers and the workplace in this
issue). . . . Eric Dorsey and several other CPSR/Palo Alto members have compiled
a survey of defense contracts and employment policies among Silicon Valley
employers, which is now available from the CPSR National Office. . . .
CPSR/Santa Cruz has developed a focus on outreach directed at students and
faculty at the University of California there, as well as a project on
employment in non-military jobs, and a series of talks at the university
cosponsored with a local IEEE chapter. . . . CPSR/Los Angeles meets every month
on the third Wednesday, and is currently planning a "beach party." . . . CPSR/San
Diego is centered on the University of California at San Diego, and currently
has nine future meetings scheduled on a variety of topics, and is also
cosponsoring the spring beach party with CPSR/Los Angeles. . . . CPSR/Denver-
Boulder, which meets alternately in Denver and in Boulder, has sponsored a
meeting on nuclear winter, and now intends to focus on the National Test Bed
beginning construction in nearby Colorado Springs. . . CPSR/Denver-Boulder
member Jeff Johnson had an op-ed piece on the National Test Bed published in
Denver's Rocky Mountain News.
CPSR/Madison has a funding study group which is collectively reading and
discussing the book by David Drew, Strengthening Academic Science. . . the
chapter has three ongoing projects: a Rights and Responsibilities working group
studying professional ethics; an Employers' Catalog project similar to Eric
Dorsey's work in Palo Alto; and a High School Presentations group that is
organizing showings of the CPSR slide show and discussions about computers at
local high schools. . . the chapter is also investigating the development of a
tutoring program to increase minority student representation in computer science
programs. . . in April CPSR/Boston chapter members heard a talk by Lisa
Gallatin of the Massachusetts Coalition on New Office Technology (CNOT), who
spoke on "Electronic Monitoring in the Workplace: Supervision or Surveillance?"
CPSR Celebrates Fifth Anniversary
In March, 1983, Computer Professionals for Social Responsibility was officially
incorporated as a nonprofit in the State of California. Although meetings
leading to the formation of CPSR had been going on for at least 18 months prior
to this date, the "official" incorporation date seems as good as any to mark the
beginning of CPSR as a national organization. Therefore, in March of this year,
CPSR observed its fifth anniversary.
We are very proud of the work the organization has done in the last five years.
It has grown from a handful of concerned computer scientists, most of them
located in or near Palo Alto, California, to a nationally and internationally
known
organization highly regarded for quality work and an exceptional standard of
professionalism. Many thanks to everyone who has helped make the last five
years so rewarding.