The CPSR Newsletter

Volume 7, No. 4 COMPUTER PROFESSIONALS FOR SOCIAL RESPONSIBILITY Fall 1989

Carol Edwards

Photo by Zama Cook

CPSR/Washington, D.C. Hosts 1989 Annual Meeting
Benjamin Pierce, CPSR/Pittsburgh
Holly Murray, CPSR/Washington, D.C.

The 1989 CPSR Annual Meeting was held in Washington, D.C., on October 20 and 21,
and attracted members and interested participants from all over the United
States and from foreign countries. The meeting was an opportunity for CPSR
members from various parts of the U.S. to meet each other and discuss common
concerns, and also to meet and interact with the national staff and the
volunteer leaders of the organization. Some of the important issues addressed at
the meeting included computers and Federal information policy, research and
development funding, education, and the ethics of the profession. The first day
of the meeting was dedicated to presentations by well-known figures in the
field, while the second day was filled with workshops and small group
discussions.

continued on page 5

Computer Monitoring: A Threat to the Right to Privacy?
Karen Nussbaum

The following speech was the luncheon address at the 1989 CPSR Annual Meeting in Washington, D.C., given by Karen Nussbaum, founder and executive director of The National Association of Working Women, also known as 9to5. [The views below are those of Ms. Nussbaum and do not reflect an official or unofficial position of CPSR. -- ed.]

The king has note of all they intend By interception which they dream not of
....

Shakespeare was talking about Henry V, but he could have been referring to
today's legions of managers who are verily obsessed with methods of
interception.

Monitoring is not new -- they say spying is the second oldest profession. But new
technology creates capabilities in computer monitoring which make it
qualitatively different from supervision in the past.

Today I'll talk about why workplace surveillance is increasing; what's wrong
with computer monitoring; and what can be done about it.

Why So Much Surveillance

Monitoring is one of a growing list of surveillance techniques used in the
workplace. The congressional ban on polygraphs last year only heightened the
scramble for other methods, including drug testing, handwriting analysis, and
"honesty tests."

"When lie detector tests were banned," according to The New York Times, "a Miami
bank substituted a new employee screening program -- a written honesty test, a
urinalysis and a thorough background check before hiring an employee."

What's the panic? Especially when so many experts point to the questionable
value of these methods -- drug tests have up to a 40% error rate; handwriting is
considered to

continued on page 2



Computer Monitoring continued from page 1

have no correlation to job success. And what is an "honesty test" anyway? Look
for the expanded application of Ouija boards or astral projection in the
personnel office.

Some say there's no call to draw so much attention to computer monitoring, since
management has monitored workers with or without computers forever.

There's truth in that.

American management has a history of relying on control as the primary
management tool. With the rise of industry in the late 19th century, management
wrested control of the work process from the shop floor.

"Scientific management" was developed here and has remained popular for nearly a
century. Work was broken down into its smallest possible components, taking
particular care to separate manual from mental tasks, with hierarchies of
supervisors overseeing the work process.

Though we may laugh at the excesses of Frederick Winslow Taylor today (and
Taylor himself admitted that he had recurring nightmares that he had to work in
the system he created), scientific management is still the underlying management
theory.

But there's a resurgence of management control today -- almost a fetish -- with
managers attributing the need for it to rising competition and falling
productivity.

Competition is rising -- from countries such as Japan and Sweden with their more
productive workers, to countries such as Korea and Mexico with their low-wage
workers.

One response would be to compete by creating a high-skilled, high value-added
workforce -- where technology was used to enhance the jobs of an educated,
well-trained workforce.

Instead, the dominant business strategy has been to cut labor costs through
lower wages and benefits. We see this across the board.

For example, what's known as the "contingent" workforce has skyrocketed here.
These are the part-time, temporary, and contracted workers who typically earn
less, enjoy few benefits, and have no job security. They now make up one third
of the workforce.

More part-time jobs than full-time jobs are being created every year. Wages are
falling.

Fewer than half of employees are covered by pensions, and that share has been declining steadily. Today, thirty million workers are not covered by health insurance.

Employees are being trained to change their expectations of job security, to free management of responsibility.

A management strategy characterized by this lack of commitment to and faith in
the workforce requires a supervision style based on control and fear.
Surveillance and control now take the place of supervision, commitment, and
training.

Monitoring isn't a technological fluke -- it is the logical extension of these
changes in the economy.

What Is Monitoring?

Monitoring isn't simply the benign use of computers to collect data. It is
different in three important ways: it monitors not just the work, but the
worker; it measures work in real time; and it is constant.

It effectively provides a permanent time study not simply to gather data, but to
pace and discipline the workforce. And the scope for this is quite broad.

Truck drivers are an example. What job gives you more of a sense of independence and freedom from supervision than long-haul trucking? Imagine a trucker barrelling across Wyoming, completely independent.

Well, completely independent until the trucker pulls into a barn in Denver and a
little computer tape is removed from the engine. The tape tells a supervisor how
many stops the driver made; what the average gas mileage was; where the stops
took place; and lots more.

Or consider the airline reservation clerks. When you called to make your
reservation for this conference, the reservation agent was timed on the exact
seconds she took per caller; the number and length of her breaks; and the time
between calls. And your conversation may have been listened in on.

As one executive put it, "I count everything that moves."

This represents a fundamental shift -- from relying on a worker's individual
professionalism to relying on electronic control. In a study for the Office of
Technology Assessment, Dr. Michael Smith finds:

continued on page 3


Karen Nussbaum

Photo by Zama Cook

Computer Monitoring continued from page 2

Electronic monitoring may create adverse working conditions such as paced work,
lack of involvement, reduced task variety and clarity, reduced peer social
support, reduced supervisory support, fear of job loss, routinized work
activity, and lack of control over tasks.

If that's too much to remember, I can summarize it in one word: fear.

Mary Williams is a case in point. In a now-famous case, Mary Williams was
disciplined by United Airlines for comments she made to a co-worker. She was
courteous to an obnoxious customer and handled him well -- management had no
quarrel with her there. But after this three minute call, which was monitored,
she complained to a co-worker. Management, listening in to this discussion among
coworkers, put her on probation for her remark, then sent her to the company
psychiatrist when she complained, and ultimately fired her.

Or take the case of Toni Watson. Toni works for another airline which strictly
enforces a 12-minute limit on bathroom breaks. When Toni went over by two
minutes, she was disciplined. The pressure on her job finally put her out of
work with a nervous disorder.

A data processor in New York told me that her screen periodically flashed
"You're not working as fast as the person next to you!"

Others are bitter that their work speed and productivity are publicly posted
daily or hourly.

An ad in PC Week magazine for networking software boasts to management that

Close-Up LAN brings you to a level of control never before possible. It connects
PCs on your network giving you the versatility to instantly share screens and
keyboards.

The ad continues:

You decide to look in on Sue's computer screen.... Sue won't even know you are there!... All from the comfort of your chair.

A secretary from Florida told us that the thing she found most offensive about
her generally abusive boss in her small office was that he calls up on his VDT
the work she's doing while she's doing it.

A personnel director of a big company in North Carolina called and told us she
quit her long-term job because she thought the use of computer monitoring was
sadistic.

And computer monitoring isn't limited to terminal operators and truck drivers.

The maids at the hotel you are staying at are probably monitored. Your maid
punches a code in the phone when she enters and leaves your room, providing a
detailed log of her speed and a record of her movements for the entire day.

Don't get sick, because nurses are monitored. They carry boxes on their belts
which track the amount of time used for each procedure with a patient. So don't
be surprised if they lack bedside manner.

And the "higher professions" are being hit too. One reporter told me as she was
typing in her story, her computer flashed "I don't like that lead" -- a
surreptitious supervisor butting in on a first draft.

I've never heard a worker say that he or she appreciated this "feedback," though
I have heard management representatives claim it. The workers we hear from
feel humiliated, harassed, and under the gun.

As a leading maker of monitoring software programs says, "Monitoring helps
employees. It's the only way we can get

continued on page 4


Computer Monitoring continued from page 3

everything on the permanent record." To workers that sounds frightening.

Employees object to monitoring for several reasons. They don't always know if
they are being monitored. They don't know how the information is being used. And
basic rights are compromised -- the "right to know"; privacy; due process; and health and dignity.

A survey by the Massachusetts Coalition for New Office Technology collected responses from over 700 union and non-union monitored office workers. Here are a few of their findings:

62% of respondents were not informed they would be monitored prior to hiring;

Three quarters feel they are being spied upon;

Three quarters say that monitoring lowers morale;

80% say that monitoring makes their jobs more stressful.

A back office bank worker in the survey was mystified as to how her supervisor
was gaining access to highly specific personnel information. At her six-month
review her supervisor started quizzing her about her bathroom breaks -- that's
when her co-workers filled her in on monitoring capabilities on her computer.

This survey shows deep alienation from the process. And these feelings are
substantiated by study after study.

Subliminal Suggestion

Those are the electronic capabilities workers are aware of. Even more
frightening is the marketing of tools for subliminal suggestion.

Mind Communication, Inc., markets subliminal suggestion audio tapes to order for
just $249. Among their customers are AT&T, Kimberly-Clark, Procter and Gamble,
and Honeywell. Behavioral Engineering Corporation offers "Subliminal Sound." And
Greentree Publishers produces computer software called "Subliminal Suggestion
and Self-Hypnosis," with everything from feel-good messages to more pointed
commands like "work faster."

Monitoring is the ultimate expression of lack of trust. Supervisors don't trust
their workers to do their jobs. Workers don't trust their supervisors to be
fair. Upper level management doesn't trust lower level management to handle
basic supervision.

What's the Response? Does anyone care? Is monitoring a threat to privacy?

Workers think so.

Workers are filing privacy suits against their employers in unprecedented
numbers. Between 1984 and 1987, twenty times as many workplace privacy suits were decided by U.S. courts as in the three years before. Jury verdicts in favor of workers averaged $316,000 -- compared to 1979 and 1980, when no workers won compensation.

9to5 started a hotline last year, providing counselling for office workers with
job problems. When Ms. magazine ran a short notice that we wanted to hear from
people who were being monitored, we were flooded with calls. Hundreds of women
called, mostly from airlines and non-union phone companies.

Some called whispering. Some called saying they had been appointed by their
co-workers to call the hotline. One woman specifically called on her monitored
phone so her supervisor would know! Many told us they were having meetings in
the parking lot, or at their homes, to talk about what to do. They all talked
about being violated, humiliated.

An airline reservationist in Texas said, "This is America! If I wanted to live
like this I'd move to Beijing!" She described paranoia so intense that people
whispered in the parking lot

continued on page 5

The CPSR Newsletter is published quarterly by:

Computer Professionals for Social Responsibility
P.O. Box 717
Palo Alto, CA 94301
(415) 322-3778

Also located at:
1025 Connecticut Ave., N.W., #1015
Washington, D.C. 20036
(202) 775-1588

The purpose of the Newsletter is to keep members informed of thought and
activity in CPSR. We welcome comments on the content and format of our
publication. Most especially, we welcome contributions from our members.
Deadline for submission to the next issue is February 1, 1990.

This Newsletter was produced on an Apple Macintosh II, using the desktop
publishing program Pagemaker 3.0 and Adobe Illustrator 88. The hardware and
software used were donated to CPSR by Apple Computer, the Aldus Corporation, and
Adobe Systems. The Newsletter was typeset from a Pagemaker file on a Linotronic
100.


Computer Monitoring continued from page 4

and assumed the public phone in the lunchroom was monitored.

Unions are bargaining contract language in response. For instance, at Ohio Bell in
Cleveland the union has won language preventing secret monitoring. And though
management can collect individual data on customer service reps, it has agreed
it will use the data only in aggregate.

The courts are mixed. Out of four key cases, two asserted workers' rights (Watkins v. Berry, 1983, and U.S. v. Harpel, 1974) and two found for management (Epps v. St. Mary's, 1986, and James v. Newspaper Agency Corp., 1979).

The Congress is concerned. The Office of Technology Assessment, the
Congressional research arm, published a report on monitoring in 1987 which said:

There are strong arguments that the present extent of computer-based monitoring
is only a preview of growing technological capabilities for monitoring,
surveillance and worker testing on the job.

If this is the case, then there may be a need for a new balance between workers'
rights to privacy or autonomy in the workplace and management requirements for
information.

Legislation is pending at the federal and state levels. For example, the
Privacy for Consumers and Workers Act (HR 2168, Clay, Edwards, Williams, Gilman)
provides for right-to-know and privacy protections. The California Assembly
passed legislation prohibiting subliminal suggestion in the workplace,
ultimately vetoed by the Governor. Many states are considering comprehensive
monitoring bills, including Connecticut, Massachusetts, Minnesota, New Jersey,
Rhode Island, and Oregon. And New Mexico and New York are both considering "beep
bills."

What You Can Do

So why am I telling you all this? You are the creators of this wonderful
technology. But it is misused. And there is no one with more credibility than
you to speak out against this misuse.

We want you to know how we experience monitoring. We want you to work with us
to design alternatives. We ask you to join us in preventing abuse and to help us
make the most of these fantastic tools. Thank you.

Information about computer monitoring and other subjects under research at 9to5
is available from 9to5, 614 Superior Ave., N.W., Cleveland, OH 44113.

Annual Meeting continued from page 1

Privacy and Computers in the United States

The keynote speaker scheduled for the meeting, Senator Patrick Leahy of Vermont,
was unable to attend because of a last-minute vote that required his presence on
Capitol Hill. Taking his place at the CPSR meeting was Katy Miller, staff
counsel to the Senate Judiciary Committee's Subcommittee on Law and Technology,
of which Senator Leahy is chairman. Miller described Leahy's concern with the
issues that have been major components of CPSR's program. On computer viruses,
said Miller, Senator Leahy is "cautious" about legislation that would punish
virus perpetrators because of his concern about constraining creativity in
software programming. He prefers the promotion of a "culture of compliance"
through professional development of ethical norms of behavior.

On privacy issues, Miller stressed the point that computer professionals must be
concerned with both the right to know what the government is doing, and with
citizens' right to privacy. Senator Leahy, she said, has introduced or supported measures that prevent electronic eavesdropping and public access to records of video rentals from retail stores -- the latter a bill that became known as the "Bork bill" because it responded to a reporter's revealing the titles of videos rented by Federal judge Robert Bork when he was up for confirmation to the Supreme Court. Miller said that it is clear that new technologies pose grave threats to
privacy, and new opportunities for citizen monitoring of government. The
Congress needs the advice of concerned computer professionals like those in
CPSR.

Federal Support of Computer Science R&D

The first panel discussion of the day was about trends in the Federal
government's support of research and development in computer science. Two of the
scheduled panelists had to drop out at the last minute -- Kenneth Flamm and Marilyn Elrod -- so the panel consisted of Dr. Frederick Weingarten of the Congressional
Office of Technology Assessment; Dr. William Scherlis of the Defense Advanced
Research Projects Agency; and Professor Ann Markusen of Rutgers University. The
panel was moderated by Professor Lance Hoffman of The George Washington
University.

Weingarten began by describing the mission of the Office of Technology
Assessment (OTA). OTA, he said, has a

continued on page 6


Annual Meeting continued from page 5

"multi-partisan" mandate to provide Congress with information that will
facilitate its decision-making in the area of technology and public policy. Just
where computers fit into overall science policy is a hot issue at OTA right now.
Weingarten first looked at the four fundamental reasons for government funding of science:

1) Intrinsic scientific worth of a project (e.g., the supercollider project),
even where there is no social utility to the project, at least not in the
foreseeable future. Computer science has to fight hard for funding in this
category because it defies traditional distinctions between "pure" and
"applied." Meanwhile, the Office of Management and Budget (of the Executive
branch) is trying to reduce Federal computer funding, asking, essentially, "why
doesn't IBM do it?"

2) Congress' own immediate purposes, notably defense and nuclear power.

3) Suggestions of an implicit industrial policy -- for example, the now-familiar issue of "international competitiveness." Despite examples like Sematech, high-definition television, the Strategic Computing Program, and DARPA, there isn't much activity in this category compared to Japan, said Weingarten.

4) A new rationale is "infrastructure-building" for broad social purposes into
the next century, with the computing examples being supercomputer centers and
the National Science Net.

OTA looks at a variety of factors in assessing these rationales for Federal
funding of science and technology. There are, of course, competing funding priorities, not only with other Federal programs but with other science projects. There is the
constant controversy over how active the Federal government should be in
promoting technological development in the U.S., and how much should be left to
private enterprise.

Weingarten expressed a particular frustration after years at the National
Science Foundation and OTA: The computer profession does not participate
adequately in public policy debates compared, for example, with the physics or
engineering communities. Computer professionals' attitudes range from timidity
to defending anything that will increase funding for computer science research.
Weingarten said he has the fervent hope that the continued growth of CPSR will
stimulate more sophisticated involvement of computer scientists in policy
development.

Dr. William Scherlis spoke next. Scherlis is in charge of the software division
of the Information Sciences and Technology Office of DARPA, and since 1986 he
has played a major role in the formation of the administration's new High
Performance Computing Initiative. Like Weingarten, Scherlis described what his
agency does in the process of funding computer science R&D. In pursuit of its
mission to exploit long-term opportunities in defense needs, DARPA's preferred
approach is to use the commercial base for a "transition" to Pentagon
applications. Scherlis cited VLSI design, artificial intelligence, expert
systems, and UNIX among the computing innovations DARPA has backed in recent
years.

Making the surprising claim that the Department of Defense is a "relatively
small player in the computer marketplace," Scherlis explained that DARPA uses
defense money to "bootstrap" new technologies when commercial demand is not yet
present, trusting in an eventual "trickle-down" of technological capabilities to
commercial needs, at which point venture capitalists take over.

Scherlis claimed that DARPA is a "very tiny," separate agency, with a small
budget but with the autonomy to allow it independence from near-term defense
requirements. But he admitted there are always questions about the balance
between basic research and development for defense applications. In general,
said Scherlis, DARPA would like more involvement from the computer profession in
Federal decision-making. Like Weingarten, Scherlis said he regrets the lack of
computer professional input -- from both scientists and practitioners -- in Washington.

Scherlis closed with two challenges to CPSR: How can the computer science
industry maintain competitiveness in the face of organized national initiatives
in countries like Japan and West Germany? Do we need a civilian technology
agency? Scherlis feels that our military strength, in particular, depends on
maintaining a technological lead. He also asked, what steps need to be taken to
protect security and privacy as we build larger and more comprehensive networks?
The Computer Emergency Response Team (CERT), based at the Software Engineering Institute in Pittsburgh, has responded to an almost continuous stream of incidents in the year since its creation, and new vulnerabilities are being
discovered all the time. More proactive measures are needed, as well as a more
decentralized ability to respond to emergencies.

While Scherlis gave a detailed explanation of how DARPA drives academic computer
science research in areas that will eventually benefit defense needs, he paid
little attention to the broader question of whether national interests are

continued on page 7


Annual Meeting continued from page 6

best served by investment in technology with long-term military applications in
mind. This question was taken up by the next speaker, Ann Markusen, who
considered science and technology policy from an economic point of view.

Markusen began by pointing out that we stand at the beginning of a debate on
national priorities in technology, but discussion seems to focus mostly on how
to get where we're going rather than on where we should be headed. The question
of priorities seems to be unduly influenced by the notion of "competitiveness,"
said Markusen, and this is a perspective with two major flaws. First, it misleadingly embodies Cold War rhetoric in an economic context: who, in today's global economy, are our "competitors"? Second, the arguments about "competitiveness" assume that "we" (meaning the whole U.S., as an undifferentiated bloc sharing a single concern) have a stake against "them," when in fact there has been little
public discussion of the consequences for most of "us" (e.g., workers) of
increased or reduced competitiveness of U.S. industry. In many cases, arguments
about competitiveness have been used to materially lower standards of living in
this country, said Markusen.

On the question of how to achieve progress efficiently, Markusen noted that
although military needs have provided a relatively large proportion of funding
for computer science research and development in the past (30-60% of total
support for R&D, as opposed to about 3% for research on steel and 70% in the
aerospace industry), it is time to reassess the role of the military in light of
a newly emerging national mission, which calls for economic rather than
primarily military strength.

It is undeniable, said Markusen, that many social benefits have resulted from
military expenditures, especially in computing. But it does not necessarily
follow that the government is the most appropriate or efficient source of
funding. The story of nuclear power is a particularly dubious one, clearly
showing how vulnerable government is to excessive "optimism of the will." It is
unclear under what circumstances government initiatives can be expected to
succeed, and there is no hard evidence that military expenditures have generated
good technology cheaply. Indeed, noted Markusen, it is worth pointing out that
Germany and Japan, two recent economic superpowers, have very small defense
industries.

Markusen is interested in the idea of a civilian technology funding agency, but
worries that its priorities will tend to be driven by the interests of a small
group without adequate public debate. It is important for CPSR and similar
groups to take a leading role in "public brainstorming" on these issues, she
concluded.

CPSR President Terry Winograd congratulates Norbert Wiener Award recipient
Professor Daniel McCracken.

Photo by Zama Cook

The Norbert Wiener Award

Participants at the meeting then walked a few blocks to The George Washington
University for lunch. After a greeting by computer science department chair
James Foley, CPSR President Terry Winograd presented the Norbert Wiener Award
for Social and Professional Responsibility to Professor Daniel McCracken of the
City College of New York. Professor McCracken, world-famous for his textbooks
on programming languages, has been an important forerunner for CPSR, with a
career marked by professional involvement in such public policy issues as
strategic nuclear weapons, citizen privacy, and intellectual property. The
Norbert Wiener Award, named for famed MIT mathematician and cyberneticist
Norbert Wiener, was given to Professor McCracken for his lifelong commitment to
social responsibility in the computing profession. The award is a Hoya crystal
sculpture set with a Bulova clock, and mounted on an inscribed clear base. [For
more information on Professor McCracken and the Norbert Wiener Award, see The
CPSR Newsletter, Summer 1989, p. 17 -- ed.]

Professor McCracken's short acceptance speech was followed by luncheon speaker
Karen Nussbaum, founder and executive director of the National Association of
Working Women, also known as 9to5. Ms. Nussbaum's speech is reproduced in full
in this newsletter, beginning on the first page.

continued on page 8


Annual Meeting continued from page 7

Computers in Education

The meeting then returned to the auditorium of the Pan American Health
Organization for two more panel discussions. The afternoon session began with a
panel addressing "Computers in Education: Mixed Agendas and Uncertain Outcomes."
The panel featured Professor Sherry Turkle of MIT (author of the book The Second
Self: Computers and the Human Spirit); Douglas Noble, a teacher in Rochester,
New York and a leader of the Coalition for Equity in Education; Carol Edwards,
the director of the Southern Coalition for Educational Equity and Project MICRO
in Atlanta; Michael Foyer of OTA, one of the principal authors of OTA's "Power
On" report on educational computing; and Professor Chet Bowers of the University
of Oregon School of Education. The panel was moderated by Terry Winograd.

Turkle opened the discussion with the most optimistic talk of the five speakers.
Although many impressive pilot projects in computer-based education have yielded
essentially no results, Turkle said she views this as an opportunity to stand
back and reconsider the foundations of our approach to the uses of computers in
education, freed of our current "technological determinism."

Her slogan for the talk was that "Equal access [to technology] requires
epistemological pluralism." Different sorts of people have different approaches
to tools. A computer -- if properly designed -- can encourage "personal appropriation"
by users, providing them a rich context for making the machine their own in
their own ways. Because computers stand midway between formal systems and
concrete physical artifacts, they encourage concrete appropriation of formal
systems.

In previous work, Turkle has described how men and women tend to differ in
cognitive styles in relating to computers, and how a "woman's style" is
typically given lower value or no value at all. This work resonates with
contemporary feminist thought, which holds that there are significant
differences of "style" between men and women, which should be taken into account
instead of forcing everyone into the same (male) cognitive mold. (This argument
generated quite a bit of controversy during the question and answer period.)
More recently, Turkle has studied MIT's Project Athena, a highly touted
university-level educational computing project. She found that this system
embodies "the opposite of epistemological pluralism" -- it has a single operating system, window interface, and philosophy of educational software. The failure of this approach is demonstrated by many "small epistemological rebellions" on the fringes of the project, said Turkle.

From left to right, panelists Ann Markusen, Frederick Weingarten, and William
Scherlis discuss trends in Federal support of computer science research and
development.

Photo by Zama Cook


Among the factors standing in the way of epistemological pluralism are the view
that the computer is "just a tool" (both hammers and harpsichords are tools, said Turkle; which is more like a computer?) and the current state of schools.
Teachers need to be more fluent with computers as expressive tools.

Douglas Noble began by recounting the optimistic litany behind the $2 billion we
have spent on computers in schools so far: Computers alone will bring solutions
to both the educational crisis and the economic crisis; computer literacy is a
basic survival skill. What we have instead, asserted Noble, are instruments of
control and hordes of uncritical, "functionally illiterate drones and
information processors."

This is one legacy, Noble believes, of the heavy military orientation of R&D in
computer education (and in computer development in general) for the last 30
years. "When even the military has recognized the futility of the 'teaching
machine' approach to learning, shouldn't we?" he asked. What we have ended up
with is computer-based education as a tool of larger, highly questionable
technical and social imperatives.

Educator Carol Edwards noted that, from her perspective, educational computing is
subordinate to concerns about

continued on page 9


Annual Meeting continued from page 8

education in general. Her remarks focused on several problems:

1) Equity: A purely quantitative approach to this problem (i.e., do all the kids have access to computers?) falls short. In addition, we should be asking
qualitative questions about the quality of software, programming activities,
teacher expertise, and the educational milieu and philosophy. Edwards asserted
that the failings of computers in the classroom today reflect our two-tiered
society: one group learns to use computers for control and leadership in
society, while the lower tier is becoming a "throwaway" caste of unskilled labor
whose jobs are being replaced by technology and whose education is so dull and
meaningless that they drop out in droves. Edwards challenged the audience to use
technology to restructure the two-tiered society instead of to reinforce it.

2) Lack of diversity: As Edwards aptly pointed out, nothing spoke as eloquently
of the diversity problem in our field as the makeup of the audience itself
(which was overwhelmingly white and middle-class -- and probably two-thirds male). This distorted representation reinforced Sherry Turkle's earlier call for "epistemological pluralism," said Edwards.

3) Exaggerated claims of educational high-tech vendors: There is no evidence,
for example, to support the claims that preschoolers need computers, Edwards
complained.

4) Technological Advances: Another angle on the equity problem: It's crucial to
include diverse groups in the design -- not just the use -- of new technologies.
Currently, the people defining the new frontier are creating something
comfortable for themselves and appropriate to their own cultural context. A
concern for equity, if it is to be effective, must be built into the technology
development process from the very beginning.

Michael Foyer began his talk by emphasizing that OTA's position is not blindly
in favor of technology. Personally, he said, he is skeptical about educational
technology, but less so than when he joined OTA's study project on the subject.
The years 1982-83 marked the beginning of a "national experiment" in
microcomputer technology in schools, said Foyer. While the results have been
mixed and it is too early to make a final verdict, it seems clear that computers
are not a cure for the education crisis. In particular, we cannot think about
education the way we think about other research and development: what works in
one school doesn't in another; the "blueprint model" is wrong.

Foyer described some conclusions of the OTA "Power On" report. Educational
software, generally speaking, is of terribly poor quality. Even so, teachers
complain that there is not enough of it. Schools and teachers are under
tremendous pressure to perform well under fairly standard measures. It has been
shown that computer instruction can be used to raise standardized reading and
math scores. It is prohibitively expensive to develop good-quality educational
software. A company that invests the time and energy to produce a really fine
program will never be able to recover its costs. Partial solutions to this
problem might include more Federal support for educational software development,
and the creation (by some means, such as massive hardware grants to schools) of
a larger and wealthier marketplace for educational software.

Chet Bowers, from the University of Oregon, spoke of computers, ecology, and the
sense of self. In a panel notable for the reach of its analysis, Bowers probably
ranged the farthest into broad cultural and philosophical critique. The thrust
of his hypothesis on educational computing was two-fold: first, we are unaware
of how culturally determined even seemingly value-free entities such as science,
technology and computing are; and, second, our particular Western mode of
problem-solvingÑ the Cartesian modelÑis ultimately destructive of ecologically
balanced community and life itself.

Evoking themes of 19th century German philosophy and 20th century linguistics
theory, Bowers exhorted his audience to bear in mind the cultural assumptions
and subjectivity we carry around, even as we manipulate data or write software,
and to remember that far from being a neutral conduit, our "language thinks us."

Bowers went on to derive concrete suggestions for cross-cultural educational
software that demands critical thinking, but it was hard to avoid the
conclusion, given his general arguments, that computers in the classroom are too
fraught with negative associations and values to be very useful in a humanistic
curriculum. Bowers' own conclusion seemed to say as much: technology is no more
than a historical experiment, and perhaps a failed one at that.

Moderator Terry Winograd summed up the two extremes he saw represented in these
talks: either there is a linear militaristic world view inherent to computers,
or the computer may be seen as an infinitely adaptable "Trojan horse." Winograd
suggested a middle ground that, while conceding that computers won't
revolutionize our educational structures, would have us "think small" and
explore the ways computers can make some changes.

continued on page 10


Annual Meeting continued from page 9

Patrolling the Programmers

The final panel discussion of the day was entitled "Patrolling the Programmers:
Computer Ethics and Computer Accountability." The panel featured talks by John
Shore, author of The Sachertorte Algorithm and Other Antidotes to Computer
Anxiety and a vice president of Entropic Systems, and Carol Gould, a professor
of humanities at Stevens Institute of Technology in New Jersey. The panel was
moderated by Rachelle Hollander, coordinator of the Ethics and Values Studies
Program at the National Science Foundation.

John Shore's talk argued for the proposition that computer professionals should
be regulated. The computer is an engine, he said, and software engineers are
engineers in every sense of the word. Civil engineers, designers of airplanes
and cars, plumbers, doctors, lawyers, hair stylists, and many other professions
are regulated. Software engineers belong on the list. He left open the
possibility that some computer professionals are not software engineers and do
not build systems that the public depends on. These, perhaps, do not need to be
regulated.

The earthquake in the San Francisco Bay area, which happened the same week as
the Annual Meeting, underlined the responsibilities of engineers. In a recent
quake of similar force in Armenia, many buildings collapsed; in California, only
a few were damaged. "How many of us would support the repeal of all building codes in California?" Shore asked.

The software crisis shows that depending on "software darwinism" is foolish.
Desire to do good, education in good practice, and market forces are not enough
to guarantee reliability, Shore asserted. If we've learned anything from thirty
years of experience, it is that discipline is the only way to build software
that works. And discipline, said Shore, requires enforcement. And on the
positive side, if you put strong requirements on people, some will rise to meet
them. CPSR should join the forefront in helping formulate these requirements, he
concluded.

Carol Gould addressed, among other things, the question of computers and democracy. She pointed out that while computers have a potential for decentralized democratization, unless there is equality in access to information and justice in the distribution of social and technical resources, the mere widespread use of computers -- as in online voting -- does not ensure increased democratic

Volunteers Once Again Make the Annual Meeting a Success

As usual, CPSR volunteers made the Annual Meeting a success. The meeting was
co-chaired by Professors C. Dianne Martin and Lance Hoffman of the Computer
Science Department at The George Washington University. They helped organize the
panel discussions and dealt with local arrangements. Joel Wolfson was
responsible for publicity and was very helpful at the meeting itself. David
Girard and Paul Hyland helped videotape the entire Friday program, and they will
be editing the tape to make a program that can be viewed by CPSR chapters and
other groups. Special thanks go also to Monica Green of SANE/Freeze, to all the
speakers and panelists -- all of whom appeared at the meeting as volunteers -- and to
CPSR staff members Susan Lyon, Katy Hickman, Lois Toback, Tera Martin and Marc
Rotenberg, who put in many hours of hard work to make the meeting a memorable
event.

participation in decision-making. On the contrary, a top-down application of
computers could be used to manipulate opinion and set agendas.

Turning to questions of legal responsibility for a failed or malevolent computer
application, Gould asserted that if a person is the programmer, this person
should be responsible for the program if he or she knows its intent. Even if the
intent is unknown, Gould feels the responsibility is the same, underscoring the
need to bring programmers into the decision-making process early. Why, she
asked, are scientists (such as Nazi chemists) held more responsible for their
actions than computer scientists?

During the questioning, someone remarked that people naturally resist knowing
the end use of their product. In response, Gould again urged reorganizing the
workplace so workers have responsibility (and, presumably, a greater stake in
the outcome).

Harkening back to the education panel, Gould critiqued Chet Bowers' radical
relativism with regard to the question of responsibility: if individual autonomy
is an illusion and we are all hapless victims of our language and culture, then
where is the place for individual responsibility? If we don't hold on to our
capacity to change some social norms, then both responsibility and ethical
standards fly out the window.

John Shore's talk stimulated a lot of discussion and

continued on page 11


Annual Meeting continued from page 10

controversy during the question and answer period. One audience member said that
even "certified" programmers are bound to make errors in complex systems. It's
much better to discourage the belief in software infallibility. Certification
might give people a false sense of security. Another remarked that "good" code
doesn't necessarily equal good (i.e., ethical or moral) use. How would you ever
prove a thing like "software malpractice?" How would you test programmers? How
would you hold a programming team responsible for a failure? Shore responded in
general that while certifying programmers is not a panacea for the software
crisis, it is better than doing nothing, which is what is happening now.

The Saturday Workshops

On Saturday morning, after reports from CPSR executive director Gary Chapman and
Washington office director Marc Rotenberg, Annual Meeting participants started a
series of workshops on specific topics, beginning with a morning workshop on
local organizing conducted by Monica Green, national field director for
SANE/Freeze.

After everyone introduced themselves, the plenary session broke up into groups
and tried to answer the question, "What is the greatest obstacle to the health
of CPSR's local chapters?" The most common responses reported included lack of
outreach from the very active chapter members to other members, from chapters to
people in local areas, and between chapters. There is difficulty getting people
involved (including being visible enough that people know there's something to
get involved with). There are problems of cohesion in chapters in large
metropolitan areas (e.g., Los Angeles and New York). There is a lack of specific
ways for members to get involved in ongoing activities. (Conversely, Marc
Rotenberg finds that it's difficult to find people to work on all the tasks he'd
like to see accomplished. The problem seems to be one of matching people with
tasks rather than a lack of one or the other.) And there is the common problem of
leadership: finding people that are able to set directions and motivate others,
and organizing those people into an effective "chapter core."

The rest of the session was devoted to exchanging ideas about how these problems
can be addressed. Chapter organizers and people who hope to start a CPSR chapter
found this session very useful, and the specific recommendations that came out
of the workshop will be put into practice throughout the CPSR program. People
interested in details of the workshop can contact the National Office for more
information.

For most of the afternoon, attendees broke up into smaller workshops on specific
topics. The workshops were on Computers and the Environment, International
Security, Privacy and Civil Liberties, Education, and Computers and the
Workplace. The workplace, education, and privacy workshops attracted the most
interest. People in the workshops discussed ongoing projects and new
initiatives, and shared resources and information that will help CPSR's work in
these areas.

Closing Plenary

CPSR President Terry Winograd attempted to sum up the meeting and provide a long-range
perspective on the CPSR program. Much of the early work of CPSR has focused on
avoiding imminent military and social dangers arising from computer technology.
Now, as the organization matures and the world enters a new period of
international thaw, we need to take a larger view: if we manage to avoid the
rocks in our path, where are we going? Areas CPSR should now be investigating
include: What do computer science and social responsibility have to do with
democracy? Can CPSR help push for better public access to information? What
might "electronic democracy" mean? There is important work being done by the
Computers in the Workplace project and on the subject of "participatory design."
CPSR has a chance to shape national policy on investment in computing. Here, for
a change, is a chance to argue for things that the federal government should be
doing, such as the proposed supercomputer network. Can educational software be
built to reinforce democratic values? How do we promote ethics and principles of
responsibility in computer science?

Winograd said that CPSR has a new challenge: to take advantage of the new,
historic changes going on in the world and to create new opportunities for using
computers to benefit people everywhere.

1990 CPSR Annual Meeting in Palo Alto

The 1990 CPSR Annual Meeting will return to Palo Alto, California, and Stanford
University on October 20 and 21. The meeting will be held in Kresge Auditorium
on the Stanford campus. The Saturday and Sunday meeting will follow the same
format as previous meetings, with a day of nationally- known speakers followed
by a day of workshops and group discussions. There will be a banquet on Saturday
night, at which the Norbert Wiener Award will again be presented. Tentative
topics to be addressed at the meeting include the image of computers in the
popular media and women in computer science.


Letter to the Editor

Dear Editor:

Bill Sulzman's report on the National Test Facility in the Summer 1989 issue of
The CPSR Newsletter suffers from ideological distortion so extreme as to make it
useless. According to its name, CPSR should be both professional and
responsible. Mr. Sulzman has been neither.

One of his more outrageous claims is that "There is no longer any attempt to
represent the SDI as anything other than a significant component of U.S. nuclear
war fighting capability." A responsible and professional assessment would know
that there is indeed extensive effort to portray SDI as a separate program with
goals quite distinct from those Mr. Sulzman declares. For one thing, the SDI
Organization, based in the Pentagon, is headed by a three star general. Colonel
Leib, National Test Bed director, is the only source cited by Sulzman; that
officer is at least three rank levels below the place were SDI policy is
established. Had Sulzman seen fit to inspect such readily available unclassified
sources as the SDIO Report to the Congress, he would have seen that his
characterization is ridiculous.

One does not have to believe everything in the Report to the Congress or the
numerous other SDI official sources. They are certainly self-serving. But it is
irresponsible to ignore them or deny them with a cavalier dismissal.

Sulzman continues with a question that implies any completed computer-supported system that is less than perfect must be a "colossal failure." Colonel Leib dissents from the conclusion but not the premise. Sulzman rejects the response by describing it as "the flawed comparison with . . . the Apollo program." He
offers no discussion to support the key word "flawed." This reader does not find
the comparison defective. If Sulzman does, I would be glad to consider his
reasons; but he gives none. In my professional experience, there is no large
computer-supported system whatever that always works perfectly. The description
"it almost works" applies to all of them. And surely some have been colossal
failures, but by no means all.

Sulzman continues with some unsupported comments on oversight. The official SDI
oversight process is described as "overmatched by SDIO's many supporters," with
government overseers "scrambling to come up with the right questions to ask."
But his own examples in the very next sentence are the General Accounting Office and Office of Technology Assessment reviews that have been critical of the SDIO. Where is the "scrambling" Sulzman describes?

The SDI concept has weaknesses and deficiencies that can be detailed and
documented. It is a disservice to the movement of social responsibility to
replace professional reporting and assessment with contempt, ridicule,
mudslinging, and sloganeering. I hope that in the future the editors will demand
higher standards of authors.

B.L. Schwartz, Arlington, VA

Dr. Schwartz is employed by a contractor that provides system engineering and
technical assistance to the SDIO. He contributed to the writing of the 1989 SDIO
Report to the Congress, and he is currently a member of the team writing the
1990 version.

Bill Sulzman replies:

Dr. Schwartz's criticisms of my "useless" article can be readily addressed by the facts, although it is true that I have certain beliefs which underlie my
position on this or any other subject, as he does.

The SDI is part of U.S. nuclear war-planning and strategy. Vice President Dan
Quayle is only the latest in a long line of public officials to admit that the
original Reagan position on the SDI as a replacement for nuclear war was
political rhetoric. The war gaming at the National Test Bed is in line with what
many recent Secretaries of Defense have said about the SDI. The facility's next
big simulation in January, 1990, includes the Strategic Air Command and is
described by NTB Director Colonel Leib as an attempt to see how offensive and
defensive war scenarios fit together. (Colorado Springs Gazette, November 25,
1989.)

The OTA assessment of the SDI is the source of the phrase I used, "colossal
failure."

The comparison of the SDI with the Apollo missions is flawed. The moon did not
employ men with hostile intentions or sophisticated weapons, it was incapable of
changing its own orbit, and the Apollo spacecraft did not have to function
in the midst of a nuclear war.

I stand by my statement about the oversight process. I have had numerous
conversations with congressional aides who work for key legislators. None of
them knew as much as I do about the SDI. They are sure of only one thing, and
that is that SDI spending has to be watched very closely.


A Real-Life Computer Detective Thriller
Jim Gawn, CPSR/Philadelphia
Book Review

The Cuckoo's Egg: Tracking a Spy Through the Maze of Computer Espionage, by
Clifford Stoll, Doubleday, 1989. 326 pages, $19.95, hardcover.

In August, 1986, Cliff Stoll, an astronomer who had been reassigned to a systems programming job at Lawrence Berkeley Labs (LBL), set out
to track down a 75 cent discrepancy in his UNIX accounts. The investigation was
not completed in the couple of hours that he expected. It took up most of his
work and his life for the next two years, and led to five West Germans being
charged with espionage by their government, and to a Pittsburgh resident
revealing himself as an agent for the Eastern bloc. In the process, to his own
surprise, Stoll became a self-taught expert on computer security.

He found that the discrepancy had been caused by misuse of a computer by someone
logged in from a remote site, someone who was taking care to cover up his
tracks. Had this specific accounting error not been spotted, it is likely that
the intruder would not have been detected. As it was, Stoll thought it expedient to place wiretaps on his system's dial-in lines, rather than use
trace facilities in the operating system that might be noticed by the person
being observed. He then watched for many months, increasingly concerned about
the nature of the intrusions, and increasingly determined to track down the
culprit.

For this was not someone hacking and cracking just for a lark. Both on the LBL
computers, and on many others accessed via them, the intruder was establishing a
pattern of using loopholes to log in as the "superuser," or system manager,
finding valid but dormant user-IDs, granting them superuser authority, and then
using these IDs to search for interesting files throughout the system. And what
was apparently interesting to him was defense work. Most of the machines he
cracked were on military bases, at defense contractors, or at universities with
significant defense involvement. The files he printed or copied onto his own
system were those that contained military information or that might lead him to
it.

Whoever was doing it, it became clear to Stoll that he was an able systems
programmer, familiar with VMS and some dialects of UNIX, but a methodical
plodder rather than a hacking genius. The person clearly was keeping systematic
notes on his progress, and was working with evident purpose and commitment.

Stoll did not at first observe comparable purpose and commitment in the security
and law-enforcement agencies of the U.S. government. Numerous times his efforts
nearly foundered because no agency was willing to take responsibility for
coordinating an investigation. "It's important, but it's not my bailiwick," was the refrain. In particular, it took a long time
before telephone line traces were authorized. But a number of individuals in the
lower and middle ranks of several security agencies and corporations gave him
help and eventually persuaded their superiors to take action, and it was
established that the intrusions were coming from, or through, West Germany. Then
began another round of bureaucratic fumbling, this time at the international
level. Eventually, however, five German men were charged with espionage.

This book has several themes. In addition to the straight account of how he
tracked the intruder, Stoll discusses the effect it had on his world view.
Formerly, he had been fuzzily anti-establishment, and rather complacent
politically; but the experience of cooperating with police, security and
intelligence authorities, and of trying to get them to cooperate with him,
together with his concern at the ease with which the intruder gained access to
sensitive information, changed his stand. He is still far from being a
right-wing "my-country-right-or-wrong" type, but he now views the security and
military services as containing many intelligent, humane people dedicated to
doing a good job and concerned about the results of their actions. He believes
that much of what they do is valuable, but that they also have the potential for
wrongdoing. Most of all, he sees them as severely hampered by bureaucracy and
concerns over turf.

Stoll believes in the responsibility of the individual not only to act ethically
but to ensure that others act ethically; and he is more aware of the common
interests of all political factions in a free society:

Now, after sliding down this Alice-in-Wonderland hole, I find the political left
and right reconciled in their mutual dependency on computers. The right sees
computer

continued on page 14

13


Volume 7, No. 4 The CPSR Newsletter Fall 1989

Cuckoo's Egg continued from page 13

security as necessary to protect national secrets; my leftie friends worry about
an invasion of their privacy when prowlers pilfer data banks. Political
centrists realize that insecure computers cost money when their data is
exploited by others.

One of several themes is how Stoll was able to apply his scientific training to
the detective work involved in this project--not just the formal training, but
also the political maneuvering and the people-handling, the back-door routes to
information and decision-makers. In applying such skills, Stoll himself skated
on some thin ethical ice a couple of times. He does not deal fully enough with
this issue. It was surely justified that he made up large amounts of spurious
electronic mail about a fictitious SDI NET. This encouraged the intruder to stay
logged into Stoll's system for long periods, thus making him easier to trace. It
also led to a spy--or spy's stooge--in Pittsburgh blowing his cover by writing in
for SDI NET information that was supposedly available to the public.

But was Stoll justified in lying to a telephone billing clerk in order to find
out the identity of the subscriber with a certain telephone number? Sure, it
wasn't a very big lie. Sure, it was for a good cause, and the information he
gained was instrumental in continuing the hunt. But do honorable ends justify
deceptive means? In this case, I'm really not certain. However, I am sure that
Stoll would show himself in a better light were he to acknowledge such concerns
rather than appear, as he does, smug about having beaten the system.

That one concern aside, Clifford Stoll has done the computing community a
service in bringing to light the reality of computer vulnerability. He describes
the specific weaknesses in software and in computer operations practices that
made it possible for one person to dig so far into so many systems. He hopes
that by reducing the security vulnerabilities, we can keep our networks open and
avoid onerous restrictions that would reduce their utility and hamper the open
exchange of ideas and information.

Stoll's narrative style displays some of the breathless detachment that I
associate with ghostwriters (he does acknowledge the heavy involvement of an
editor in putting the book together). Nonetheless, it is a gripping book, well
worth reading both as a detective story and as a cautionary tale.

Jim Gawn is administrative applications manager in the Computing and Information
Technologies Center of Millersville University in Lancaster, Pennsylvania.

Me and My Data Shadow Paul Hyland--CPSR/Washington, D.C. Book Review

Privacy in America: Is Your Private Life in the Public Eye? David F. Linowes,
University of Illinois Press, 1989. 190 pages, $19.95, hardcover.

A woman who was turned down for several government jobs discovered that the
reason was a note from her grammar school records, in which a teacher had
carelessly noted that the woman's mother was crazy. A man who had been suffering
from depression finally went to see a psychiatrist for treatment; his personnel
office told him that the insurance claim would be kept confidential, but his
boss ultimately found out and the man was fired. A journalist moved to a new
city to accept a promotion and his new automobile insurance policy was cancelled
soon after it had been set up; even though he had an excellent driving record, a
consumer credit report contained false statements made by a disgruntled
ex-neighbor, statements which proved difficult to correct.

In his book, Privacy in America, David Linowes provides a thorough survey of
various threats to privacy posed by modern society, primarily, but not
exclusively, the result of advances in computer technology. Early in the book,
he exaggerates somewhat the potential for the advancement of this technology,
making grandiose statements such as, "Technology we currently have in place is
at least a hundred years behind what has already been developed." In another
place he claims that computers will replace twenty-five million out of
twenty-eight million manufacturing jobs. Technically oriented readers should not
be deterred by these gee-whiz prognostications; they quickly give way to more
earth-bound analyses of very real threats posed by the automation of personal
data handling by a wide variety of organizations. If the American people knew
the extent to which the details of their lives were readily accessible, they
would be shocked and angered and just might try to do something about it.

This type of information has been collected for many years, but recent
technological advances have multiplied the threat. "In the past, bureaucratic
and physical limitations have been unwitting protectors of privacy." Before
computerization, searching for and transferring personal data was an expensive
and time-consuming process, making such data less readily available, especially
outside the organization that had collected it. Moreover, due to space
limitations there was generally less data collected,

continued on page 15

14


Volume 7, No. 4 The CPSR Newsletter Fall 1989

Privacy in America continued from page 14

and it was often destroyed "within a reasonable time after use." Now, with the
costs of storage and transmission of computer data moving steadily downward,
such constraints don't play nearly the role that they once did.

Thus computerized personal data is accessible to many more people and used for
many more purposes than ever before. Some of these are perfectly acceptable,
routine uses by which few people would feel threatened, and which probably make
society function more
efficiently. Other uses of personal data would probably surprise most people,
like the level of detail about their economic activity that is sold for the
purposes of targeted marketing. And finally, there are many ways that this data
accessibility abets illegitimate access, prohibited either by laws or by the
rules of the organization controlling the data. For example, because a
surprising number of people often have access to medical records in the course
of doing their jobs, a patient might have a more difficult time seeing his or
her records than a resourceful and/or unscrupulous private investigator. And
most people probably have a hard time turning down a request for personal
information made by law enforcement officers, whether or not these officers have
legitimate access to that information.

Privacy in the workplace is one area that Linowes examines especially closely.
As recently as thirty years ago, employers collected and kept little information
on their employees. Today, however, this has changed dramatically. More
information is required by government agencies--from the Departments of Defense
and Justice to the Occupational Safety and Health Administration (OSHA) and the
Equal Employment Opportunity Commission (EEOC) to state and local governments.
Information is also required to administer benefits programs. These appetites
for information will only grow in the future, and the computer's contribution is
to enhance the ability to collect, maintain, and distribute the data.
Unfortunately, few companies seem to care about the proper use and protection of
personal data. Exceptions include IBM and the Ford Motor Company, which modeled
their employee protection rules after the Code of Fair Information Practices
(see box on page 16). The University of Illinois conducted a survey of some of
the largest industrial corporations in America to determine the extent of real
and potential threats to privacy in the workplace, as well as management
awareness of and attention to employee rights. The results of this comprehensive
survey are fully reported in the book, along with suggestions for remedies to
problems created by these threats.

Other intrusions on privacy made possible by technology that receive some
attention in the book are the use of polygraph testing, urinalysis, and similar
tests to screen applicants and investigate wrongdoing by employees. Linowes also
briefly discusses the use of various forms of
employee surveillance enabled by advances in computer and telecommunication
technologies, from electronically monitoring the number of keystrokes typed or
the length of phone calls, to secretly listening to calls or "watching" computer
screens, to broadcasting subliminal messages to employees. This growing use of
electronic monitoring and control is thought to cause stress and is
depersonalizing because it measures human performance as if people were
machines. (See Karen Nussbaum's talk from the Annual Meeting on page 1--ed.)
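
To make the first of these techniques concrete, keystroke monitoring amounts to
little more than counting timestamped events per worker. The fragment below is a
minimal Python sketch of that tally against a hypothetical log format; it is
meant only to show how little machinery such monitoring requires, and does not
represent any actual monitoring product.

```python
# Minimal sketch of keystroke-rate monitoring. The log format (one timestamped
# event per keystroke, tagged with a worker ID) is hypothetical.

from collections import defaultdict
from datetime import datetime

keystroke_log = [
    ("operator_17", datetime(1989, 10, 20, 9, 0, 1)),
    ("operator_17", datetime(1989, 10, 20, 9, 0, 40)),
    ("operator_23", datetime(1989, 10, 20, 9, 1, 5)),
]

def keystrokes_per_minute(log):
    """Count keystroke events for each (worker, minute) pair."""
    counts = defaultdict(int)
    for worker, stamp in log:
        counts[(worker, stamp.replace(second=0, microsecond=0))] += 1
    return counts

if __name__ == "__main__":
    for (worker, minute), n in sorted(keystrokes_per_minute(keystroke_log).items()):
        print(worker, minute.strftime("%H:%M"), n)
```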

The Federal government is often thought of as inefficient and inept; however,
these characterizations do not necessarily apply to government data collection.
The government's appetite and capacity for information are unparalleled. It is
estimated that the number
of computers operated by the government will soon top one million, and seven
years ago there were over 3.5 billion personal files, or an average of 15 files
for each individual in the country. Most Americans have no idea of the extent of
these files, and even fewer are aware that they have the right to examine any
government file pertaining to them, with few exceptions, and that they can
request amendment of information that they think is in error. Linowes provides
useful information on how to make these requests, based upon either the Privacy
Act of 1974 or the Freedom of Information Act.

One of the most important tenets of fair information practice is that
information should not be used for purposes other than those for which it is
originally collected; language similar to this appears in the Privacy Act.
However, there are exceptions in the Act for "routine use" or use "as required
by law," that have been interpreted in such a way

continued on page 16

15


Volume 7, No. 4 The CPSR Newsletter Fall 1989

Privacy in America continued from page 15

to make this protection essentially meaningless. In the pursuit of government
efficiency and fraud detection, a new phenomenon called computer matching has
arisen, whereby government agencies share tapes of personal information and use
them to catch welfare cheats, benefit double-dippers, tax evaders, or draft
dodgers. While the goals of such a program might be desirable, do we want the
federal government using this information in such a manner, checking bank,
drivers' license, and employment records? This constitutes a general search
through data on everyone, without any
suspicion of guilt, on the assumption that something might turn up; just such a
"fishing expedition" has been prohibited by the courts in the case of physical
searches, based on the Fourth Amendment prohibition of unreasonable searches and
seizures.
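
In computational terms, a matching run is essentially an exact join on a shared
identifier such as a Social Security number. The fragment below is a toy Python
illustration of that idea; the field names and records are invented for this
review and do not come from Linowes's book or any agency system.

```python
# Toy illustration of "computer matching": an exact join on a shared identifier.
# All field names and records here are invented.

welfare_rolls = [
    {"ssn": "123-45-6789", "name": "A. Example"},
    {"ssn": "987-65-4321", "name": "B. Sample"},
]

state_payroll = [
    {"ssn": "123-45-6789", "employer": "Agency X", "annual_pay": 31000},
]

def match_records(rolls, payroll):
    """Flag anyone who appears on both lists -- the essence of a matching run."""
    by_ssn = {rec["ssn"]: rec for rec in payroll}
    return [
        {**person, **by_ssn[person["ssn"]]}
        for person in rolls
        if person["ssn"] in by_ssn
    ]

if __name__ == "__main__":
    for hit in match_records(welfare_rolls, state_payroll):
        # Each hit is produced without any individualized suspicion --
        # the "fishing expedition" concern raised in the review.
        print(hit)
```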

In some cases the government extends its reach even further. The Internal
Revenue Service has purchased mass-marketing mailing list information to trigger
investigations for tax fraud. Intelligence agencies regularly intercept mail,
telephone, and other forms of communications between the U.S. and other
countries, and obviously have the capability to do so domestically as well. The
extent of such domestic surveillance is supposed to be very limited, but how can
we know for sure?

It is an obvious invasion of privacy to actually intercept communications--to
read someone's mail or listen to their phone calls. But significant information
about a person can be obtained just by knowing who they communicated with and
when. Even with the proper authorization, such information is generally
available to law enforcement authorities without probable cause.

The Code of Fair Information Practices

In 1973 the Secretary's Advisory Committee on Automated Personal Data Systems
for the U.S. Department of Health, Education and Welfare recommended the
adoption of a "Code of Fair Information Practices" to secure the privacy and
rights of citizens. The code is based on five principles:

* There must be no personal data record-keeping system whose very existence is
secret;

* There must be a way for a person to find out what information about the person
is in a record and how it is used;

* There must be a way for a person to prevent information about the person that
was obtained for one purpose from being used or made available for other
purposes without the person's consent;

* There must be a way for a person to correct or amend a record of identifiable
information about the person;

* Any organization creating, maintaining, using, or disseminating records of
identifiable personal data must assure the reliability of the data for their
intended use and must take precautions to prevent misuses of the data.

Even more revealing about a person is a record of financial transactions.
The temptation is often so great for law enforcement personnel to obtain such
information that they try to bend the rules to get it. Banks, credit card
companies, and the like are often happy to cooperate with the authorities, or
are possibly unfamiliar with their ability or obligation to protect a customer's
privacy by refusing an improper request. With the increasing computerization of
financial data, the threats are growing; just look at the potential created by
point-of-sale electronic funds transfer for recording electronically more detail
than ever about a person's financial transactions.

The government is not the only organization with a voracious appetite for
personal information. Private organizations include employers, as discussed
above, credit grantors and insurance companies, as well as the service
organizations that compile and maintain personal information for them, including
credit bureaus like TRW and Equifax, the Medical Information Bureau (an industry
association that serves as
a clearinghouse for medical information), tenant screening agencies, and others.
The customers of these services are not the individual subjects of their files;
they are organizations looking for data on individuals. Therefore, these
services are not necessarily as careful about the quality of data on individuals
as they would be about aggregate data quality (or simply quantity), nor are they
likely to be responsive to claims from individuals regarding inaccuracies. The
Fair Credit Reporting Act gives individuals some recourse if they believe they
have been wronged by a consumer credit bureau, but the provisions of the Act are
not always that easy to use, and they only apply to credit bureaus.

continued on page 17

16


Volume 7, No. 4 The CPSR Newsletter Fall 1989

Privacy in America continued from page 16

Linowes aptly quotes Alexander Solzhenitsyn: "As every man goes through life he
fills in a number of forms for the record, each containing a number of
questions.... There are thus hundreds of little threads radiating from every
man, millions of threads in all.... They are not visible, they are not
material.... Each man, permanently aware of his own invisible threads, naturally
develops a respect for the people that manipulate the threads." The real threat
may not result from any one inappropriately available piece of information about
an individual, but from the whole mosaic, the fairly complete picture of a
person that is there for the taking to anyone with requisite access, power, or
money. The ability of the large organizations, both public and private, that
control society to enforce homogeneity and identify and neutralize boat-rockers
and rabble-rousers opens up new possibilities for social control not unlike
those seen in George Orwell's 1984. Linowes touches on this subject but does not
go far enough; he also dismisses it as theoretical, whereas I think pieces of it
are very real.

Linowes does, however, outline ways that you as an individual can protect your
privacy. At the end of the book, he lists thirteen "Basic Guidelines for
Protection of your Personal Privacy," including things like being careful to
whom you disclose personal information, disclosing only what is necessary, doing
whatever possible to limit its further dissemination, and being very careful
when signing waivers that they do not grant an organization too much power to
collect, store, and share information about you. He also discusses current laws
governing information privacy and suggests rules that organizations should
follow in the absence of such laws. However, without political pressure, these
changes might occur more slowly and be less universal than they should. CPSR
participates in the policy debate in Washington to the extent possible;
concerned citizens everywhere could help spread the word by being vigilant and
vocal about protecting their own privacy, telling others of their concern, or
even writing letters to the editor and alerting the media to instances of
privacy abuse. If people knew the extent of the growing threats to their
privacy, they would not be so complacent. If enough people speak out, their
political representatives might have to listen.

Paul Hyland is an assistant engineer working for IBM in Rockville, Maryland, and
he works on image storage and distribution for manufacturing applications. He
was recently elected chair of the Washington, D.C., chapter.

A Darker Side of the Chip Lenny Siegel--CPSR/Palo Alto Book Review

Behind the Silicon Curtain: The Seductions of Work in a Lonely Era, by Dennis
Hayes, Boston: South End Press, 1989, 159 pp. (215 with notes), $10 paperback.

When my book came out a few years back, it was subtitled, The Dark Side of the
Chip. [Lenny Siegel and John Markoff, The High Cost of High Tech: The Dark Side
of the Chip, 1985--ed.] Dennis Hayes' book perhaps should be called The Darker
Side of the Chip. Behind the Silicon Curtain effectively and entertainingly
peeks behind the media hype that has portrayed Silicon Valley as the home of New
Age capitalism where technology has transcended social ills and human conflict.

However, for those readers who already know the down side of high technology,
Hayes' view of "the future" is so negative that it provides few tools to
challenge, or even understand, the dynamics of an area built around the computer
and chip industries.

Attacking what he considers the "profound loneliness" of the high-tech
workplace, Hayes calls the fabric cubicle partition, ubiquitous in Silicon
Valley offices, labs, and even many factories, the "emblem for the transience of
[Valley] workers." It's a telling view of Silicon Valley life, one which
participants rarely stop to ponder. Valley employers offer their employees, even
those with professional degrees or skills, little loyalty, and they expect
little loyalty in return. Companies rely upon pools of permanent "temporaries"
to meet changing demand. Production workers, particularly the large fraction of
undocumented immigrants, are expendable.

Transience is the flip side of the dynamism chronicled by the Valley's
corporate apologists. It is a side too often ignored, but Hayes dismisses what
to me appears a natural tension: to avoid stagnation, any culture must accept
instability.

Hayes properly rejects the oft-heard contention that temporary cubicles
symbolize democratic, or at least egalitarian work styles. But he also points
out that they rise to the height of "stockyard pens." It's a provocative figure
of speech, and I consider it a cheap shot. The transience of Silicon Valley life
gives workers more alternatives than normal, whether or not they are headed
toward certain occupational oblivion.

continued on page 18

17


Volume 7, No. 4 The CPSR Newsletter Fall 1989

Silicon Curtain continued from page 17

Hayes describes the widely known corporate cultures of Silicon Valley--the ROLM
Philosophy, Intel Culture, and the H-P Way--and calls "corporate culture . . . a
fig leaf hiding the transience, stress, and loneliness that work created in
people's lives." Clearly the paternalism of the Valley's big name companies is
oversold, but Hayes fails to explore the significance of corporate-centered
lives. Using Apple Computer's fitness center and eating company popcorn is not
the same as shopping in a mining company store and living in a company-owned
shack. In a new area with few established community institutions, large
employers do substitute for social clubs, political parties, unions, and even
churches. Should employer-centered life be rejected outright, as Hayes appears
to suggest? Or should it be changed?

Hayes, unfortunately, is selectively negative. This struck me first when he made
a minor comment about the city-owned amusement park in Santa Clara, Great
America. Despite its pretentious name, my wife and I take our two kids there
more than a dozen times each summer. (We buy season passes.) In general, we find
it a relaxing place to exercise the kids, with fewer hassles, shorter lines, and
less creativity than Disneyland. Some teenagers act rowdy there, but the park
offers "good, clean fun," with no sign of gangs or racial hostility.

Yet Hayes writes, ``For lack of summer recreation programs, the shopping malls
and the Valley's amusement theme park . . . teemed with adolescents squandering
their allowances, seeking adventure and intrigue in the clashing gangs and
cliques of their respective ethnic and class subcultures."

Unless you've never read anything else about military technology, Hayes' section
on high-tech weapons is not too useful. I am by no means a supporter of the
Pentagon's role in Silicon Valley industryÑI've attacked it in writing and in
person for 23 years. But Hayes says, "Estimates of the livelihoods that depend
on military spending in the Valley run as high as 50 per cent." This may have
been true 23 years ago, but the Pentagon and its contractors have fallen behind,
not only in employment, but in market share, technological hegemony, and
political influence. Today, it is more important to understand why so much of
high tech has moved away from
warfare, than to belabor how some workers continue to be enmeshed in the arms
race.

Finally, Hayes devotes a chapter to what he calls "Psychotherapeutico": the use
of exercise, shopping, and even peace activism to escape the loneliness of
Silicon Valley life. It's a good effort to identify some of the common excesses
of modern-day behavior, but it falls short in two important areas. First, it
isn't clear at what point fitness or consumption becomes excessive.

Secondly, I don't believe Silicon Valley is responsible for the excesses. I know
people who use shopping and exercise the way others use religion, but most of
them have little to do with high tech. It's a good topic for a book on its own,
but I don't see that it has much to do with Silicon Valley.

The section on "Political Disarmament," situated in the "Psychotherapeutico"
chapter, seems to suggest that high tech peace activists use their politics to
escape the emotional wasteland of industrial life. Indeed, it isn't too hard to
criticize the somewhat metaphysical ideology of Beyond War, an upper
middle-class peace cult with its roots in the Valley, but Hayes seems reluctant
to credit Beyond War for the dedicated educational work of its members, many of
whom go far beyond the official ideology of the organization.

He even deigns to mention briefly Computer Professionals for Social
Responsibility, suggesting that CPSR focuses too narrowly on "the vulnerable
technical premises of Star Wars." He says that CPSR, like Physicians for Social
Responsibility, "too often retreats to the gloom of unfolding weapons and
surveillance technology." It's easy for me, as a CPSR member, to reject his
critique outright, for it isn't hard to find CPSR members who participate in a
wide range of public interest activities, only a few of which are related to
their technical interests.

More important, he misses the point. Just as the Manhattan Project produced a
generation of physicists critical of the nuclear arms race, so the development
of computerized warfare, particularly since the Strategic Computing Program
tried to turn computer scientists into weapons designers, has stimulated the
growth of groups such as CPSR. Do high tech peace activists represent a small
fraction of the Silicon Valley professional workforce, as Hayes argues, or

continued on page 22

18


Volume 7, No. 4 The CPSR Newsletter Fall 1989
Washington Update

According to Government Computer News, Congressman Neal Smith (D-IA) is more
concerned about settling a political score with the National Institute of
Standards and Technology than with federal computer security. In last-minute
budget maneuvers, Smith cut NIST's funding to implement the Computer Security
Act from $6 million to $2.5 million, even after the Office of Management and
Budget had given NIST a green light for the funding request. Smith's beef with
NIST grows out of a battle last year over his pork-barrel crusade to redirect
NIST funding to Iowa State University's School of Science and Technology.
Having lost that campaign, Smith can claim partial victory now that he has
successfully cut funding for computer security across the federal government.

Information Access Policy on Hold--The deadline for authorizing the agency
responsible for information policy passed on September 30 with no action by
Congress. The delay is unfortunate--it means that the Office of Information and
Regulatory Affairs continues to operate without clear statutory authority.
Congress is likely to resolve OIRA's future when it returns in early January.

Morris Case Goes to Trial--Thomas A. Guidoboni, attorney for Robert Morris, Jr.,
the graduate student responsible for the November 1988 Internet virus, moved to
dismiss the charges against his client, arguing that the government prejudiced
its case with news disclosures, that the indictment was insufficient and vague,
and that the 1986 Computer Fraud and Abuse Act under which Mr. Morris is charged
is unconstitutionally vague. Judge Munson turned down the motion and the case
will go to trial.

Data Protection Board in the U.S.--A bill to establish a Federal Data Protection
Board has been introduced by Representative Bob Wise (D-WV). Under the Data
Protection Act of 1989, a Data Protection Board will monitor the collection,
use, and dissemination of personal information. Many countries, notably France,
Canada, and West Germany, have such agencies, but plans for a similar office in
the United States have been on hold since 1974 when opponents of the Privacy Act
knocked out the planned agency. Privacy advocates continue to urge the creation
of an independent commission to highlight privacy abuses.

NCIC Back in the News--A proposal to provide point-of-sale verification for gun
purchases has been shelved by Attorney General Richard Thornburgh. Citing major
gaps in the criminal conviction records in the National Crime Information
Center, the FBI's criminal justice records system, the Attorney General reported
to Congress that it was not currently feasible to implement such a system. The
news slows the plans to provide on-line access for non-criminal justice use of
the NCIC, though it is likely to increase pressure to automate fingerprint
records and eventually to provide fingerprint verification systems to gun
dealers across the country, estimated to number over 275,000.

Ethics at the NCSC Conference--Several CPSR members spoke at the twelfth
National Computer Security Conference on issues relating to ethics and civil
liberties. This was the first NCSC conference with a track devoted explicitly to
ethical questions in the use of computers.

House Report on Software Risks--A report by the staff of a House science
subcommittee, "Bugs in the Program," highlights problems with software
development and recommends congressional action to improve acquisition and
development. Among the findings: that agencies must test software far more
rigorously than is current practice and that problems associated with risk and
reliability, as well as cost and development time, are routinely underestimated.

Hearing on Computer Viruses--Representative Charles Schumer (D-NY) held hearings
on two bills introduced in the House of Representatives on computer viruses.
CPSR Washington office director Marc Rotenberg and CPSR member Lance Hoffman
testified at the hearing and recommended that Congress not rush to pass new
legislation without more information about the adequacy of current law. The
Justice Department, citing the pending case against Robert Morris, Jr., declined
to testify.

Worker Authorization Card--In an effort to implement certain provisions of the
1986 Immigration Reform and Control Act, a House subcommittee is considering
the development of a worker authorization card. While taking no position on the
pending legislation, CPSR sent a letter to Representative Bruce Morrison,
chairman of the House Judiciary Subcommittee on Immigration, recommending that
the Subcommittee examine more closely the possible privacy implications of such
an identification card. The letter expresses concerns about the poor data
quality of the Immigration and Naturalization Service records system as well as
the possibility that a worker authorization card could be a step toward a
national identification card.

SDI Funding--In a mostly symbolic vote, Congress has cut funding for SDI to
about $3.8 billion, slightly below last year's level. This first real cut in SDI
funding puts the brakes on the program, but also suggests the hard work that
lies ahead for legislators seeking to reduce the $300 billion annual defense
budget.

--Marc Rotenberg

19


Volume 7, No. 4 The CPSR Newsletter Fall 1989

James Martin

James Martin Elected to CPSR Advisory Board

The CPSR Board of Directors has elected James Martin to the CPSR National
Advisory Board. Martin is probably the world's best-known author of books
relating to computers and society, having published over 70 best-sellers. His
most famous books are The Wired Society, for which he received a nomination for
the Pulitzer Prize, and An Information Systems Manifesto. Martin is also a
columnist for PC Week magazine.

Martin is also famous internationally for his seminars on information systems.
He has lectured before thousands of MIS professionals around the world, and his
seminars always include a section on the social impact of developments in
computing technology, and the role of social responsibility in the profession.

Martin has a strong commitment to international peace and nuclear disarmament.
One of his most recent books is Preventing the Unthinkable: A Tutorial on
Nuclear Warfare and the Future, published by Savant Research Studies. This
612-page book may be the most complete and exhaustive book on nuclear weapons
issues available. The theme of the book is preventing nuclear war, and it
features a compelling section on "morality, conscience, and laws" that stresses
individual commitment to ending the threat of nuclear war.

Martin received an M.A. in Physics from Oxford University, and he has received
an honorary doctorate from Salford University for his contribution to
information systems education. He worked for IBM for 19 years. In addition to
writing books and conducting seminars, Martin has consulted with AT&T, IBM,
Honeywell, Texas Instruments, GTE, DEC, ICL, Xerox, and many other firms. He has
also been a consultant to the British government on telecommunications and the
postal system. He resides in Vermont and Bermuda.

The CPSR National Advisory Board is made up of people the Board of Directors
asks to advise the organization on its direction, character, and program. The
National Advisory Board is composed of people who are famous for their
commitment to socially responsible uses of technology and who have a strong
interest in the growth of CPSR. The Board includes five Turing Award winners,
one Nobel laureate, a member of Congress, and a retired four-star admiral, as
well as several people world famous in the computing profession.

CPSR Testifies in House Computer Virus Hearing

On November 8, 1989, the House Judiciary Subcommittee on Criminal Justice held a
hearing on computer virus legislation. Two bills introduced in the House that
would extend criminal penalties to computer viruses were the subject of the
hearing. CPSR Washington Office Director Marc Rotenberg and CPSR member Lance
Hoffman appeared as witnesses.

The hearing was chaired by Congressman Charles E. Schumer (D-NY). Also present
during various portions of the hearing were committee members George E.
Sangmeister (D-IL), George W. Gekas (R-PA), and Howard Coble (R-NC).

Mr. Schumer began the hearing by outlining some of the risks associated with
computer viruses and said that the purpose of the hearing was to determine
whether new legislation was necessary. Mr. Schumer noted that the Department of
Justice had been invited to testify, but chose to decline because of the pending
criminal charges against Robert T. Morris, Jr. The Justice Department said that
it might prejudice the government's case if it were to testify on the pending
legislation.

The following is a transcript of the oral testimony presented at the hearing by
Marc Rotenberg:

continued on page 21

20


Volume 7, No. 4 The CPSR Newsletter Fall 1989

Virus testimony continued from page 20

Mr. Chairman, members of the Committee, thank you for the opportunity to testify
today on legislation regarding computer viruses. It was just a year ago last
week that the Cornell "virus" swept through the Internet. For many people in
this country it was the first that they had heard of computer viruses and
similar programs that could bring a nation-wide computer system to a halt. In
Palo Alto, California, CPSR members met shortly after the Internet virus to
discuss the significance of the event. The discussion revealed many concerns
about network security, ethical accountability, and computer reliability. After
several days of debate, we issued a statement on the computer virus that has
been widely circulated in the computer community and republished in computer
journals. I have attached the CPSR statement to my testimony and ask that it
be entered into the hearing record.

On the issue of the culpability of the person responsible for the virus we said
clearly that the act was irresponsible and should not be condoned. The author of
the virus had treated the Internet as a laboratory for an untested experiment in
computer security. We felt this was very risky, regardless of whether data was
altered or destroyed.

But we did not view our task primarily as sitting in judgement over the author
of the Internet virus. There had been other viruses in the past, and there would
be more in the future. More important, we believed, was to set out the various
concerns of our organization for the public, policy makers, and others within
the profession who were examining the significance of the computer virus and
considering various responses.

* We emphasized individual accountability as the cornerstone of computer ethics.
We said that the openness of computer networks depends on the good will and good
sense of computer users. Criminal penalties may be appropriate for the most
pernicious acts of computer users. But for the vast majority of cases, far more
would be accomplished by encouraging appropriate ethical guidelines.

* We said that the incident underscored our society's growing dependence on
complex computer networks. Although the press and the public tended to focus on
the moral culpability of the virus writer, we believed that the incident also
raised significant policy questions about our reliance on computer systems,
particularly in the military, that are difficult to test and may produce
misplaced trust. There is little that tougher criminal penalties can do to
correct the problems of computer risk and reliability.

* We opposed efforts to restrict the exchange of information about the computer
virus. Shortly after the virus incident, officials at the National Security
Agency attempted to limit the spread of information about the computer virus and
urged Purdue University to destroy copies of the virus code. We thought this was
short-sighted. Since that time, several technical reports and the widespread
exchange of information through the Internet have helped users in the computer
community more fully understand how the virus operated and provided the
necessary data to correct security flaws. We continue to believe that the needs
of network users will be better served through the open and unrestricted
exchange of technical information.

I will describe some of the current concerns of the computer community and make
several recommendations about what Congress might do to respond to the problem
of computer viruses. At the outset, I should make one fundamental point: the
problems raised by computer viruses are far-reaching and complex. There is no
simple technical or legal solution. In many ways, we are confronting a whole new
series of policy questions that raise fundamental issues about privacy and
access, communications and accountability. Public policy must be brought up to
date with new technologies, but in the effort to ensure that our laws are
adequate, Congress should not reach too far or go off in directions that are
mistaken or may ultimately undermine the interests we seek to protect.

I should also say that part of the solution is to ensure adequate funding for
programs designed to enhance computer security, such as the Computer Security
Act, the law passed by Congress designed to address the computer security needs
of the federal agencies. I was very disturbed to learn that the conference
committee recently cut the proposed appropriation for the National Institute of
Standards and Technology (NIST) from $6 million to $2.5 million, even after the
Office of Management and Budget had approved the funding for NIST. This support
is dearly needed to assist federal agencies with computer security.

The last five years have been a period of rapid development in computer security
legislation. Congress has three times passed laws designed to extend criminal
statutes to computer technology. Virtually all of the states have adopted
computer crime statutes, and many are looking at possible

continued on page 22

21


Volume 7, No. 4 The CPSR Newsletter Fall 1989

Virus testimony continued from page 21

changes and additions.

Based on the views of CPSR members, the experience of the Internet virus, and
our general concern about protecting open computer networks, I will describe the
potential problems with the proposed federal legislation.

From a constitutional standpoint, it is first worth considering whether a
computer virus may also be a form of speech, as I believe the Aldus Peace
Virus was, and whether criminalizing such activities may run afoul of First
Amendment safeguards. Restrictions on speech should be carefully examined to
ensure that free expression is not suppressed. Computer networks are giving rise
to new forms of communication. The public debates in the town square of the
eighteenth century are now occurring on the computer networks that will take us
into the twenty-first century. These are fragile networks, and the customs and
rules are still evolving. The heavy hand of the government could be too great a
weight to carry for this new electronic democracy.

I wonder also if in casting such a broad net, these statutes might not meet
constitutional challenge on overbreadth grounds. A criminal law should clearly
distinguish between prohibited and permissible conduct. If it fails to do this,
it grants too much discretion to law enforcement officials to choose which cases
to prosecute. Where speech is involved, such a law might unnecessarily chill
protected speech.

A further problem lies in the attempt to define the criminal act in terms of a
technical phrase such as a "virus." A virus is not necessarily malicious. Some
viruses may only display a Christmas greeting and then disappear without a
trace. Other viruses might alter or destroy data on a disk. To treat the two
acts alike because an identical technique is involved would be like punishing
all users of cars because some cars might cause the death of a person.
It is the state of mind of the actor and the harm that results which should be
the two guiding principles for establishing criminal culpability.

More interesting from a technical viewpoint is that computer viruses may be used
both to enhance computer security and to facilitate the exchange of computer
information. Although computer security experts have said that such programs are
potentially as dangerous as the disease they are designed to cure, it is not
clear that disseminating a benign virus should necessarily be a criminal act.
Hebrew University used a computer virus to identify and delete a malicious virus
that would have destroyed data files across Israel if it had remained
undetected.

I would recommend that the Congress wait until there is more case law under the
1986 Act and until more of the state statutes have been tested before enacting
new computer security legislation. Congress should also obtain information from
the Justice Department about the effectiveness of the current laws, and see
whether state courts can develop common law analogies to prosecute the computer
equivalents of trespassing, breaking and entering, and stealing. This is a
process that happens gradually over time. The extension of common law crimes to
their computer equivalents may provide a more durable and lasting structure than
federal statutes that must be updated every couple of years.

Tougher criminal penalties may help discourage malicious computer activities
that threaten the security of computer networks, but they might also discourage
creative computer use that our country needs for technological growth. Though we
have a great deal of criminal law that could potentially apply to the acts of
computer users, it is still very early in the evolution of computer networks. In
the rush to criminalize the malicious acts of the few we may discourage the
beneficial acts of the many and saddle the new technology with more restrictions
than it can withstand.

Copies of the written testimony submitted by CPSR to the hearing record can be
obtained by sending $5 (for postage and handling) to the CPSR National Office at
P. O. Box 717, Palo Alto, CA 94302-0717.

Silicon Curtain continued from page 18

are they responsible for a shift in the prevailing direction in the use of
advanced technology?

Perhaps Hayes should have raised the dilemma that has faced CPSR, as well as
other activist groups organized along professional lines, since its inception:
are we an elite whose expertise entitles us to political influence beyond our
numbers? Or are we merely the technical arm of a much broader movement for peace
and social change?

I feel uncomfortable writing so critically about a writer whose basic premises I
share, who writes so clearly, and who documents his information. But unless one
has missed previous critiques of Silicon Valley, Behind the Silicon Curtain
offers little new understanding.

Lenny Siegel has been director of the Pacific Studies Center, a Silicon
Valley-based research center and library focusing on the impact of high
technology, since 1970.

22


Volume 7, No. 4 The CPSR Newsletter Fall 1989

From the Secretary's Desk Eric Roberts--National Secretary

With the new year fast approaching, it seems like a good time to look back on
our trials and our successes of the past year and offer something of a "CPSR
Year in Review."

In many ways, 1989 has been a banner year for the organization. One indicator of
our success is that we've gotten more press coverage than ever before, from
national media sources such as The New York Times, from professional journals
such as The Communications of the ACM, and in local papers in various parts of
the country.

Our most widely publicized undertaking was our report on the FBI plans to
upgrade the National Crime Information Center, which we prepared in response to
a request from a Congressional subcommittee. Our report raised several important
questions about the civil liberties implications of the FBI's proposal to
include a "tracking" facility, and our report was widely credited as the major
factor behind the FBI's decision to drop that proposal.

Our new office in Washington, D.C., has been of tremendous benefit to CPSR by
bringing us closer to the policymaking process. Washington Director Marc
Rotenberg has testified before Congress or submitted statements on a wide range
of issues including computer virus legislation, computer security law and the
role of the National Security Agency, privacy concerns for credit bureaus, data
security issues for the Health Care Finance Administration, and promoting public
access to information stored in federal computer systems.

Along with CPSR National Advisory Board member Sherry Turkle and 1988 Annual
Meeting panelist Esther Dyson, CPSR Executive Director Gary Chapman visited the
Soviet Union in April, meeting with Soviet computer scientists and programmers
to gain an understanding of how computers are affected by this era of glasnost
and perestroika. In September, I also had the opportunity to visit Moscow, and
it is exciting to recognize that the reduction in international tension has made
it possible for scientists in the two countries to work together as colleagues.

But, despite all of our successes, it has been a difficult year for CPSR,
particularly financially. For a long time, we have relied too heavily on
foundation grants to support our programs; after a basic startup period,
foundations expect organizations like CPSR to raise more of their funding from
their members. And when a grant we had anticipated failed to materialize in
June, CPSR fell into the most serious fiscal crisis we have had since our
founding. Thanks to the

excellent response from all our members to our crisis appeals, we have weathered
that particular storm. Even so, our long-term stability depends very much on
whether we can increase the percentage of our income that comes from our members
and supporters. While we have retained the $15 student/low-income and $40 basic
membership rates, we have established a new regular membership category at $75,
and we hope that members will be able to renew at this level so that we can
carry on our work--work that no other organization besides CPSR is doing.

Looking ahead, 1990 looks like an exciting year for CPSR as well. At the end of
March, the Computers in the Workplace Project will hold a conference on
participatory design in Seattle. Interest in this conference, both from the
United States and Europe, has been high, and we expect a large attendance. In
July, the third CPSR Directions and Implications of Advanced Computing symposium
will be held in Boston, and we have an impressive list of cosponsors lined up
for the event. And we are already preparing for the 1990 Annual Meeting here in
Palo Alto in October, which we hope will be the largest and most exciting yet.

DIAC-90 Call For Papers

The 1990 CPSR research conference, Directions and Implications of Advanced
Computing, will be held July 28, 1990, in Boston, Massachusetts. The conference
organizers are soliciting papers to be presented at the meeting, and for
publication in a planned anthology. The papers should address some significant
aspect of the social impact of computing. Broad topics include research
directions in computer science, defense applications, computing in a democratic
society, and computers used in the public interest. Papers on ethics and values
are especially encouraged. Reports on work in progress, suggested directions for
future work, surveys and applications will also be considered.

Submissions will be read by the international program committee with the
assistance of outside referees. Complete papers should include an abstract and
should not exceed 6,000 words. Submissions will be judged on clarity, insight,
significance, and originality. Four copies of submitted papers are due by March
1, 1990. Notices of acceptance or rejection will be mailed by April 15.
Camera-ready copy is due by June 1. Send papers to Douglas Schuler, 2202 N. 41st
Street, Seattle, WA 98103. For more information call Doug Schuler at (206)
634-2771.

23


Now available from the CPSR National Office

The award-winning, hour-long video on accidental nuclear war

Losing Control?

Produced by Ideal Communications

Scheduled for national television broadcast on PBS

Now available through CPSR

Admiral Noel Gayler, U.S. Navy (ret.), former commander of U.S. Pacific Forces
and member of the CPSR National Advisory Board: "Losing Control? is the most
powerful and convincing film I have ever seen on the risks inherent in reliance
on nuclear weapons. The most relevant film I can think of for Americans to see
this year, in view of the immense opportunity we now have with the Soviet Union
to stop and reverse the nuclear arms race."

Dr. Thomas Wander, Chairman of the Arms Control Committee of the American
Association for the Advancement of Science: "In terms of a systematic exposition
of the factors that could lead to an inadvertent nuclear war, Losing Control? is
the best treatment I have ever seen in the television medium."

Susan Martin, President, California Teachers' Association, member of the Board of
Directors, National Education Association: "Ingenious and riveting. A must see
for every person concerned with our planet's survival, especially for our
students and their teachers. A real eye-opener for those who thought the arms
race is over. 'Wow, important stuff,' was the almost universal comment of my 120
students who saw it this morning."

Shelby Skates, political reporter, Seattle Post-Intelligencer: "Losing Control?
may be as important a work of journalism as the 1950s Rachel Carson books on the
environment."

Video purchase: $60.00. Rental: $25.00 plus $7 postage and handling. Contact the
CPSR National Office at (415) 322-3778.


CPSR Chapters

December 1989

CPSR/Acadiana, LA

Jim Grant

Center for Advanced Computer Studies

University of Southwestern Louisiana

P. O. Box 44330

Lafayette, LA 70504

(318) 231-5647

CPSR/Austin

Ivan M Milman

4810 Placid Place

Austin, TX 78713

(512) 823-1588 (work)

CPSR/Berkeley

Steve Adams

3026 Shattuck, Apt. C

Berkeley, CA 94705

(415) 845-3540 (home)

CPSR/Boston

Tom Thornton

2 Newland Road

Arlington, MA 02174

(617) 643-7102 (home)

CPSR/Chicago

Don Goldhamer

528 S. Humphrey

Oak Park, IL 60304

(312) 702-7166 (work)

CPSR/Denver-Boulder

Randy Bloomfield

4222 Corriente Place

Boulder, CO 80301

(303) 938-8031 (home)

CPSR/Los Angeles

Rodney Hoffman

845 S. Windsor Blvd., #6

Los Angeles, CA 90005

(213) 932-1913 (home)

CPSR/Madison

Deborah Servi

128 S. Hancock St., #2

Madison, WI 53703

(608) 257-9253 (home)

CPSR/Maine

Betty Van Wyck

Adams Street

Peaks Island, ME 04108

(207) 766-2959 (home)

CPSR/Milwaukee

Sean Samis

6719 W. Moltke St.

Milwaukee, WI 53210

(414) 963-2132 (home)

CPSR/Minnesota

David J. Pogoff

6512 Belmore Lane

Edina, MN 55343-2062

(612) 933-6431 (home)

CPSR/New Haven

Larry Wright

1 Brook Hill Road

Hamden, CT 06514

(203) 248-7664 (home)

CPSR/New York

Michael Merritt

294 McMane Avenue

Berkeley Heights, NJ 07922

(201) 582-5334 (work)

(201) 464-8870 (home)

CPSR/Palo Alto

Clifford Johnson

Polya Hall

Stanford University

Stanford, CA 94305

(415)723-0167 (work)

CPSR/Philadelphia

Lou Paul

314 N. 37th Street

Philadelphia, PA 19104

(215) 898-1592 (work)

CPSR/Pittsburgh

Benjamin Pierce

School of Computer Science

Carnegie Mellon University

Pittsburgh, PA 15213

(412) 268-3062 (work)

(412) 361-3155 (home)

CPSR/Portland

Bob Wilcox

CPSR/Portland

P.O. Box 4332

Portland, OR 97208-4332

(503) 246-1540 (home)

CPSR/San Diego

John Michael McInerny

4053 Tennyson Street

San Diego, CA 92107

(619) 534-1783 (work)

(619) 224-7441 (home)

CPSR/Santa Cruz

Alan Schlenger

419 Rigg Street

Santa Cruz, CA 95060

(408) 425-1305 (home)

CPSR/Seattle

Doug Schuler

CPSR/Seattle

P.O. Box 85481

Seattle, WA 98105

(206) 865-3226 (work)

CPSR/Washington, D.C.

David Girard

2720 Wisconsin Ave., N.W., Apt 201

Washington, D.C. 20007

(202) 967-6220 (home)

CPSR, P.O. Box 717, Palo Alto, CA 94301 (415) 322-3778


CPSR Foreign Contacts

December 1989

CPSR is in regular communication with the following individuals and
organizations concerned with the social implications of computing.

Canada

Ottawa

Initiative for the Peaceful Use of Technology (INPUT)

Box 248, Station B

Ottawa, Ontario K1P 6C4

Toronto

Dr. Calvin Gotlieb

Department of Computer Science

University of Toronto

Toronto, Ontario M5S 1A4

Vancouver

Richard S. Rosenberg

Department of Computer Science

University of British Columbia

6356 Agricultural Road

Vancouver, British Columbia V6T 1W5

Australia

Australians for Social Responsibility in Computing (ASRC)

Sydney

Graham Wrightson

Department of Computer Science

Newcastle University

Newcastle, NSW 2308

New Zealand

Computer People for the Prevention of Nuclear War (CPPNW)

P.O. Box 2

Lincoln College

Canterbury

Finland

Pekka Orponen

Department of Computer Science

University of Helsinki

Tukholmankatu 2

SF 00250 Helsinki

Great Britain

Computing and Social Responsibility (CSR)

Edinburgh

Jane Hesketh

3 Buccleuch Terrace

Edinburgh EH8 9NB, Scotland

Glasgow

Philip Wadler

Department of Computer Science

University of Glasgow

Glasgow G12 8QQ, Scotland

Lancaster

Gordon Blair

University of Lancaster

Department of Computer Science

Bailrigg, Lancaster LA1 4YN

Sussex

Mike Sharples

University of Sussex

School of Cognitive Sciences

Falmer

Brighton, BN1 9QN

West Germany

FIFF, c/o Helga Genrich

Im Spicher Garten #3

5330 Koenigswinter 21

Federal Republic of Germany

Italy

Informatici per la Responsibilita Sociale (IRS-USPID)

Dr Luca Simoncini

Istituto di Elaborazione dell'Informazione CNR

Via Santa Maria 46

I-56100 Pisa

Ivory Coast

Dominique Debois

Centre d'Information et d'Initiative sur l'Informatique (CIII)

08 BP 135 Abidjan 08

Cote d'Ivoire, West Africa

South Africa

Philip Machanick

Computer Science Department

University of Witwatersrand

Johannesburg, 2050 Wits, South Africa

Spain

Dr. Ramon Lopez de Mantaras

Center of Advanced Studies

C.S.I.C.

17300 Blanes, Girona

Thailand

David Leon c/o

The Population Council

P.O. Box 1213

Bangkok 10112

CPSR, P.O. Box 717, Palo Alto, CA 94301 (415) 322-3778


CPSR Educational Materials

The following materials may be ordered from the CPSR National Office. All
orders must be prepaid. Please include your name and address for shipping.

Back issues of the CPSR Newsletter are available for $1 each.

Some issues available in photocopy only.

Articles and papers

_ ANNOTATED BIBLIOGRAPHY ON COMPUTER RELIABILITY AND NUCLEAR WAR. Compiled by
Alan Borning (16 pages - updated October 1984 - $2.00)

_ COMPUTER SYSTEMS RELIABILITY AND NUCLEAR WAR. Alan Borning (20 pages--February
1987 - $2.00)

_ COMPUTER UNRELIABILITY AND NUCLEAR WAR. CPSR/Madison (11 pages - June 1984 -
$2.00)

_ THE RESPONSIBLE USE OF COMPUTERS; WHERE DO WE DRAW THE LINE? Christiane Floyd
(4 pages - June 1985 - $1.00)

_ THE "STAR WARS" DEFENSE WON'T COMPUTE. Jonathan Jacky (reprinted from The
Atlantic, 6 pages - June 1985 - $1.00)

_ THE STAR WARS COMPUTER SYSTEM. Greg Nelson and David Redell (10 pages - June
1985 - $1.00)

_ LIST OF ILLUSTRATIVE RISKS TO THE PUBLIC IN THE USE OF COMPUTER SYSTEMS AND
RELATED TECHNOLOGY. Compiled by Peter G. Neumann (9 pages - August 1987 - $1.00)

_ DEADLY BLOOPERS. Severo M. Ornstein (16 pages - June 1986 - $2.00)

_ LOOSE COUPLING: DOES IT MAKE THE SDI SOFTWARE TRUSTWORTHY? Severo M. Ornstein
(4 pages - October 1986 - $1.00)

_ RELIABILITY AND RESPONSIBILITY. Severo M. Ornstein and Lucy A. Suchman
(reprinted from Abacus, 6 pages - Fall 1985 - $1.00)

_ STRATEGIC COMPUTING: AN ASSESSMENT. Severo M. Ornstein, Brian C. Smith, and
Lucy A. Suchman (4 pages - June 1984 - $1.00)

_ SOFTWARE AND SDI: WHY COMMUNICATION SYSTEMS ARE NOT LIKE SDI. David L. Parnas
(Senate testimony, 2 documents, 7 pages - December 1985 - $1.00)

_ WHY SOFTWARE IS UNRELIABLE. David L. Parnas (8 memoranda, 17 pages - June 1985
- $2.00)

_ PRIVACY IN THE COMPUTER AGE. Ronni Rosenberg (24 pages - October 1986 - $3.00)

_ SELECTED AND ANNOTATED BIBLIOGRAPHY ON COMPUTERS AND PRIVACY. Ronni Rosenberg
(7 pages - September 1986 - $1.00)

_ THE LIMITS OF CORRECTNESS. Brian Cantwell Smith (21 pages - June 1985 - $3.00)

_ ETHICAL QUESTIONS AND MILITARY DOMINANCE IN NEXT GENERATION COMPUTING. Paul
Smolensky (6 pages - October 1984 - $1.00)

_ STRATEGIC COMPUTING RESEARCH AND THE UNIVERSITIES. Terry Winograd (28 pages -
March 1987 - $3.00)

_ THE CONSTITUTION vs. THE ARMS RACE. Clifford Johnson (8 pages - December 1986
- $1.00)

_ THE NATIONAL CRIME INFORMATION CENTER: A CASE STUDY. Mary Karen Dahl (4 pages
- March 1988 - $1.00)

_ "SENSITIVE, " NOT "SECRET": A CASE STUDY. Mary Karen Dahl (4 pages - January
1988 - $1.00)

_ THINKING ABOUT "AUTONOMOUS" WEAPONS. Gary Chapman (4 pages - October 1987 -
$1.00)

Videotapes and Slide Show

Except where noted, CA residents add sales tax

Loan and rental is for one month, except by pre-arrangement

_ Reliability and Risk: Computers and Nuclear War. An award-winning half-hour
documentary on accidental nuclear war, the reliability of computers in critical
settings, and the computer aspects of the SDI. November 1986 [Slide show rental:
$75. Videotape rental: $25. Videotape purchase, Beta or VHS: $35, U-matic: $50.
Shipping and handling: $7.00.]

_ "SDI: Is the Software Feasible?" Seminar sponsored by the Library of Congress
for Congressional staff members. 1 hour, April 1986. Features Danny Cohen (SDIO)
and Dave Redell (CPSR) presenting opposing views. Includes questions from the
audience. [Available as a loan only. Shipping and handling: $7.00, no sales tax]

_ "To Err..." WHA Madison Public Television presentation on computer failure.
Features several members of CPSR/Madison, 15 minutes, May 1985. [Available as a
loan only. Shipping and handling: $7.00, no sales tax]

_ MIT debate on the feasibility of the SDI. Co-Sponsored by the MIT Laboratory
for Computer Science and CPSR/Boston, approx. 2 1/2 hours, October 1985.
Moderator: Mike Dertouzos (head of LCS); pro-SDI: Danny Cohen and Chuck Seitz
(SDIO); con-SDI: David Parnas (University of Victoria) and Joseph Weizenbaum
(MIT). [Rental: $50] Transcript also available (please call).

Books

CA residents please add sales tax

Please add $3 for postage and handling.

_ COMPUTERS IN BATTLE: Will They Work? Edited by David Bellin and Gary Chapman,
Harcourt Brace Jovanovich, 224 pages, 1987. An anthology of perspectives on the
role of computers in the arms race, written and edited by CPSR members.
Available in bookstores or from the National Office. Cost: $14.95.

_ EMPTY PROMISE: THE GROWING CASE AGAINST STAR WARS. Union of Concerned
Scientists, John Tirman, editor, Beacon Press, 230 pages, 1986. Features
chapters by eight authors. Cost: $7.95.

_ THE SACHERTORTE ALGORITHM and Other Antidotes to Computer Anxiety. John Shore,
Viking Press, 256 pp., 1985. Cost: $7.95.

CPSR, P.O. Box 717, Palo Alto, CA 94301 (415) 322-3778
