Privacy in the Computer Age
Part III: Legal Issues and Conclusions
Ronni Rosenberg, CPSR/Boston
This is the final installment of this three-part article.
However basic the right to privacy may seem, rights exist only as defined by the
body of laws in each country. . .1
A good starting point for U.S. legal history related to privacy is 1890, when
Samuel Warren and Supreme Court Justice Louis Brandeis published "The Right to
Privacy."2 The article represents the first official call for the legal
profession to pay attention to privacy, and it sets forth the idea that the
individual has the right to decide what personal information crosses the
boundary from private to public. It is reputed to have been written primarily by
Samuel Warren, who was motivated by his anger over the newspaper publication of
what he considered personal details about his wife's entertaining.
In the early 20th century, torts cases dealing with privacy arose as a response
to the taking and printing of unauthorized photographs, made possible by
technological advances in instant photography and halftone printing. The courts
ruled that such photographs were an invasion of privacy, because a reasonable
person did not expect to have his or her picture taken when going outside.
Throughout this century and until recently, legal precedents related to privacy
have been based on four recognized categories of injuries, listed below along
with some problems in applying these precedents to computerized information:
1. Appropriation of a person's name or likeness for commercial advantage.
Problems: Computerized information is not simply a person's name, and it is not
clear when a computer file turns into a likeness.
2. Intrusion into private life; i.e., intrusion upon a person's seclusion or
solitude or into a person's personal affairs.
Problem: The collection of computerized information is not a physical intrusion.
3. Public disclosure of embarrassing private facts about a person, even if they
are true.
Problem: How do you define "private facts" in a centralized computer system,
since private facts, when mingled with public facts, may lose their private
character?
4. Publicity that places a person in a false light in the public eye.
Problem: To win a case using this precedent, one typically must demonstrate
financial loss, not just loss of reputation, but the latter alone can result
from the dissemination of computer files.
These precedents do not address some problems that are of special concern in the
case of computer files. For example, they do not consider individual control of
the context in which information is used, yet such control is necessary because
computers have encouraged much greater sharing of information than was practical
with manual files.
These and other shortcomings led to a growing feeling in legal and political
circles that "existing precedents . . . are simply too far removed from the
kinds of damage which might arise through the use of computers."3 This feeling
in turn led to the initiation, in the 1960s, of congressional hearings on
privacy and computers, spearheaded by Senator Sam Ervin, chairman of the Senate
Constitutional Rights Subcommittee and the Senate Government Operations
Committee. Among the regulations proposed during the course of these long-term
hearings are the following:
• Establish a register of all data banks, and open it to the public.
• Make it a criminal offense to avoid registering a data bank in this register.
• Register data bank security procedures.
• Store only facts, not opinions.
• In publicly accessible data banks, store only data relevant to the purpose at
hand.
• Automatically log all interrogations of data banks concerning the public.
• Give the public the right to inspect computer records about themselves.
• Give individuals the right to take issue with personal data about themselves
(e.g., by adding to their record when they believe that to do so will give a
more accurate picture).
• Make negligent data bank operators liable for damages.
• Remove aged data, either by deleting it completely or by archiving it and then
restricting access to the archived data.
By the early 1970s, it was clear that hearings were not sufficient, and that
privacy legislation had fallen seriously behind technological innovations:
The law . . . from time to time must be restructured as values, technology, and
social needs change. The technology of gathering, storing, and processing
information has changed sufficiently to make it necessary to redress some
balance in favor of privacy.4
Soon thereafter, privacy statutes specifically related to computers began to be
enacted, most notably the Fair Credit Reporting Act of 1970 and the Privacy Act
of 1974, both discussed separately below. These acts incorporated some, but not
all, of the proposed regulations listed above.
Even as laws were enacted to protect individual privacy by more closely
regulating the use of personal information in computers, some court decisions
were reflecting changing expectations about what degree of privacy was
reasonable. For example, the courts now hold that a reasonable person can expect
to have his or her picture taken when going outside. In 1976, the case of Paul
v. Davis5 went before the U.S. Supreme Court. Davis had been arrested 18 months
earlier for shoplifting. While charges were still pending, a police chief
circulated a flyer to local merchants, with the names and photos of active
shoplifters, including Davis. Davis claimed that his constitutional rights to
due process, privacy, and liberty were violated. The Court disagreed, and it
ruled that because the criminal records were official, they did not have to be
kept confidential by criminal justice agencies. Subsequent legal decisions have
held that people arrested but not convicted or even tried do not have the right
to prohibit dissemination of their criminal history records for employment
purposes (typically without the knowledge of the individual). One researcher
concludes:
Currently, therefore, there are no constitutional protections for individuals
vis-a-vis arrest and/or conviction records, no matter how capricious,
erroneous, arbitrary, or illegal the original arrest or subsequent dissemination
of the record.6
Privacy Act of 1974
The Privacy Act7 is the major law regulating the dissemination of federal
computer files about individuals. It applies to federal agencies and the private
organizations with which they do business. It requires these agencies to:
• Notify an individual on request if there is personal information about him or
her in the agency's records.
• Permit an individual to examine and copy most of those records.
• Permit an individual to dispute the contents of the record and place a
statement of the dispute in the file, subject to specified procedures.
• Keep records of accesses and disclosures of record information.
• Refrain from disclosing information without permission from the individual,
with certain exceptions.
• Publish notice of the existence of their files.
The Privacy Act goes further than any previous (or subsequent) law in regulating
computer file use, but it has several serious shortcomings, discussed below.
There is a serious "routine use" loophole, through which matching programs have
flowed. The Privacy Act specifies the following conditions of disclosure of an
individual's record:
No agency shall disclose any record which is contained in a system of records by
any means of communication . . . unless disclosure of such record would be for a
routine use. . . . The term routine use means . . . the use of such record for a
purpose which is compatible with the purpose for which it was collected.
In practice, the definition of "routine purposes" has been interpreted extremely
broadly. In 1979, the Office of Personnel Management was asked to release their
records about federal employees to other federal agencies. The data would be
matched with information collected about welfare recipients to identify federal
employees who were getting undeserved welfare benefits. Although this is clearly
not the purpose for which the data was collected, the Office of Personnel
Management decided that the release of their records fell within legal
guidelines. They explained their decision as follows: "An integral part of the
reason the records are maintained is to protect the legitimate interests of
government. Therefore, such disclosure is compatible with the purpose of
maintaining these records."8
With no monitoring agency to control such decisions, this early decision to
match computer data in different systems went unchallenged, leading directly to
the unchecked growth of matching and other nonroutine-use programs throughout
the federal government:
The history of computerized data systems over the last decade shows one clear
trend: they have always been adapted to purposes other than their originally
intended use. There are dozens of examples of new uses for once-restricted
personal data systems. Social Security Administration files are now used
routinely to identify illegal aliens. The federal Parent Locator Service allows
child support enforcement officials to search virtually all government and
private record systems in order to trace absent parents who owe child support.
Numerous state laws allow or actually require public and private employers to
use criminal history databanks, compiled originally for police use, in order to
screen out applicants convicted of certain crimes, or simply to ascertain if
applicants have arrest records.
The records of hundreds of federal and state public assistance programs have
been matched against each other and against public and private employment rolls,
to identify people receiving multiple benefits for which they are ineligible
because of their earnings.9
Such programs now number in the hundreds, contributing significantly to
individual loss of control over use of data files. The "routine use" loophole is
the most serious problem of the Privacy Act.
There is no mechanism for monitoring compliance with the law. As originally
proposed, the Privacy Act included a provision for the creation of a permanent
independent commission to monitor and regulate the development of significant
new computerized information systems. When the Act was enacted, this provision
was dropped. As a result, adherence to the requirements of the law is not
monitored:
Because enforcement of the Privacy Act is left almost entirely to the federal
agencies themselves, it is hardly surprising that they have bent the Act to
their own purposes and have now miraculously established that any computer-
matching is a "routine use" of personal records. All that is required to satisfy
the policy of the Act, the agencies say, is to publish each new computer-
matching "routine use" in the Federal Register. Thus, the safeguards of the
Privacy Act have been effectively eroded.10
The regulations apply only to federal agencies. As originally drafted, the Act
applied not only to federal agencies, but also to state and local governments,
which ultimately were exempted from the Act's requirements. Also exempted were
federal law enforcement and intelligence agencies, who argued persuasively that
some information in their files (including information provided by informants)
would lose its effectiveness if revealed to individuals, thus rendering the
organization that gathered the information useless.
Police, military, and internal security computers are exempted from most of the
provisions. . . . The police argue that this is essential. If you give a suspect
access to the police files on him, he may take evasive action and you will never
catch him. Many feel that the rising crime rate is more serious than the issue
of privacy.11
Criminal history records were exempted partly because there was an expectation
that special legislation to control them would be forthcoming. Although numerous
bills have been proposed, such legislation was never enacted. As an interim
measure, Congress ordered the Department of Justice to issue regulations
applying the principles of the Privacy Act to federal criminal history records.
Issued in 1975, these regulations give people certain rights to review their
criminal history records, specify certain requirements for the completeness and
accuracy of such records, and set limits on their dissemination. However, state
and local agencies can still disseminate their criminal history records to
anyone, by local executive order.12
Fair Credit Reporting Act of 1970
The Fair Credit Reporting Act13 is targeted not at government agencies but at
private organizations, specifically those that collect and distribute
information about the credit-worthiness of individuals. A summary of the Act's
provisions follows:
"Compilers of credit and 'investigative' reports must:
• Eliminate from their reports bankruptcies after 14 years and other adverse
information after 7 years.
• Keep record entries on employment up to date; confirm adverse interview
information 3 months before reporting it.
• Notify subject that report is being made; whenever employment or credit is
denied on basis of report, subject must be advised of reporting agency that
supplied the report.
• On request of subject, agency must disclose 'nature and substance' of material
in file (but not file itself); must disclose sources of data; must reinvestigate
item at subject's request; if agency does not correct item, must include
subject's statement on it.
• Maintain 'reasonable procedures' to grant reports only to those with
'reasonable interest.'
• Agency must not, without written consent of subject, furnish to government
agency more than name, address, and place of employment of subject except when
government has 'legitimate business need.'"14
Like the Privacy Act, the Fair Credit Reporting Act provides significantly more
protection than existed before it was enacted, but it contains significant
loopholes, described below.
The subject has no way of finding out where credit records exist, unless credit
is denied. The Act specifies no register for credit data banks, and it does not
require that the subject be notified when a new credit file is created. If
someone is denied credit, he or she will learn the name and address of the one
credit agency whose records were used in making that credit decision. Note that
this will occur only if the organization that decided to deny the credit
application lists a poor credit report as one of its reasons. "There is some
evidence that credit-granting organizations sometimes merely state that credit
has been denied for reasons other than an unfavorable report, in which case they
don't have to give the actual reason or even indicate if a credit report was
received."15 Moreover, there is no reliable mechanism for an individual to learn
about all the credit agencies that maintain records about him or her.
The actual credit report need never be provided. The subject has no right to
obtain a copy of the actual credit record or even to examine it at the credit
agency. Thus, the subject must rely on the agency to provide a fair, accurate
summary of the relevant information in the record.
If a credit agency determines that there is incorrect information in a file,
there is no assurance that everyone who received that incorrect information will
be notified of the error. If a decision to deny credit is based on inaccurate or
incomplete information from an agency's records, the correction of the record
may occur too late. For example, suppose someone applies for and is refused a
mortgage. Even if the credit agency corrects the record, the correction is
likely to be made too late for the transaction that the individual wanted to
execute. In such a case, the credit agency is in no way liable for damages.
The Act does not do much in the way of limiting who can get access to credit
files. Allowing reports to be granted to anyone with "reasonable interest" and
to any government agency with "legitimate business interest" is as big a
loophole as the "routine use" clause in the Privacy Act.
There are no limits on how long criminal information can be retained in a credit
file.
Agency-Specific Privacy Regulations
The widely applicable regulations of the Privacy Act and the Fair Credit
Reporting Act are supplemented by numerous agency-specific regulations. Among
the most familiar of such regulations are those that seek to ensure the
confidentiality of census information. The Census Bureau currently operates
under an Act passed in 1929, which provides:
That the information furnished under the provisions of the Act shall be used
only for the statistical purposes for which it is supplied. No publication shall
be made by the Census Office whereby the data furnished by any particular
establishment or individual can be identified, nor shall the Director of the
Census permit anyone other than the sworn employees of the Census Office to
examine the individual reports.... In no case shall information furnished under
the authority of this act be used to the detriment of the person or persons to
whom such information relates.16
These are the most strongly worded privacy regulations examined thus far.
Nevertheless, during World War I, the Census Bureau provided law enforcement
officials with the names and addresses of individuals, for use by the Justice
Department in tracking down draft dodgers. During World War II, the Census
Bureau provided the army with aggregate information about the residence of
Japanese-Americans, for use in ensuring that all Japanese-Americans reported for
evacuation. No monitoring agency was around to detect these violations.17
Effects of Privacy Laws
Regulations designed to strengthen personal privacy have had limited success,
for several reasons. First, there is a tension between privacy laws and other
laws designed to ensure government accountability by allowing public
scrutiny of government records. Foremost among such public disclosure laws is
the Freedom of Information Act:
Another law affecting this problem is the Freedom of Information Act, which was
intended to bring the activities of government more into the public view.
Basically, this Act provides that all government records are public records
obtainable by any citizen unless there is specific legislation providing for
confidentiality. However, since most government business also deals with
individuals, the Freedom of Information Act poses a potentially serious threat
to individual privacy. There have already been occasions where agencies have
been required to release personal data under this Act...18
In the case of the Freedom of Information Act, it is generally believed that
this potential threat to privacy is outweighed by the benefits of enhanced
public review of government activities.
Second, as discussed above, existing privacy laws are too vague. Typically,
exceptions to the regulations can be made whenever a "reasonable" justification
can be made. In practice, reasonable justifications can be made for collecting
or distributing just about any information about people.
Third, privacy laws are sometimes technically naive. For example, there is a
large gap in privacy regulations in the area of statistical data banks,
typically intended for social science and other forms of research. It has been
argued that wholly statistical systems do not permit any retrieval of
information about identifiable individuals, hence do not pose any threat to
personal privacy. Upon further examination, however, it becomes clear that such
systems often do contain identifying information, for instance, to permit data
from a survey to be associated with the relevant information from earlier
studies.19 "There is no legal privilege for these data banks at the moment; they
are not protected by any traditional lawyer-client or doctor-patient
relationship; they are unprotected islands of very attractive data which can be
reached by the courts, by investigating committees, and by others."20
Fourth, the laws do not keep pace with technological advancements. As one widely
published computer writer observes, "System designers are not waiting until laws
exist. The technology is changing so much faster than the development of social
mechanisms to cope with it. On the other hand, there is a danger that unduly
restrictive laws might be passed that would prevent some of the social
benefits."21
Given these shortcomings and the lack of active enforcement of the laws, it is
not surprising that the regulations have fallen far short of achieving their
objectives. One study of the use of computerized individual records concluded:
"Regulations and statutes, contrary to common perceptions, have had little or no
impact on the level of employment use of records."22
Alan Westin, certainly one of the most prominent writers on computers and
privacy, describes the failure of the American legal system to regulate computer
data banks satisfactorily:
. . . the techniques . . . have outstepped classic legal and social
constraints. . . . First, American law has no clear-cut definition of personal
information as a precious commodity. . . . When information is not needed to
make a profit, when it involves the flow of disclosure about the individual
among those he comes in contact with and those who exercise authority over him,
American law has had no general theory of value, no set of rights and duties to
apply as a general norm. Second, American law has had no general system for
dealing with the flow of information which government agencies and other levels
of government control, apart from a few examples such as census data. . . .
Third, American law has not developed institutional procedures to protect
against improper collection of information, storage of inadequate or false
data, and intra-organizational use of such information. . . . Without the
opportunity to know what was in the record, to cross-examine those who had
given the information, and to challenge the evaluation put on the information
by government security officials, individuals were left without effective
protection in their personal reputations and job rights. Finally, American law
is seriously challenged by some of the technological aspects of computer
information systems which tend to work against the kinds of reasonableness
standards that the law tries to apply.23
. . . the problem of privacy is more a question of public policy and social
conscience than a technical question involving the computer.24
The conflict between computers and privacy is a continuation of an established
conflict between the legitimate needs of organizations for personal data and the
rights of individuals to control such data. Data banks of personal information
have been around for a long time, in the form of court records, birth and death
records, property records, and so on, and they have been publicly maintained and
available to anyone interested. However, the files were separated, and it was
expensive to bring them together. The dispersed form of the data was a
protection, for it worked to control circulation of the data. Before computers,
the combination of controlled circulation, compartmentalized information, and
limited data collection served to limit infringements on personal privacy:
. . . the prime protection in this area has remained the inability of government
agencies and private authorities to use the mountains of information they had
secured in anything like a centralized and efficient fashion.25
The ability of computers to collect, organize, share, and combine much more data
than ever before, less expensively than before, has exacerbated existing
problems and created some new ones (e.g., problems stemming from widespread
matching programs, which are not practical without computers).
At the same time, computer technology provides the potential for greater
privacy. Computer systems can keep track of the distribution of data by being
programmed to keep detailed audit trails, which can be made available for public
scrutiny to detect violations. Computerized security mechanisms can provide
greater assurance that data is not accessed by unauthorized individuals or
agencies. Computer programs written as part of data bank systems can easily
discard data defined as outdated, and they can flag files that are incomplete.
To date, the potential for such privacy-enhancing applications has not been
tapped, while the development of privacy-reducing applications continues.
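The privacy-enhancing mechanisms described in the preceding paragraph (audit trails, removal of aged data, flagging of incomplete files) can be illustrated in outline. The following Python sketch is purely hypothetical; every class and method name is invented for this example, and a real data bank would of course also need access control, persistent storage, and publicly reviewable logs.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of a data bank that implements three of the
# privacy-enhancing mechanisms discussed in the text. All names here
# are invented for illustration.
class AuditedDataBank:
    """A personal-records store that logs every access, purges aged
    records, and flags incomplete files."""

    def __init__(self, retention_days=7 * 365):
        self.records = {}       # subject -> {"data": fields, "stored": timestamp}
        self.audit_trail = []   # (timestamp, action, subject, requester)
        self.retention = timedelta(days=retention_days)

    def _log(self, action, subject, requester):
        # Detailed audit trail: every interrogation is recorded and
        # could be made available for public scrutiny.
        self.audit_trail.append((datetime.now(), action, subject, requester))

    def store(self, subject, fields, requester):
        self.records[subject] = {"data": fields, "stored": datetime.now()}
        self._log("store", subject, requester)

    def read(self, subject, requester):
        self._log("read", subject, requester)
        return self.records.get(subject)

    def purge_aged(self):
        """Discard records older than the retention period, one of the
        regulations proposed in the congressional hearings."""
        cutoff = datetime.now() - self.retention
        for subject in [s for s, r in self.records.items()
                        if r["stored"] < cutoff]:
            self._log("purge", subject, "system")
            del self.records[subject]

    def incomplete(self, required_fields):
        """Flag files that are missing any required field."""
        return [s for s, r in self.records.items()
                if not all(f in r["data"] for f in required_fields)]
```

Nothing in this sketch is technically difficult, which underscores the article's point: the obstacle to such privacy-enhancing applications has been institutional will, not technology.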
Even if more attention were paid to implementing technical means of enhancing
privacy, those means alone cannot do the job. Legislation like that described
above provides important protections, but it could be strengthened by the
addition of some key mechanisms, including:
• Registration and public notice of all personal data banks, so an individual
can easily know where files on him or her are maintained.
• Establishment of watchdog or ombudsman-type agencies, public review boards,
and industry-wide committees, to ensure adequate audit and review of adherence
to privacy regulations.
• Allowance of organizational liability for damages to individuals caused by
inaccurate or incomplete data.
In addition, greater public education about and participation in the development
of new computer data banks could significantly affect the course of the
"computers and privacy issue" in the future:
It is true that on at least three occasions during the last twenty years,
sufficient political opposition developed within Congress and other centers of
power to stop or delay the development of a major new computer system that a
powerful arm of the government had decided was needed. The Internal Revenue
Service was blocked in its plan to build a Tax Administration System. The FBI
was blocked in its plan to build a central computerized database containing
information about most of the arrests made by local and state police. The Great
Society planners of President Lyndon Johnson were forced to abandon their dream
of developing a national data system that would centralize all the information
collected about each individual citizen within a single computer.26
Although computer technology, further regulation, and greater public involvement
provide the promise of greater personal privacy, the current situation with
regard to data bank development and its legal regulation appears to favor the
organizations and agencies that collect, maintain, and use the data, often for
profit. It is the collective opinion of the writers whose works on computers and
privacy I reviewed, that this situation is fundamentally unfair:
The burden should be upon the person maintaining the data bank to show that an
individual should not have the right to examine his file, or, if he does have
that right and alleges erroneous entries, that the information is not wrong.
There is no question that this will add to the cost of maintaining the data
bank, but for a person to enforce his right, either inspection or correction,
will cost him time and money, too. It is inequitable to require him to bear the
burden of proof when the entity maintaining the data bank is the one receiving
the prime benefit therefrom.27
Strong safeguards of individual privacy are not only fair, they are essential to
maintaining individual freedom:
If one reads Orwell . . . carefully, one realizes that "1984" is a state of
mind. In the past, dictatorships always have come with hobnailed boots and tanks
and machine guns, but a dictatorship of dossiers, a dictatorship of data banks
can be just as repressive, just as chilling and just as debilitating [to] our
constitutional protections.28
1. C. C. Gotlieb and A. Borodin, Social Issues in Computing, New York: Academic
Press, 1973, p. 76.
2. Samuel D. Warren and Louis D. Brandeis, "The Right to Privacy," reprinted in
Adam Carlyle Breckenridge, ed., The Right to Privacy, Lincoln, Nebraska:
University of Nebraska Press, 1970. Originally published in the Harvard Law
Review, 4, 15 December 1890.
3. Gotlieb and Borodin, p. 77.
4. Ibid., p. 80.
5. 424 U.S. 693 (1976).
6. Kenneth C. Laudon, "Employment Use of Criminal History Records and the
Growth of General Purpose National Information Systems," pre-publication draft,
June 1983, p. 16.
7. Privacy Act of 1974, Public Law 93-579, 93rd Congress, S. 3418, 31 December
1974.
8. David Burnham, The Rise of the Computer State, New York: Random House, 1983.
9. John Shattuck, "In the Shadow of 1984: National Identification Systems,
Computer-Matching, and Privacy in the United States," Hastings Law Journal, 35,
6, July 1984, p. 1000.
10. Ibid., pp. 1003-1004.
11. James Martin, Security, Accuracy, and Privacy in Computer Systems, New
York: Prentice-Hall, Inc., 1973, p. 446.
12. Laudon, "Employment Use of Criminal History Records and the Growth of
General Purpose National Information Systems," pp. 17-18.
13. Fair Credit Reporting Act, Public Law 91-508, 91st Congress, 1970.
14. Martin, p. 437.
15. Robert C. Goldstein, The Cost of Privacy, U.S.A.: Honeywell Information
Systems, Inc., 1975, p. 13.
16. Act of Congress, Chapter 28, Section 11, 46 Stat. 25, 18 June 1929.
18. Goldstein, p. 13.
19. Ibid., p. 10.
20. Alan F. Westin, "Computers and the Protection of Privacy," Technology
Review, 71, 6, April 1969, p. 36.
21. Martin, p. 436.
22. Laudon, "Employment Use of Criminal History Records and the Growth of
General Purpose National Information Systems," p. 12.
23. Alan F. Westin, ed., Information Technology in a Democracy, Cambridge, MA:
Harvard University Press, 1971, pp. 304-6.
24. Robert P. Bigelow and Susan H. Nycum, Your Computer and the Law, New
Jersey: Prentice-Hall, 1975, p. 139.
25. Westin, Information Technology in a Democracy, p. 303.
26. Burnham, The Rise of the Computer State, p. 219.
27. Bigelow and Nycum, p. 147.
28. Shattuck, pp. 1004-1005, quoting from S. Rep. No. 1183, 93d Cong., 2d
Sess. 7.
This article is available as a complete paper, including references, from the
CPSR National Office, P.O. Box 717, Palo Alto, CA 94301. Please send $3.00 to
cover postage and handling.
National Security and Electronic Databases
Mary Karen Dahl
National Program Associate
The question is not, will there be restrictions or controls on the use of
commercially available on-line databases; the question is how will such
restrictions or controls be applied.
Diane Fountaine of the Office of the Assistant Secretary of Defense for
Communications, Command, Control and Intelligence, November 11, 1986
Ms. Fountaine's pronouncement is the most recent articulation of a Reagan
administration policy to restrict access to an ever-increasing body of
information considered vital to national security. As set forth in National
Security Decision Directive 145, this policy raises significant questions about
the government's attempt to use high technology as an instrument of foreign
policy.
security concerns, why not regulate software and knowledge about technology?
Alternatively, might we not rigorously classify technical data and so prevent
its entry into commercially available databases? Both tactics are under
consideration, according to Deputy Undersecretary of Defense for Trade Security
Policy Stephen D. Bryen. Monitoring the use of databases is also an option.
Donald C. Latham, Assistant Secretary of Defense for Communications, Command,
Control and Intelligence, told Washington Post reporter Michael Schrage: "I'm
very concerned about what people are doing, and not just the Soviets. . . . If
that means putting a monitor on Nexis-type systems, then I'm for it. The
question is, how do you do that technically without unnecessary interference?"
As evidence that the government means business, Mead Data Central (providers of
the Nexis database), Lockheed Corporation (operators of Dialog), and others have
been visited by representatives of national defense and security agencies. Says
Mead Data's President Jack Simpson, "Until you have received cordial visits by
representatives of the FBI, the CIA, and the Department of Defense, you can't
appreciate the true extent of this issue." Government visitors have asked what
database providers know about their subscribers and what measures could be taken
to safeguard data against undesirable access. In several cases, the Department
of Energy has requested subscriber lists. According to the San Francisco
Examiner's John Markoff, officials at Dialog report that Energy tried to make
continued access to databases run by the National Technical Information Service
(NTIS) contingent on their releasing names of Dialog's subscribers to that
database. Ironically, protection against disclosure of such lists has been
provided by the Electronic Communications Privacy Act, which was signed by
President Reagan only a week before a memo detailing government information
policy was issued by then National Security Adviser, Vice Admiral John M.
Poindexter.
[The new privacy act governs access by government entities to records concerning
electronic communications services or remote computing services. Access without
prior notice to the subscriber or customer affected is permitted only if a court
order is obtained. Such court orders may be issued only "if the government
entity shows that there is reason to believe . . . the records . . . are
relevant to a legitimate law enforcement inquiry."]
The Poindexter memo, dated October 29, 1986, has its basis in "National Policy
on Telecommunications and Automated Information Systems Security," directive
NSDD-145. NSDD-145 established a task force called the Systems Security Steering
Group (SSSG) and, operating under the SSSG's direction, the National
Telecommunications and Information Systems Security Committee (NTISSC). Together
these groups include representatives from the Departments of Defense, Energy,
Commerce, and Treasury, the Joint Chiefs of Staff, the Army, Navy, Air Force,
and Marines, the FBI, the CIA, the Defense Intelligence Agency, and the National
Security Agency.
The SSSG is to "review and evaluate the security status of those
telecommunications and automated information systems that handle classified or
sensitive government or government-derived information" and to "identify
categories of sensitive non-government information, the loss of which could
adversely affect the national security interest, and recommend steps to protect
such information." The NTISSC is further charged with "assisting the private
sector in applying security measures." In this context, Donald Latham's
readiness to monitor Nexis-type systems takes on added significance: he chairs
the NTISSC. The NTISSC is also to "establish and maintain a national system for
promulgating the operating policies, directives, and guidance which may be
issued pursuant to this Directive." On the basis of this directive, the DoD,
CIA, and FBI have undertaken meetings with commercial database providers.
According to NSDD-145, the director of the National Security Agency acts as
national manager to implement policy directives. A partial list of the authority
with which the NSA is charged explicitly includes: to "examine government
telecommunications systems and automated information systems and evaluate their
vulnerability to hostile interception and exploitation"; to "conduct, approve,
or endorse research and development of techniques and equipment"; and to "review
and approve all standards, techniques, systems, and equipments for
telecommunications and automated information systems security."
The Poindexter Memo
It was in his role as SSSG chairman that John Poindexter produced the October
memo, which directs all government agency and department heads to evaluate
information according to its sensitivity. They are then to provide systems
protection as appropriate for that which is "electronically communicated,
transferred, processed, or stored on telecommunications and automated
information systems."
In essence, the October memo describes a stratum of "unclassified" classified
data below those categories already in use (e.g., top secret, secret, and
confidential).
The new category, "sensitive, but unclassified information," is defined as
information the "disclosure, loss, misuse, alteration, or destruction of which could
adversely affect national security or other Federal Government interests." The
administration's tendency to conceive of national security in very broad terms
is evident: "National security interests are those unclassified matters that
relate to the national defense or the foreign relations of the U.S. Government."
And broader still: "government interests are those related, but not limited to,
the wide range of government or government-derived economic, human, financial,
industrial, agricultural, technological, and law enforcement information, as
well as the privacy or confidentiality of personal or commercial proprietary
information provided to the U.S. Government by its citizens."
The application of these guidelines to commercial databases is as yet unclear. A
"sanitized" version of a 200-page Air Force report on the issue is expected to be
released.
The current government initiative is not without precedent. The 1976 release of
the Bucy Report ("An Analysis of Export Control of U.S. Technology -- A DoD
Perspective") marks the beginning of Pentagon efforts to impede Soviet
acquisition of knowledge about technology rather than the technology itself,
according to Jerry J. Berman, Chief Legislative Counsel for the American Civil
Liberties Union. Berman cites several examples: in 1977, private development of
a cryptology device prompted the National Security Agency to invoke the
Invention Secrecy Act of 1951; in 1979, the Atomic Energy Act was used to enjoin
The Progressive magazine's publication of an article on thermonuclear weapons
drawn entirely from unclassified sources; since 1980, the Arms Export Act of
1976 and the Export Administration Act of 1979 have been used to keep scientists
from giving papers or holding conferences open to Soviet nationals.
What is new is the focus on information in electronic format. Protecting
electronic databases, the argument goes, is necessary because the electronic
format makes possible rapid and extensive searches of large bodies of data that
would otherwise require time-consuming and expensive library research.
Additionally, as attorney Donald Wealdon, an expert in export control law, told
the Washington Post, sophisticated software makes it possible to search and
organize data so as to "create information that could rise to the stature of
proprietary or 'controlled' information." Thus the very same information in
print might well remain unrestricted, although reports in Business Week magazine
(December 1, 1986) indicate that business leaders are concerned that controls on
one form of information may well foreshadow controls on others.
The issues are complex, and the solutions unclear. On the one hand, Soviet
scientists freely admit to using commercial U.S. databases. The U.S. government,
in turn, may elect to "keep sensitive government information out of the public
databases or limit its availability to U.S. and Allied defense contractors." On
the other hand, even the authors of the September 1985 DoD report titled "Soviet
Acquisition of Militarily Significant Western Technology: An Update," in which
this proposition appears, acknowledge that "unfortunately, this may also inhibit
the United States' own research effort by restricting the ready availability of
such information." However, databases run by the Department of Energy and the
National Aeronautics & Space Administration are already restricted. The Energy
Department forbids commercial customers to pass on certain information to
foreign subscribers; NASA refuses subscriptions to unclassified technical
information to a secret list of companies that have foreign customers.
Critical questions are raised by such policies. Should scientific and technical
discourse have less protection under the First Amendment than other forms of
speech? Or, as Representative Jack Brooks (D-Texas) has asked: Does the
administration have legal authority to set policy for telecommunications and
computer systems without full public hearings and congressional review?
Moreover, should a task force dominated by the DoD oversee computer security for
civilian agencies -- and for civilian enterprises? The ACLU's Jerry Berman sums
up: "With NSA in the lead, we are on the verge of militarizing our information
systems. Secure systems. Authentication mechanisms. Encryption. Audit trails.
And -- full circle -- limited access to heretofore openly accessed government and
private database systems." The process has already begun.
Berman warns that the United States is "moving more and more toward the
militarization of the flow of scientific and technical information and unless it
is stopped, and I think it can only be stopped politically and not through the
courts, I think our free society is in serious jeopardy."
A study packet and short bibliography are available from the National Office of
CPSR for a fee of $6.00 to cover postage and handling. Those interested in
studying these issues more extensively should contact Mary Karen Dahl at (415)
Programming and the Pentagon:
The Effect of Cost-Plus Contracting on Software Engineering
Eric Roberts -- CPSR/Palo Alto
Almost anything in software can be implemented, sold, and even used given enough
determination. There is nothing a mere scientist can say that will stand against
the flood of a hundred million dollars. But there is one quality that cannot be
purchased in this way -- and that is reliability. The price of reliability is the
pursuit of the utmost simplicity. It is a price which the very rich find most
hard to pay.
C. A. R. Hoare
1980 Turing Award Lecture
In recent years, the modern military establishment has become increasingly
dependent on computers, which are now embedded in almost every major defense
system. To a large extent, the reliability and effectiveness of these military
systems are directly determined by the quality of the underlying software. This
article examines the effect of the "blank-check" contracting policy
traditionally employed by the Pentagon on the quality of that software. Cost-
plus contracting (which allows a contractor to recover all costs incurred on a
project plus an additional fee retained as profit) predictably leads to higher
costs. Given the current state of software engineering, however, cost-plus
financing can also lead to a reduction in overall quality. This conclusion is
based on two observations about the nature of the programming process:
(1) It is difficult for most programming projects to make effective use of
additional programmers. Unfortunately, cost-plus procurement increases the
tendency towards overstaffing. Most software engineering projects involve
considerable interdependence between the project components and require
effective communication within the engineering staff. In projects with these
characteristics, a larger project staff may actually reduce the overall quality
of the finished product.
(2) There is an enormous range in programming productivity, even among
experienced programmers. Since it is difficult to predict such performance
accurately, companies have a strong incentive to hire many programmers in the
hope of finding a few excellent ones. To recover the cost of the less competent
programmers acquired in the process, it is advantageous to assign them to cost-plus projects,
reserving the better programmers for competitive projects in the private sector.
Poor programmers not only work more slowly, but also tend to produce programs
which are less reliable and less efficient.
Defense Spending: Price vs. Performance
Since the beginning of the Reagan presidency, the military budget has increased
dramatically. Against the backdrop of ever-growing Federal deficits, this
increase has led to a greater concern about the cost of that buildup. Spare
parts procurement provides particularly vivid examples of runaway costs: the
military outfits the C-5A transport with a $7,600 coffee maker and shells out
$9,606 for an Allen wrench that can be purchased for 12¢ at the local hardware
store.
Unfortunately, the spare parts scandal diverts attention from more general
problems in the military procurement system. The problem is in fact much larger,
as illustrated by the following quotation from the April 1984 issue of The
Defense Monitor published by the Center for Defense Information:
The horror stories are often explained away by some Pentagon officials as
aberrations -- unique events in a generally sound system. This is not the
conclusion of the Pentagon Inspector General who is charged with rooting out
these practices. "I think these are not just random mistakes that happened
because some accounting system went haywire," he told Congress. "Overpricing is
a series of problems of a systematic nature [because] many people in the
procurement business were not sufficiently price conscious." In other words, they
didn't care what it cost.
At one level, however, this should not be surprising. In the minds of many
people connected with the defense industry, cost is indeed an irrelevant issue.
If more money can bring a stronger defense for the nation, any cost is seen as
justified. The important metric is performance. This attitude was expressed
succinctly by the chairman of the House Armed Services Committee, Congressman F.
Edward Hebert (D-Louisiana), in 1972: "I intend to build the strongest military
we can get. Money's no question."
The primary defense contractors, on the other hand, do consider money as part of
the equation. Like any other large firm, defense contractors seek to maximize
profit, but do so under a vastly different set of rules from those used by
companies operating in the private sector. Under a cost-plus arrangement, the
normal relationship between the price-to-cost differential and profit margin is
broken. As Seymour Melman notes in The Permanent War Economy, "If costs go up,
so too can prices, and thereby profits. This, in a nutshell, is the logic of
cost-plus."
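Melman's point can be put in simple arithmetic. The sketch below (the fee rate and dollar figures are illustrative assumptions, not data from any actual contract) contrasts a fixed-price contract, where a cost overrun comes out of the contractor's pocket, with a percentage-of-cost fee of the kind this logic describes, where the same overrun increases profit:

```python
# Illustrative comparison of fixed-price and cost-plus contracting.
# All numbers are hypothetical.

def fixed_price_profit(price, cost):
    """Profit when the price is agreed in advance:
    overruns come straight out of the contractor's pocket."""
    return price - cost

def cost_plus_profit(cost, fee_rate=0.10):
    """Profit when the fee is a percentage of incurred cost:
    the more is spent, the larger the fee."""
    return cost * fee_rate

# On budget: both contractors earn 10 units on a 100-unit job.
print(fixed_price_profit(110, 100), cost_plus_profit(100))

# A 50% overrun: the fixed-price contractor loses 40, while the
# cost-plus contractor's fee rises to 15.
print(fixed_price_profit(110, 150), cost_plus_profit(150))
```

In practice cost-plus fees take several forms (fixed fee, incentive fee), but the percentage-fee sketch captures the incentive inversion described here: under cost-plus, rising costs raise rather than erode profit.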
From the contractor's point of view, there is also little incentive to improve
the quality of the final product. Most defense systems are contracted to a
single supplier and normal competitive rules do not apply. Moreover, there is no
economic advantage in preserving quality. In his suggestively-titled book More
Bucks, Less Bang, Pentagon cost analyst Ernest Fitzgerald notes:
Contracts for bad weapon programs make as many jobs as contracts for good ones.
Maybe more, because the big contractors are paid extra to work at correcting
their own blunders.
In this climate, it is clear that quality is not necessarily ensured by
increased funding. In the case of projects in which extensive programming is
required, however, following these traditional procurement policies may well
prove disastrous. The following sections examine two areas in which programming
is quite different from most other engineering disciplines and consider how
those differences affect the nature of the defense contracting process.
The Mythical Man-Month
Developing the complex software needed for a modern military system is
fundamentally different from building a tank or a plane. The construction of
military hardware relies on engineering disciplines that are relatively mature
and predictable. Defense contractors have considerable experience in managing
hardware development and plan their projects based on that experience.
Unfortunately, software engineering obeys a different set of rules.
In traditional hardware-based projects, increasing the size of the labor force
usually reduces the time required to complete the contract, since the workers
can operate independently. In programming, this is rarely the case. This
observation forms the central thesis of the highly regarded collection of essays
by Frederick P. Brooks, Jr., entitled The Mythical Man-Month. Professor Brooks
outlines the basic fallacy as follows:
Cost does indeed vary as the product of the number of men and the number of
months. Progress does not. Hence the man-month as a unit for measuring the size
of a job is a dangerous and deceptive myth. It implies that men and months are
interchangeable.
Men and months are interchangeable commodities only when a task can be
partitioned among many workers with no communication among them. This is true of
reaping wheat or picking cotton; it is not even approximately true of systems
programming.
Since software construction is inherently a systems effort -- an exercise in
complex interrelationships -- communication effort is great, and it quickly
dominates the decrease in individual task time brought about by partitioning.
Adding more men then lengthens, not shortens, the schedule.
The effect of required coordination on a task is illustrated graphically in
Figure 1, which is taken from Professor Brooks' text. In its most simply stated
form, the conclusion is expressed as "Brooks' Law: Adding manpower to a late
software project makes it later."
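The communication argument can be illustrated with a toy model: if each pair of programmers incurs a fixed monthly coordination overhead, total overhead grows as n(n-1)/2 while raw output grows only linearly, so net output eventually falls as staff is added. The unit-output and overhead constants below are illustrative assumptions, not measured values:

```python
# Toy model of Brooks' Law: pairwise communication overhead grows
# quadratically, while raw output grows only linearly.

def effective_output(n, unit_output=1.0, comm_cost=0.08):
    """Net monthly output (arbitrary units) of an n-person team.
    unit_output: work one programmer finishes per month working alone.
    comm_cost:   output lost per communicating pair per month.
    Both parameters are illustrative assumptions."""
    pairs = n * (n - 1) // 2          # number of communication paths
    return n * unit_output - comm_cost * pairs

for n in (1, 5, 10, 13, 20):
    # output rises, peaks near a dozen people, then declines
    print(n, round(effective_output(n), 2))
```

With these parameters the model peaks at about thirteen people; beyond that, each additional programmer reduces net output, which is the quantitative shape behind "adding manpower to a late software project makes it later."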
But what about the issue of quality? According to the argument above, additional
manpower may delay the delivery time but not necessarily reduce the quality of
the work. Unfortunately, quality also appears to suffer. For projects which are
overstaffed, imperfect communication and lack of coordinated development have a
negative impact on the conceptual integrity of the design. Drawing on his
experience with projects organized by the brute-force approach, Professor Brooks
concludes that the technique of adding manpower will "without a doubt, . . .
yield a poorer product, later."
The Range of Programmer Productivity
One of the more surprising features of software engineering is the extreme
variability in productivity among experienced programmers. Writing in The
Mythical Man-Month, Frederick Brooks describes the phenomenon as follows:
Programming managers have long recognized wide productivity variations between
good programmers and poor ones. But the actual measured magnitudes have
astounded all of us. In one of their studies, Sackman, Erikson, and Grant were
measuring performances within a group of experienced programmers. Within just
this group the ratios between best and worst performances averaged about 10:1 on
productivity measurements and an amazing 5:1 on program speed and space
measurements!
Thus, the top programmers in a typical research institution are able to produce
code which is qualitatively five times more efficient in one tenth the time. Of
course, such "superprogrammers" find themselves in considerable demand, and each
company has an enormous incentive to attract these extraordinarily competent
programmers.
Unfortunately, the problem of attracting such superprogrammers is complicated by
the fact that it is difficult to determine whether or not a particular job
candidate has the necessary potential. The Sackman study cited above, for
example, could find no correlation between experience and performance among the
sample group. Thus, traditional measures of experience, such as the employment
history listed on a resume, are not sufficient as selection criteria. Instead,
employment decisions must be made on the basis of information subject to a
relatively high level of uncertainty.
In hiring, the company seeks to maximize the total productivity-to-cost ratio
for the workers it employs, and therefore tries to attract as many
superprogrammers as possible. To identify such candidates, the company must
institute some sort of test for determining "superprogrammer potential" which is
then applied to each candidate. Given the uncertainty, however, the test will
have a significant failure rate. Some candidates who seem strong on paper will
turn out to be of average competence, while real superprogrammers may fall
through the cracks.
For the sake of concreteness, suppose that a superprogrammer has a productivity
advantage of a factor of twenty over the merely average programmer and that the
"superprogrammer test" has a false negative rate greater than 10%. It is
difficult to let a candidate walk out the door when there is at least a one-in-
ten chance that that programmer is capable of doing twenty times as much work.
The natural tendency, given the uncertainty, is simply to hire all of the
programmers you can afford in order to maximize the chance of acquiring
superprogrammers. Gerald Weinberg describes this phenomenon in the introduction
to The Psychology of Computer Programming: "one of the corollaries of this
[differential ability] . . . is the high salaries that must be paid to those
programmers who 'have it' -- and even to some that don't, just on the chance that
they do."
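The hire-everyone tendency follows from a one-line expected-value calculation. Using the hypothetical figures above (a twenty-fold productivity advantage and a one-in-ten chance that a rejected candidate is in fact a superprogrammer; neither is empirical data), turning a candidate away forfeits the equivalent of nearly three average programmers:

```python
# Expected productivity of a candidate the screening test rejects,
# measured in units of one average programmer. Both constants are
# the hypothetical figures used in the text, not measured values.

SUPER_FACTOR = 20.0      # superprogrammer output / average output
P_FALSE_NEGATIVE = 0.10  # chance a rejected candidate is super

expected = P_FALSE_NEGATIVE * SUPER_FACTOR + (1 - P_FALSE_NEGATIVE) * 1.0
print(expected)  # 2.9
```

Since a cost-plus contract passes the salary of every mistaken hire through to the government, the downside of keeping an average programmer is negligible while the expected upside of catching a superprogrammer is large, so hiring everyone is the rational policy.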
Of course, cost-plus contracting means that you can always afford to hire a
candidate. Since the Department of Defense will cover all costs incurred on the
contract, even a programmer who produces no usable software will not affect
profitability. Moreover, there is no compelling incentive to dismiss an employee
who proves to be non-productive. The marginal cost of keeping such an employee
on contract is zero, since that cost can be passed along to the contracting
agency. In fact, there is actually an economic advantage, in that the "bill-out"
rate (i.e., the amount the contractor charges the agency for the individual
employee's time) is augmented to include a generous overhead fee.
The real danger in overhiring occurs when the firm operates in both the
competitive marketplace and the cost-plus world of military contracting. For
economic reasons, the superprogrammers will most likely be assigned to those
projects where cost-management is essential and where the added expertise
corresponds to a competitive advantage. This is surely not in the sole-source,
cost-plus world of the government contractor, but in the contracts which this
company performs for the private commercial sector. Thus, just as cost-plus
procurement tends to attract the non-productive programmer, the internal
competition among different projects tends to ensure that the competent
programmers will spend most of their time away from the cost-plus contracts.
Similarly, when one finds a minimally competent programmer, it is to a company's
economic advantage to assign that programmer to a project on which the full
labor cost can be recovered. Thus, the cost-plus contracts tend to attract the
least competent programmers.
This problem is complicated further by the observation that the minimally
competent programmers may not be able to complete their assigned tasks. As
deadlines pass on contracts or programs fail to meet performance requirements, a
critical point may be reached at which it becomes necessary to call in a
superprogrammer to finish the work. Unfortunately, by the time this happens,
considerable damage has already been done. The less productive programmers have
not been entirely idle in the months before the superprogrammers take over. Most
of the design decisions and some of the actual programming work from the early
phase will eventually find their way into the final product. As the pressure to
complete a project grows, the superprogrammer's task often degenerates into
finding the quickest way to assemble the existing pieces of the project into a
marginally acceptable system -- specifically, one that works well enough to ship
to the contracting agency.
As we move toward a time when more and more of the operational control of
weapons systems depends on embedded software, the reliability of that software
becomes of paramount importance. Unfortunately, because software engineering is
qualitatively different from other engineering disciplines, the traditional
strategy for defense procurement gives us little reason for confidence that the
necessary standards of reliability are maintained. This suggests an urgent need
for reform in defense procurement practices and for a better understanding of
the dynamics of software construction.
Part of this work was produced in collaboration with Dr. Teresa Amott, Visiting
Assistant Professor of Economics at the University of Massachusetts, Boston.
From the Secretary's Desk
Laura Gould -- National Secretary
The thirty-minute slide show entitled Reliability and Risk, a project of
CPSR/Boston dealing primarily with the use of computers in weapons systems, was
completed in November and promptly won a prestigious Gold Medal for Best
Documentary at the New England Association for Multi-Image competition. Copies
of the tape version of this slide show have already been sent to all chapters
and are also available to contact people and members through the CPSR office in
Palo Alto. To publicize this excellent production, CPSR will mail over 5000
brochures this month to peace groups across the country. Copies of the tape
version of Reliability and Risk will soon be distributed by Ian Thiermann's
Educational Film and Video Project in Berkeley. We are negotiating with other
distributors and individuals to ensure wide dissemination.
CPSR/Seattle is planning a Symposium on Directions and Implications of Advanced
Computing to take place July 12, 1987, in conjunction with the AAAI Conference,
which will begin the following day in Seattle. We encourage our readership to
respond to the call for papers for this important Symposium (see insert in this
Newsletter).
CPSR's first major membership drive is progressing well. It commenced with full-
page ads in the November issues of the CACM and the IEEE Spectrum and was helped
along by smaller ads in PC World, Publish!, and MacWorld magazines, and by
various chapter activities. In December, a 75,000-piece bulk mailing went out to
selected entries from the ACM and IEEE mailing lists. About 400 new people
joined during the first few weeks in response to this mailing, a few as lifetime
members, and membership mail continues to arrive at the average rate of 25
pieces per day.
We are pleased to announce that a new chapter named CPSR/New Haven, centered
around Yale University, has just formed. Several new contact people have
appeared as well (please see contact information pages contained in this
Newsletter). We expect that several new CPSR chapters will form during the
coming year.
General elections will be held this spring for two Director-at-Large positions
and one Special Director position on CPSR's Board of Directors (please see
insert in Newsletter). Regional elections will also be held for various Regional
Director seats which will become available on July 1, 1987. We hope all CPSR
members will exercise their right to vote and will respond promptly to the
election materials which will be sent during the month of March.
Cliff Johnson's case against Caspar Weinberger, charging that launch-on-warning
capability is unconstitutional, was to have been heard in Federal District Court
in San Francisco on December 10. Judge Lynch has postponed the hearing until
January 30 and requested further briefing because Weinberger's attorneys
introduced new substantive arguments in their response, indicating for the first
time that they are taking the case seriously. (Their previous responses have
simply indicated that this is a political matter and not justiciable.) Their new
position is that neither Congress nor the judiciary can interfere with the
President's operation of a launch-on-warning capability, that this is inherently
his right given his constitutional war powers as commander-in-chief, and that he
has the authority to delegate control over the release of nuclear weapons to the
military. The Lawyer's Alliance for Nuclear Arms Control (LANAC) has filed a
petition to appear as amicus curiae in this case. More about this in the next
issue.
A New Telecommunications Tool
John Larson -- CPSR/Palo Alto
Electronic mail discussions were the catalyst for the formation of CPSR, but
until now the lack of connectivity between mail systems has been a barrier to
using this powerful tool for enhancing communication within CPSR as a whole. The
large international networks such as Arpanet, CSNET, Bitnet, uucp, and many
corporate networks are already connected to each other by various mail gateways.
Unfortunately, the commercially available electronic mail services provide
little or no access to these connected networks. (This will change in a couple
of years when X.400 mail gateways become common.) Many CPSR members, including
nearly all the national officers, Board members and staff, use electronic mail
on the connected networks, but many other CPSR members have no way to access
these networks to communicate with their colleagues.
PeaceNet, a nonprofit organization dedicated to the support of peace oriented
network activities, has broken down this electronic barrier. PeaceNet offers an
electronic mail service running on a powerful Unix minicomputer which is
gatewayed to the connected networks through Unix uucp connections. CPSR/Palo
Alto member Dave Caulkins and I have worked with PeaceNet founders from the
beginning, so we are confident that this system will be very useful for those
who need such network tools. PeaceNet offers access to many interesting
databases and conferences as well as electronic mail, and numerous peace
organizations are already making good use of this system. The system is
available to anyone who has a terminal or personal computer and a modem, and
most people will be able to access it via a local phone call through the GTE
Telenet system. The cost of using this system is in line with other commercial
services and well worth it for the services it provides. Furthermore, the money
supports a very valuable organization. If you are not currently able to use one
of the connected networks to communicate with other CPSR members, consider
writing or calling PeaceNet and asking for their information packet. Thanks to
PeaceNet, CPSR has the potential to become a much more connected and effective
organization.
PeaceNet, Berkeley, CA 94704
Privacy Bill Passed
On October 2, 1986, the Electronic Communications Privacy Act of 1986 won final
approval from Congress. The legislation is intended to bring existing law into
accord with new technologies such as computer-to-computer communication,
networks, and bulletin boards. The result of two years of hearings and
negotiations, the final version of the bill represents a broad coalition of
interests. President Reagan signed the measure into law October 21, 1986.
Title III of the Omnibus Crime Control and Safe Streets Act of 1968 (known as
the Wiretap Act) had for the first time explicitly extended the Fourth
Amendment's protection against unreasonable search and seizure to telephone
conversations. That legislation applied primarily to the "aural acquisition" of
voice communications carried by wire over some kind of common carrier. It
protected neither private networks, the transmission of data, nor stored
communications. The new law fills some of those gaps. It amends the Wiretap Act
to cover "electronic communications," defined as "any transfer of signs,
signals, writing, images, sounds, data, or intelligence of any nature
transmitted in whole or in part by a wire, radio, electromagnetic,
photoelectronic or photooptical system that affects interstate or foreign
commerce." It restricts disclosure of stored communications and provides civil
and/or criminal penalties for individuals who, without authorization,
"willfully" intercept or disclose the contents of electronic communications or
who access such communications while in electronic storage. It also sets out
procedures government officials must follow to intercept data and access stored
communications.
The new measure caps a bipartisan effort. Representative Robert Kastenmeier, D-
Wis., chairman of the House Judiciary Subcommittee on Courts, Civil Liberties
and Administration of Justice, sponsored hearings on civil liberties in 1984 and
drafted the House legislation. The Justice Department was at first "reluctant to
tinker" with the Wiretap Act, having found it a valuable law enforcement tool,
but the ranking Republican on Kastenmeier's subcommittee, Carlos J. Moorhead, R-
Calif., encouraged Justice to cooperate. Senators Charles Mathias, Jr., R-Md.,
and Patrick J. Leahy, D-Vt., cosponsored the Senate version of the bill.
Support for the new legislation was drawn from diverse groups, including TRW,
IBM, MCI, GTE, AT&T, the American Civil Liberties Union (ACLU), the Institute of
Electrical and Electronics Engineers (IEEE), and the Electronic Mail
Association, among many others.
1987 CPSR Annual Meeting and Banquet
October 17-18, 1987
Massachusetts Institute of Technology
The 1987 CPSR Annual Meeting and Banquet will be held October 17 and 18 at the
Massachusetts Institute of Technology in Cambridge, Massachusetts. The Annual
Meeting and Banquet are opportunities for CPSR members to meet and discuss the
issues around which CPSR was formed, and to celebrate the work of CPSR in the
previous year and the years to come. The Annual Meeting is a chance for CPSR
members to meet other members from around the country, and to meet colleagues
from around the world. The 1986 Annual Meeting was attended by computer
professionals from across the United states and from Italy, Norway~~ Australia
and other countries
Schedule of the Annual Meeting:
Saturday, October 17: The morning session of the Annual Meeting, to be held in
Building 10-250 at MIT, will feature several nationally recognized speakers on
issues such as computers and the Strategic Defense Initiative, accidental
nuclear war, and computers and civil liberties. A chance for questions and
discussion will follow each talk.
The afternoon session on Saturday will consist of a panel discussion on
computers and ethics, featuring Professor Joseph Weizenbaum of MIT and
Professor Deborah Johnson of Rensselaer Polytechnic Institute, among other
panelists. Again, there will be an opportunity for open discussion by the
audience.
Sunday, October 18: Meeting again in Building 10-250 at MIT, participants in the
Sunday program will have an opportunity to hear about the work of CPSR chapters,
discuss the state and future of the organization, and participate in small
workshops on issues of concern to CPSR members. Sunday will feature reports on
chapter work by chapter representatives, a report on the financial status of the
organization, a summary of international work, and then an open discussion about
the character of the organization. This is an opportunity for CPSR members to
express themselves to the national leadership and to fellow members about their
interests and goals for CPSR.
The CPSR Annual Banquet, Saturday evening, October 17: The CPSR Annual Banquet
will be held in Walker Memorial Hall on the MIT campus, overlooking the Charles
River. The Banquet is a celebration of the accomplishments of the organization,
and a chance to socialize. The Banquet will feature a keynote speaker, to be
announced, who will be a nationally prominent figure addressing an issue of
significance to the work of CPSR. Last year's Banquet featured both Dr. Herbert
Abrams, one of the recipients of the 1985 Nobel Peace Prize, and Congressman
Edward Markey, author of the nuclear freeze resolution in the House of
Representatives. Participants at the 1986 Banquet will undoubtedly recommend
that the 1987 dinner should not be missed.
Further details about the 1987 CPSR Annual Meeting and Banquet will be announced
in the next issue of the CPSR Newsletter, and in mailings to CPSR members.
Please reserve a space on your calendar to attend these events October 17 and
18, 1987. Help celebrate the work of your organization, Computer Professionals
for Social Responsibility.
Focus on High Tech
The Fifth Biennial Student Pugwash USA International Conference will be held
June 28 through July 4, 1987, at Stanford University. Student Pugwash USA is a
non-profit organization which promotes dialogue among students regarding the
social implications of science and technology; students from twenty-five nations
have attended in the past. This summer's conference will be entitled Choices for
Our Generation: Ethics and Values at the Cutting Edge of Technology and will
focus on the following issues:
- Dilemmas for the Future of Computing
- Reproductive Technologies
- Water: Politics, Pollution, and Supply
- Science and Technology in the Media
- Roles for the Biotechnologies in International Development
- Nuclear Proliferation and the International Control of Atomic Energy
For each of the listed topics, a working group of ten to fifteen undergraduate
and graduate student delegates and five accomplished senior participants will
meet throughout the week. Senior participants for the computing group include
Terry Winograd, professor of computer science at Stanford, and Andy Hertzfeld,
code designer for the Apple Macintosh.
Students are admitted to the conference by application, consisting of an abstract
and outline of a paper, a transcript, and a brief essay (deadline March 20,
1987). Those accepted will receive food and lodging for the conference. For
further information please contact:
further information please contact:
Benjamin Austin, Conference Director
Student Pugwash USA
505-B 2nd St.
Washington, DC 20002
Letters to the Editor
Doug Schuler's article in the last newsletter exhibited a pattern I've noticed
often in CPSR publications and in conversations I've had with other CPSR
members. The pattern goes as follows: an issue is raised; the issue is discussed
in a technical manner; non-technical (i.e.,
political/economic/ethical/philosophical) questions concerning the issue are
raised, and then the discussion ends. The problem I see here is that we never
seem to get around to addressing the non-technical questions. I find this
distressing because many of them have great importance to the issues we are
concerned with, and in fact must be addressed sooner or later if we are to be
effective in our efforts. An example of what I'm talking about occurs at the end
of Doug's article. There he posed the question of whether or not technology is
itself to blame for its being used in socially irresponsible ways. If this is
really a question, then shouldn't it be addressed prior to proposing
technological solutions to social/political problems? Doesn't this question
challenge the very premises upon which CPSR is built?
There is another element to the pattern described above. It is that there
usually is a disclaimer made that we as computer professionals don't have the
knowledge or experience to deal with non-technical questions. This is not an
acceptable excuse. If we are to be socially responsible we must address all the
important aspects of the issues we take on. This means tackling the
non-technical questions as well as the technical ones.
If, as it appears, we as a profession are ignorant of the means of answering
non-technical questions it seems to me that rectifying this situation should be
high on our priority list. Toward this end, I propose that we develop a
college-level course aimed at giving computer professionals an understanding of
how to conduct political/economic/ethical/philosophical inquiry, as well as a
familiarity with the significant issues that computer professionals need to
address. Once we've developed this course we can work toward its incorporation
into CS programs at schools offering CS degrees, its inclusion in one form or
another into professional seminars, and its use as a basis upon which study
groups in CPSR chapters can be formed. Those interested in participating in such
a project may contact me with their ideas and suggestions on how it should be
pursued. I'll compile all the responses and route them to all those who have
shown an interest. From there we can decide what our next step should be.
4314 NE 72nd Ave.
Portland, OR 97218
(503) 288-1991
The last CPSR Newsletter contained two articles, "A Responsible Computing
Initiative" by Doug Schuler and "Not Without Us" by Joseph Weizenbaum, each of
which was a personal statement. Although I share their frustration that our
profession is so heavily biased toward warmaking rather than humane goals,
nonetheless I have reservations about both articles.
In the case of Schuler's article, my complaint is that he fails to address what
seems to me to be the underlying problem. Certainly, we put too much into
"defense" and too little into other societal needs: medical research, ecological
concerns, etc. But it is too easy just to name things we wish our profession
were doing. The problem, unfortunately, is not a lack of ideas for better ways
to spend research money; the problem is the set of priorities by which our
government and society allocate funds. And we can blame the government only to a
limited extent. We, after all, elect them, and society by and large supports
existing priorities.
Of course these priorities are neither accidental nor whimsical. They stem from
a powerful military-industrial complex, coupled with an attitude about our
"enemies" that has been carefully nurtured in the public mind for well over a
generation. It isn't easy to see how to change these things and we must be
realistic about the power that a limited number of computer people concerned
about social matters can wield. No doubt holding up an alternate vision is
helpful, but selling it to the society seems likely to be the tough part. What I
would like to see are some realistic proposals in that area.
Finally, I want to object to the title, "A Responsible Computing Initiative,"
which somewhat pejoratively suggests that existing initiatives are
irresponsible, without offering any justification for that charge. It seems to
me that this amounts to name-calling and is more likely to offend than titillate
those who feel that the threat of Soviet expansionism is real. Many responsible
people feel that suggestions to reduce "defense" spending are themselves
dangerously irresponsible. Although I disagree with existing priorities, I feel
one is obligated to explain why one feels as one does. Ignoring or summarily
dismissing such concerns makes one seem just as myopic as "the other side."
In assessing Professor Weizenbaum's article I am left on the horns of a dilemma.
We are admonished to avoid working on vicious projects, yet at the same time we
are told that any research may potentially be used for such purposes. The
logical conclusion is that the only way to be sure one does no research that
contributes to warmaking technology is to do no research at all. Although
Weizenbaum seems to stop short of recommending that, he fails to make clear what
he is recommending.
It seems to me that while one might conclude that most technology has served
mankind's aggressive purposes far better than his humane ones, nonetheless few
people are prepared to act on that conclusion. Instead, most of us regularly
face difficult decisions about what we will and will not do. There are no clear
lines, and each of us must draw his or her own, always a difficult task. In the
last analysis, all I would recommend is to "let your conscience be your guide." I
applaud Weizenbaum's reminder that these are responsibilities we should not
shirk and that we all do have consciences.
CPSR National Chairman