COMPUTER PRIVACY THREATS AND REMEDIES
Lance J. Hoffman
Department of Electrical Engineering and Computer Science
The George Washington University
Washington, D. C. 20052
hoffman@seas.gwu.edu
Abstract
Some potential new problems related to privacy and computers are presented, and the idea of a privacy audit is introduced. Some network privacy and security issues are mentioned, and a standard of due care is discussed. Some possible incentives for producing more privacy-sensitive systems are suggested.
1. Introduction
Many would argue that the elements of invasions of privacy have been creeping into place as more and more computer systems, networks, and applications have blossomed. Systems have now been developed such as those supporting on-line submission of income tax returns, on-line criminal histories, and on-line telephone directories available nationwide. Each could lead to more invasion of privacy by making more personal information available to more people at an inexpensive price. On the other hand, each could also lead to more privacy -- by eliminating proliferation of paper and excess paper shuffling.
2. Current Concerns
There are a number of current concerns related to technology, freedom, and privacy. We discuss just a few here.
2.1. Telephone Personal Identification Number
Personal Number Calling (PNC) "may represent the most revolutionary shift in telephone service since direct distance dialing was introduced in the 1950s." [Dorros 1990] With PNC, a phone number is permanently assigned to an individual; no longer would "fixed location" telephone numbers be called. Along with the benefits (instant communications while traveling, for example) come the possibility of more invasions of privacy and an unprecedented ability to keep track of a given person. Nippon Telegraph and Telephone expects to introduce this in major Japanese cities by 1992.
2.2. Universal Identification Number
PNC raises a concern which has really been an important issue for some time [HEW 1973, PPSC 1977]: that of the universal identification number. Obviously a unique universal identifier for each person makes administration of public and private (non-computer and computer) programs much simpler and more efficient. On the other hand, in the hands of a tyrant it can make the harassment or elimination of a given target population much more efficient as well.
2.3. Data Mining
The recent well-publicized introduction and subsequent withdrawal of the Lotus Marketplace:Households product is instructive. This system, a series of CD-ROMs containing personal information on 80 million American households, would have contained data fields of name, address, age, gender, marital status, household income, lifestyle, dwelling type, and buying propensity. Some of the data was actual, some of it modelled. The data was to have been provided by Equifax, one of the largest credit reporting agencies in the United States.
If individuals wished not to be included, they had to provide Lotus with their social security number, since the data supplied by Equifax was keyed to that number, which is almost a unique identifier. This, of course, facilitates the compilation of (new) personal information and raises many of the same problems as the use of a universal identifier does.
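The linkage problem described above is easy to demonstrate. The sketch below is hypothetical (all records, field names, and numbers are invented): two data sets that share a near-universal key can be merged into a richer personal profile with a few lines of code.

```python
# Hypothetical illustration: two unrelated data sets keyed on the same
# identifier (here a fictitious "ssn" field) can be joined into a new,
# richer profile of each subject.  All records below are invented.

credit_data = {
    "123-45-6789": {"name": "A. Smith", "income_band": "40-60K"},
    "987-65-4321": {"name": "B. Jones", "income_band": "60-80K"},
}

purchase_data = {
    "123-45-6789": {"buying_propensity": "high", "dwelling": "apartment"},
}

# Joining on the shared key compiles new personal information that
# neither data holder possessed on its own.
profiles = {
    ssn: {**credit_data[ssn], **purchase_data[ssn]}
    for ssn in credit_data.keys() & purchase_data.keys()
}

print(profiles["123-45-6789"])
# {'name': 'A. Smith', 'income_band': '40-60K', 'buying_propensity': 'high', 'dwelling': 'apartment'}
```

The point of the sketch is that no special access is needed: any party holding two files keyed to the same identifier can perform this join.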
Lotus Marketplace raised unique privacy problems. As was pointed out in the Wall Street Journal of January 23, 1991,
    "Marketplace touched a raw nerve among consumers, and took on a broad symbolic significance in the debate over electronic privacy. When Lotus offered to delete data about anyone who called or wrote, it was flooded with about 30,000 requests. Consumers learned about the product through widespread news reports. ... Marketplace also became one of the hottest topics on the computer networks linking technology students and professionals. Complaints and protest letters were posted and copied on hundreds of networks. Opponents circulated Lotus's phone number and the electronic-mail address of Jim Manzi, its chief executive officer. 'If you market this product, it is my sincere hope that you are sued by every person for whom your data is false, with the eventual result that your company goes bankrupt,' declared one letter to Lotus that was posted on several networks.

    "Privacy advocates' chief objection to Marketplace was that it wouldn't be easy enough for consumers to delete their data, or correct any inaccuracies. They worried that even if Lotus offered to update the disk with corrections and deletions, offending earlier versions would still go on sale."
The Code of Fair Information Practices [HEW 1973] requires that data holders provide individuals with access to personal records and also an opportunity to review the records and to make corrections. An important principle in the Code is that personal information acquired for one purpose should not be used for another purpose without the individual's consent. This principle is routinely ignored by direct marketing organizations.
All of the information contained in Marketplace was acquired indirectly. Virtually none of the record subjects would have been provided an opportunity to consent to the use of their personal information in this manner.
By way of contrast, the most common public directory -- the telephone book -- allows individuals to determine, at the time they request phone service, how they would like their personal information disclosed to the public. Single women often substitute an initial for a first name and many phone subscribers do not include street listings. Most important, phone users may choose to remain unlisted and many exercise this right. [Rotenberg 1991]
3. Privacy Audits
Future developers of similar "data base mining" products will soon come to realize that they have to subject their systems to some sort of "privacy audit", rather than just "strip mine" the data. This privacy audit would examine what types of information are in the database in the first place, what it could possibly be used for, how it is protected, how data subjects are allowed to withdraw their names, and, if that is impractical or impossible, what the system can do to deter improper use.
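The audit questions above can be treated as a concrete checklist. The following sketch is purely illustrative (the field names and the sample product description are invented, not drawn from any real audit standard): it records the questions and reports which ones a product has left unanswered.

```python
# Hypothetical sketch: the privacy-audit questions from the text expressed
# as a checklist that can be run against a product description.  Field
# names and the sample product are invented for illustration.

AUDIT_QUESTIONS = {
    "data_inventory":    "What types of information are in the database?",
    "possible_uses":     "What could the data possibly be used for?",
    "protection":        "How is the data protected?",
    "withdrawal":        "How may data subjects withdraw their names?",
    "misuse_deterrence": "What deters improper use when withdrawal is impractical?",
}

def privacy_audit(product: dict) -> list:
    """Return the audit questions the product description leaves unanswered."""
    return [q for key, q in AUDIT_QUESTIONS.items() if not product.get(key)]

# A Marketplace-like product that has answered only the first two questions.
marketplace_like = {
    "data_inventory": "name, address, age, income, buying propensity",
    "possible_uses": "direct marketing",
}

open_items = privacy_audit(marketplace_like)
print(len(open_items))  # 3 questions remain open
```

A real audit would of course involve judgment, not field lookups; the sketch only shows that the questions themselves are enumerable and checkable.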
The data in the system could be protected by encryption. Of course, once the legitimate user is authenticated and enabled to read the data, it becomes unencrypted to him or her and can then be electronically transferred in unencrypted form.
A centralized system could keep track (at some cost) of who uses what by maintaining a log of accesses throughout the system. There may be technical solutions that safeguard individual rights and expectations while not raising the costs of new systems to a prohibitive level.
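A minimal sketch of such an access log, under the assumption of a single centralized store (class and field names are invented for illustration):

```python
import datetime

class AuditedStore:
    # Hypothetical sketch of the centralized logging the text describes:
    # every read of a subject's record is recorded with who asked and when.
    def __init__(self, records: dict):
        self._records = records
        self.access_log = []          # (timestamp, user, subject) tuples

    def read(self, user: str, subject: str):
        self.access_log.append((
            datetime.datetime.now(datetime.timezone.utc).isoformat(),
            user,
            subject,
        ))
        return self._records.get(subject)

store = AuditedStore({"subject-42": {"income_band": "40-60K"}})
store.read("analyst-1", "subject-42")
store.read("analyst-2", "subject-42")

# The log can now answer "who has looked at subject-42's record?"
readers = [entry[1] for entry in store.access_log]
print(readers)  # ['analyst-1', 'analyst-2']
```

The cost the text mentions is visible even here: every read pays for a log write, and the log itself becomes sensitive data that must be protected.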
Decentralized systems rely, of course, on legal safeguards. But it is well known that the law always lags behind new technology. When and if the whole legal safeguard system collapses, what are the implications for the unfortunate data subjects trapped in the middle? And who is responsible for any damages to these data subjects?
Technical safeguards for any system should be addressed very early, just as legal safeguards are, and appropriately budgeted for and put in place. Designing in security is far more efficient and effective than adding it as an afterthought.
4. Network Security and Privacy Concerns
The proliferation of computer networks with no policy guidelines raises a number of issues, among them the following [Hoffman 1991]:
- Are we designing the same privacy mistakes into our computer and communications networks that we already see in the telephone network (obscene calls, caller ID, etc.)?
- Who (if anyone) should do gatekeeping before persons are allowed to connect to a net? What, if any, rules should there be?
- Is it possible to track illegal accesses and to effectively thwart and/or sanction them?
- Is it inevitable that some users or programmers will be regulated [Kocher 1989, Hoffman 1989]?
- Are network operators liable for legal damages if data subjects are wronged?
Just recently, various Internet working groups have started looking more closely at security and privacy policy issues.
5. Should New Products Meet A Standard of Due Care?
Years ago, Donn Parker pointed out certain analogies between computer software and other products, and suggested that certain standards (he called them "standards of due care") were necessary before a product should be deemed marketable. In their attack against Lotus Marketplace, Marc Rotenberg and Computer Professionals for Social Responsibility (CPSR) argued that the product did not meet either of two standards (the Code of Fair Information Practices [HEW 1973] and the ACM Canons of Conduct [ACM 1982]). The idea of requiring a new computer product to be "built to code" is not novel; it is already in use in the computer security community, where (especially) defense contractors and agencies specify systems with a certain "level of trust", as defined in the "Orange Book" [NCSC 1983]. So who decides, and who awards the "seal of approval"? There is not (yet) an Underwriters Laboratory for the computer field, but recently some interesting suggestions have been made in this regard [NRC 1990] for the subfield of computer security.
6. Incentives
Some new products have the potential for injury, but that is not enough. Legislators are loath to regulate when there is not enough injury shown. Legislation which does not hinder the development of new technologies but which causes them to be developed in a responsible way may be one answer, although we want to be wary of over-legislating. There are other solutions as well, which involve changing the reward system of computer scientists and other technologists. This could be done in a number of ways: putting teeth in the enforcement of IEEE and ACM codes; having the professional societies give awards for the most responsibly designed systems; and/or publishing a "dirty dozen" list of the worst system designs with respect to privacy and freedom.
7. Summary
We have discussed some new problems related to privacy and computers (including networks) and recalled others' ideas of a privacy audit and a standard of due care. Some possible incentives for producing more privacy-sensitive systems were also suggested.
References
[ACM 1982] ACM Canons of Conduct, in Weiss, Eric (ed.), Self assessment procedure IX: a self-assessment procedure dealing with ethics in computing. Communications of the ACM 25, 3 (1982), 183.
[Denning 1990] Denning, Peter J., Computers Under Attack: Intruders, Worms, and Viruses, Addison-Wesley Publishing Co., New York, 1990.
[Dorros 1990] Dorros, Irwin, "Calling People, Not Places", Communications Week, p. 12, Sept. 3, 1990.
[HEW 1973] "Records, Computers, and the Rights of Citizens", Report of the Secretary's Advisory Committee on Automated Personal Data Systems, U. S. Dept. of Health, Education, and Welfare, July 1973, US GPO Bookstore Stock No. 1700-00116.
[Hoffman 1991] Hoffman, Lance J. and Paul C. Clark, "Imminent Policy Considerations in the Design and Management of National and International Computer Networks", IEEE Communications Magazine, February 1991 (also appeared in Computing Research News, Vol. 3, Nos. 1 and 2, January and March 1991).
[Hoffman 1990] Hoffman, Lance J., Rogue Programs: Viruses, Worms, and Trojan Horses, Van Nostrand Reinhold, New York, 1990.
[Hoffman 1989] Hoffman, Lance J., statement before the Subcommittee on Criminal Justice of the House Judiciary Comm. on computer virus legislation, November 8, 1989.
[Hoffman 1980] Hoffman, Lance J., Computers and Privacy in the Next Decade, Academic Press, New York, 1980.
[Kocher 1989] Kocher, Bryan, "A Tangled Web of Laws", Communications of the ACM 32, 6 (June 1989), 660-662 (reprinted in [Denning 1990]).
[NCSC 1983] National Computer Security Center, DoD trusted computer system evaluation criteria, CSC-STD-003-85, National Computer Security Center, Ft. Meade, Md., 1985.
[NRC 1990] National Research Council, Computers at Risk: Safe Computing in the Information Age, National Academy Press, Washington, D. C., 1991.
[PPSC 1977] Personal Privacy in an Information Society, the report of the Privacy Protection Study Commission, Washington, D. C., U. S. Government Printing Office, 1977.
[Rotenberg 1991] Rotenberg, M., "Privacy Protection and Lotus Marketplace: What Position Should the ACM Take?", submitted to the ACM Committee on Scientific Freedom and Human Rights, 1991.
[Westin 1972] Westin, A. F. and M. A. Baker, Databanks in a Free Society, Quadrangle Books, New York, 1972.
Copyright, 1991, Jim Warren & Computer Professionals for Social Responsibility All rights to copy the materials contained herein are reserved, except as hereafter explicitly licensed and permitted for anyone: Anyone may receive, store and distribute copies of this ASCII-format computer textfile in purely magnetic or electronic form, including on computer networks, computer bulletin board systems, computer conferencing systems, free computer diskettes, and host and personal computers, provided and only provided that:
- this file, including this notice, is not altered in any manner, and
- no profit or payment of any kind is charged for its distribution, other than normal online connect-time fees or the cost of the magnetic media, and
- it is not reproduced nor distributed in printed or paper form, nor on CD ROM, nor in any form other than the electronic forms described above without prior written permission from the copyright holder.