

CFP'91 - Hoffman 1

Personal Information & Privacy-I

Tuesday, March 26, 1991

John Baker

Janlori Goldman

Marc Rotenberg

Alan F. Westin

Lance Hoffman, Chair

Copyright (c) 1991 IEEE. Reprinted, with permission, from The First Conference on Computers, Freedom and Privacy, held March 26-28, 1991, in Burlingame, California. Permission to copy without fee all or part of this material is granted provided that the copies are not made or distributed for direct commercial advantage, the IEEE copyright notice and the title of the publication and its date appear, and notice is given that copying is by permission of the Institute of Electrical and Electronics Engineers. To copy otherwise, or to republish, requires a fee and specific permission.

Published in 1991 by IEEE Computer Society Press, order number 2565. Library of Congress number 91-75772. Order hard copies from IEEE Computer Society Press, Customer Service Center, 10662 Los Vaqueros Circle, PO Box 3014, Los Alamitos, CA 90720-1264.

HOFFMAN: ... Welcome to the first of two sessions on personal information and privacy. My name is Lance Hoffman. I'm professor of Computer Science at the George Washington University in Washington, D.C., and I am the moderator for both ... sessions.

In this first session, we have two individual presentations followed by a debate - the "Mother of All Debates" some have called it [laughter] between Alan Westin and Marc Rotenberg. ...

Our first speaker is Janlori Goldman. Janlori is an attorney who directs the American Civil Liberties Union Project on Privacy and Technology. The project, which was created in the auspicious year of 1984, looks at the way that new information technologies affect individual privacy.

GOLDMAN: Hi. This is really a wonderful, wonderful conference. I keep looking around and seeing faces, people I haven't seen in a couple of years, and it feels like kind of a coming together - like a reunion of people who have worked on these issues for a long time. So I appreciate the opportunity to be on this panel today.

I think that this conference is really a testament to the fact that something really is happening in the marketplace. I had a few words with Rob Veeder after the panel - about being negative - and certainly this conference is about being positive. It's about recognizing that there is something happening in the marketplace: That things are shifting. The debate is changing. The volume is rising.

And people are starting to understand. When I use the term "people," I include industry and the federal branch, the executive branch, in this inclusive term. People are starting to understand that there is a very deep-seated concern that people have about privacy.

Part of what is happening is that people are starting to catch on to information practices in the industry. As they're catching on they get a little annoyed over here, or a little annoyed over there, and little by little you develop a groundswell. There are a number of advocates who work on this issue and a number of people who really do care about it.

Sometimes the way that people figure it out is subtle. They'll get a piece of mail addressed to them that kind of looks customized. It'll say, "Dear Ms. GOLDMAN: Since you've just moved to the suburbs and you've bought a new car," - none of this is true by the way. [laughter] "You've bought a new car, and you've just had another child, and you just got this big raise" - that's especially not true - "and, ... you've been travelling to Hawaii a lot lately. You're going to be really interested in this product we have." Someone gets a letter like that and it clues them in that something's going on out there. Maybe they've never even heard of this company before. So they wonder, "How did they figure this out about me?"

In fact, a Direct Marketing Association brochure, which is supposed to allay some people's fears about this kind of practice, has a little blurb on the front where it says, "How did they get my name?" You're supposed to open the brochure, read it and feel OK about how they got your name.


Well, a lot of people are not relieved. They don't want them to have their name. And their salary. And how many children they have. And where they live. And what their hobbies are. And what their lifestyle is. Or at least what the company marketing to them thinks all those things are.

Sometimes it's subtle like that. Or maybe it's subtle - like everywhere they go people say, before they even talk to them, "What's your Social Security number?" [laughter] I had this happen to me recently. I'm not in the practice of telling anecdotes, but it's an intimate crowd, I guess.

You know, I called up the Higher Education Assistance Foundation, who is hassling me on a daily basis about my student loans. I picked up the phone, called them - they have this lovely 800 number - and I said, "Hello." The woman said, "What's your Social Security number?" [laughter]

The first thing she said to me! I said, "Oh, boy." Now, I don't think it's just because I do this privacy work on a daily basis that I was really annoyed. I said, "Don't you even want to say hello, or find out why I'm calling? Could we have at least the appearance of some friendliness here?" [laughter]

No! She wanted my Social Security number. It makes sense. She wants to call up my file and talk to me based on what's in my file. But it still gets on my nerves. Someone who's here today, whom I've worked with before, said that she has a practice of never giving her Social Security number, ever, ever, ever - no matter what. I said, "How do you live in the world? You got a driver's license, and you're able to get certain services and you can go to the doctor and you can file insurance claims - you can do all those things?" I'm marvelling at this, and she assured me that she can. Well, she's a very special person and has a lot of chutzpah, and probably people would give her anything whether she gives her Social Security number or not. But not everyone is so lucky.

Sometimes the way that people find out is not so subtle. One of the experiences I had a lot of fun working on was the incident where Judge Bork [Supreme Court nominee] had his video-rental list disclosed during his confirmation hearings in the Senate. Not so subtle. He had no idea that the [list of] movies that he and his family had been renting at this Georgetown video store could be made available at any time to anybody who wanted it. Now, he probably knew, if he thought about it, that there was no federal law that protected him against such a disclosure, but I don't think it ever occurred to him that that information could be disclosed to anybody who wanted it.

When he did find out, and when a number of U.S. senators found out, and other members of Congress and a number of other people, they were really outraged. That moved them. It moved them enough that we now have something called the Video Privacy Protection Act. [laughter]

You didn't know about it? Come on. In fact, when I tell people about the disclosure of Judge Bork, they get very nervous and they start to think, "Oh. All those movies I've been renting." And they get very uptight.

I assure them that there is a law now that protects their right to rent any video that they want and nobody can get that information except, of course, the federal government if they have a search warrant. There are reasons why they could want it - to place you at a particular place, or if you've been renting "those" kind of movies and showing them to kids.

There are all these cases that have come up where the Video Privacy Protection Act has kicked in.

Another example that was not so subtle was when Vice President Quayle's information from his credit record was disclosed in a Business Week article on privacy. Not very subtle, that this information about him - collected because of his credit-related transactions, because of the credit-related activities that he engages in - can be available to somebody. On-line. For a price. There are laws that are supposed to prevent that, but it's technologically possible and actually very easy.

And the Library Awareness Program - another example of a not-so-subtle intrusion. The FBI decided that the best way to catch certain Soviet spies and their assistants was to go into the library and get access to ... borrower records. [There's] a lot of concern about that, and an effort to pass legislation as part of the Video Privacy Act failed. The FBI really does want this stuff and they think that they have a good reason to get it - so we weren't able to get that protection.

But the point of all this - it's like the Emily Litella line, "What's all this I hear about privacy?" A number of years ago that was really true. I think that what we're seeing in the last couple of years, particularly this year, is that the debate is shifting. You've got concern - and an understanding, a heightened consciousness, on the part of the public about how information about them is used.

...I don't think people are as concerned about how information is used in the transaction that they engage in. They go to the doctor; the doctor needs the information to do a good diagnosis. You apply for insurance; the insurance company needs it to decide whether to reimburse you. To get a driver's license, they need it in that context. People are willing to give information in those circumstances.

What they're concerned about is the second use, the "second profit," the second sale of the information - without their knowledge and without their consent. They eventually find out about it, whether it's getting the mailing or a reporter [writes] a story ... about movies they've been watching - and they figure it out.

What we've also seen, which is most encouraging, is not just that people are angry, that they're outraged, that they're uncomfortable, that they want to do something about it - but there has been a response. The response may not be as encouraging as some people would like, but it looks as though it's building. You've got an office as part of the White House called the U.S. Office of Consumer Affairs. Its former director, Dr. Bonnie Guiton, made privacy one of her priority issues in that administration. She did a great deal of public speaking about it, testified at hearings in favor of legislation on privacy. Everyone was saying: "You sure she's going to testify in favor of that legislation?" And she did.

It was important to her. While there may not have been the kind of groundswell of administration support for privacy, there was certainly a beginning. In fact, at a conference last year, Bonnie Guiton did read a statement from George Bush, not terribly like what David Flaherty gave a little while ago, but something that at least mentioned the word privacy and was not derogatory. [laughter]

I think that what we've seen in terms of the industry response has also been very encouraging. You've got the Direct Marketing Association, with their privacy guidelines and their support for consumers to be able to have some control over information about them. The extent of that control is something that we'll be debating this afternoon.

You've got Equifax responding to the concern about Lotus Marketplace by saying: "There are so many concerns that have been raised here, we can't fix these concerns. We can't take care of all the privacy concerns that have been raised. We're not going to offer this product."

You've got some telephone companies responding to the concern about caller ID and saying, "People want blocking, we'll be offering blocking." Some others are being forced to respond in that way by local PUC's [state Public Utility Commissions] and by courts. There's federal legislation pending on that.

But the issue really is that there is a response. And, while I'm one of the newcomers to this privacy issue - I've only been doing it for 4-1/2 years, and a number of you have really made this your life's work - it is a change from what I saw a number of years ago. The small response at least shows that when the mistakes are made - when there's a misgauge of the public's concern about privacy - there's some way to adjust it. There is some kind of response.

So what is it about? Is it just that people are yelling and screaming a little louder now, and so there's some way to keep them quiet in the marketplace? I would hope that it's a little bit more than that.

Professor Tribe's remarks this morning were very inspirational in a number of ways. One of the important privacy issues, I think, is that people need to be able to control personal information about themselves when they enter into a ... business transaction, or when they apply for benefits, when they go to get a license. Whatever it is, they need to maintain some level of control over that information - even when they give it over to receive the services. Because otherwise they lose the ability to define themselves in the world. And I can speak globally now, since we're talking about EC'92 [European Community 1992].

It's very important that people say, "...I want you to know this about me, but not this" - that people have the freedom to make choices without always worrying that somebody is looking over their shoulder, or can look over their shoulder, or can find out something about them which could be damaging at some point in life. Or even if it's not damaging, it presupposes that people have something to hide - and that privacy laws are just there to protect people that have something to hide.

It's not about that at all. In fact, most of the people who are deeply concerned about it have nothing to hide. But they want to be able to make decisions about things that they buy and services that they receive without worrying that someone's going to make a judgment about them based on that information.

First of all, it's probably not going to be a complete judgment, and second of all, we live in a world where you can't escape those kinds of judgments. Some telephone company people have used this analogy to me about caller ID, saying, "It's just like it was in the old days with the small town. When you lived in a small town, you always had the operator connecting your calls from one to the other. And everybody knew everything. Well, it's just like that. We're just going back to the old days of the telephone."

Something doesn't sit right about that argument. Partly it's because that's not the world that we live in anymore. It doesn't make sense.

People in small towns could move if something damaging about them came out or if they were ostracized or they felt alienated from their community. They could move and start again. You can't do that anymore, because the information trail is electronic and it follows them. So you can't really start again. You can keep moving, and you punch in that Social Security number and it calls up information from a number of different databases, and there it is. It doesn't really matter anymore what community you live in. Those boundaries, those borders have really broken down.

Alan Westin had talked last year about "the information bargain," and that there's a bargain that consumers enter into when they're receiving certain goods and services. And they're willing to give over certain information in order to receive something. I think that's a good analogy. I would just take it one step further and say that in some situations you need to be careful, because people are not always on equal bargaining footing. Maybe it's my legal background - about unconscionable contracts, and people having to be on equal footing, and you can't have someone who's too powerful and someone who's not powerful, and then enter into a contract which later really only benefits one side.

But in this situation I think you've got to consider that there are people who are willing to give over any information. And sometimes they have to do it, maybe to receive AFDC [Aid to Families with Dependent Children]. It's not a question of, "I don't want to give you my Social Security number. I'd rather not eat, thank you." That is not an equal information bargain.

Now maybe there are other circumstances where it would be an equal bargain. For instance, the Department of Motor Vehicles: "We'd like to take this information about you and we're going to give you your driver's license" - so there's that exchange - "and we also would like to sell it. We find there's a real market for this kind of information and we'd like to sell it. So our second bargain with you" - because this is the second transaction - "is, 'When we sell the information, we'll compensate you.'"

This idea does not start here. A number of other people have put that forward. It's an interesting idea. It's one that, I think, is do-able, particularly when you look at the whole information-bargain analogy that Dr. Westin has talked about.

This is really more about ... money. But it's also about the ability to be able to control information about yourself. The core values that we heard about this morning, that Professor Tribe talked about - that there are core, kind of "first-principle values" at heart here. We need to decide what kind of a society we want to live in, what kind of values are really a part of our Bill of Rights.

One of his suggestions about creating a 27th Amendment gives me pause. It makes me a little bit uneasy, because my belief is that the rights that we're talking about here today are already in the Bill of Rights, and the Court is just reading it wrong - that this whole talk about Katz v. United States and "reasonable expectation of privacy" is a very damaging proposition. At the time, it was considered ground-breaking because it overruled Olmstead.

But when you talk about reasonable expectation of privacy, and you talk about all the new devices that can intrude upon your privacy, and you know that they exist, your expectation is no longer reasonable. And nobody's going to call it reasonable. The court certainly doesn't call it reasonable. What we need to do is either get the Court in better shape - which is probably not the most optimistic approach - but to really look at ways that we can create these broad-based principles and apply them. Thank you. [applause]

HOFFMAN: Thank you, Janlori. ...

Our next speaker is John Baker, Senior Vice President of Equifax. John is in charge of consumer, public and government relations. He has had a number of management responsibilities, including the direction of the credit-bureau and marketing-service activities, including pre-screening and direct marketing.

BAKER: Thank you. Hi. I appreciate very much being invited to this forum and having the opportunity to express some thoughts about privacy from the viewpoint of a company that has provided information services since, really, 1899.

Some of my associates - I'll tell you - both inside and outside of Equifax, expressed surprise in various words and phrases of caution about participation in this conference, as if to warn against going into the lion's den. But I want to tell you that our view, more than ever before, [is that] we really need to understand emerging issues and concerns about privacy, and also to explain the value of our services and the steps that we have taken, the steps that we will take, to assure high information standards and a proper privacy balance.

I'd like to [outline some issues]. I won't spend a lot of time on information benefits, for fear of being booed too loudly. But I would like to start out by saying that we think our information services are of high value to the businesses that we serve, and to consumers, and to the economy. We believe that we help make possible better goods and services, and better prices. People can have checks or credit transactions approved quickly and conveniently.

They receive wider product choices. And we think the information flow makes for a pretty efficient economic system, with better resource allocation and more competitiveness in the business environment. New business start-ups can flourish in this kind of environment. In fact, our infrastructure of information and technology is really one that many countries with lower standards of living are trying to achieve. Oftentimes, we are their economic, as well as their information, model.

And at the same time - and the concerns that Janlori talked about are certainly real - we know that there are privacy impacts from the dramatic increase in computer capability, storage capacity, information sources and overall technology. The federal Fair Credit Reporting Act [FCRA], which regulates many of our services, is 20 years old and obviously does not speak to all of these new circumstances. While we know that consumers are more aware of credit bureaus than they used to be, a significant education challenge lies ahead if people are to understand their rights, their responsibilities and, in fact, their opportunities. And, as has already been overdocumented, I will say we've heard the concern about new combinations of information and technology, such as those incorporated in the now-defunct service of Lotus Marketplace.

Our response to this environment is one of attentiveness, sensitivity and concern. We've been active in testimony about possible new legislation. We developed our own set of fair-information practices. We asked Alan Westin to conduct privacy audits of our various services and procedures. And we did decide jointly, with Lotus, to stop development of the Marketplace product, realizing that the combination of information, new technology and a wide distribution capability caused a great deal of consumer fear.

I really hadn't planned to say much more about Lotus, but I've gotten so many questions about the Marketplace product that I thought I'd say a few more words and digress a bit. Yes, of course it was a tough decision. It was based on consumer concerns as well as an economic evaluation of the investments that were made, would have to be made, and the return on those investments. There was a considerable amount of time and money invested. There were elaborate privacy mechanisms that were developed, some of them with the help of a number of the people in this room.

But in the final analysis, we realized that we really couldn't overcome fears. I'll list a few fears that we think were the major ones.

First, ... the fear was that there was not a good screening process for prospective users, whereas we think there was.

There was a fear that improper use would be made of the information because of the wide distribution. That was the toughest one. We could not overcome that one.

There was a fear that more data might be incorporated, or eventually incorporated, into the product than was actually the case. People thought that you could look up marketing information about an individual, and that it would be highly sensitive, even perhaps including credit data, as opposed to estimates and aggregations and geographic segmentation, which was really the case.

And I think that another key factor was the delay factor of people who wanted to opt-out [who] might have to wait for the next release of the disc in order to have that opt-out effective. There were also issues about data encryption.

So I wanted to just spend a moment on those points because I know that this is a key issue in many of your minds. We want to listen carefully to consumers and also to privacy activists. And I think our action shows that we listen.

In fact, our survey that I'm going to talk about in a minute shows that the public believes consumer-advocacy groups have an important role to play in protecting privacy. And I might say that we, in Equifax, pay special attention to those groups and people who themselves appear to take into account not only privacy concerns but also the value of information systems to consumers and to our economic growth and development.

[Please consider the extremes on these issues.] ... On one extreme, a viewpoint that there should be no restrictions on information use. On the other extreme, a viewpoint that privacy concerns should always be dominant and information use considered of little value. You might enjoy trying to place yourself or your organization on this continuum.

On the information side, privacy is always an irritant and restrictions are spoken of in sentences that oftentimes end with the words, "...the demise of our economic system in the free world as we know it." [laughter]

Proponents of this extreme position often point to a constitutional free-speech basis for their rationale, as if they were consumers, not businesses, being forced by government to change the content of their views.

At the other extreme - also using a constitutional underpinning of those privacy emanations flowing from the personal protections of the Fourth, Fifth and Ninth amendments - there are those who see no redeeming features from new information uses, as if the information services were being used by government specifically to invade their privacy. The typical word to describe new information services on this extreme is always "outrageous." Everything is outrageous.

We see ourselves in Equifax - and we hope that you see us - as operating in the "balance" area, realizing that some restrictions on information use are necessary and proper and that new restrictions may be appropriate, depending on the circumstances. [I will mention some of] what I think the key questions should be in defining those circumstances:

  • What's the impact of the particular type of information use upon the individual?

  • What's the sensitivity of the data being reported?

  • What ability does the individual have to receive easy access to information reported, and how is that information disseminated?

  • How strong are the measures to prevent unauthorized access and preserve confidentiality?

  • How's the information collected? Where did it come from? Has it been used before?

  • How important is it to the decision-process at hand?

These are the key issues that we think need to be woven into any privacy-information balance in a particular context. And I think, in terms of Professor Tribe's comments this morning, these are really for the most part, less issues of technology and more issues that go to the core values that we seek to protect, or that we might seek to protect.

It was in this spirit of concern and inquiry that we undertook our survey of consumer attitudes about information and privacy last year. As a strategic matter, we realized that information would continue to be our most important asset, and we knew that it would be crucial to make sure information practices are known and respected - not only by our customers but also by the public.

With the background of Fair Credit Reporting Act hearings and a number of articles - we've all seen that headline asking whether information systems had gotten out of hand - we decided it was vital to have an independent and objective survey of public opinion.

You know, everybody tells us how people feel. We all tell each other how people feel, what their opinions are. Everybody purports to speak for the consumer. ...

We wanted a credible survey with supportable data that would make a valuable contribution to the national dialogue about privacy; one that would move that dialogue toward a more reasonable or balanced discussion and away from the sound-bite headlines that oftentimes generate more heat than light. We wanted to move that bloc along the continuum.

The survey consisted of over 2,000 extensive interviews with consumers, giving a very high confidence factor in the results. And it had two major findings, as you know.
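The "very high confidence factor" Baker mentions can be made concrete with a standard sampling calculation. The sketch below is an editorial illustration, not part of the Harris survey's published methodology: it assumes a simple random sample of about 2,000 respondents and uses the worst-case proportion of 0.5 to estimate the 95-percent margin of error.

```python
import math

# Rough margin-of-error estimate for a survey of ~2,000 respondents.
# Assumptions (not stated in the transcript): simple random sampling,
# 95% confidence level, and worst-case proportion p = 0.5.
n = 2000     # approximate number of interviews
z = 1.96     # z-score for a 95% confidence level
p = 0.5      # worst-case proportion, which maximizes the margin

margin = z * math.sqrt(p * (1 - p) / n)
print(f"margin of error: about ±{margin * 100:.1f} percentage points")
# → margin of error: about ±2.2 percentage points
```

A margin near plus-or-minus two percentage points is what makes a finding like "four out of five people express strong or moderate fears" statistically meaningful rather than sampling noise.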

First, there is, in fact, a widespread concern about privacy. ... While there's not [been] much change since '83, the concern remains at a high plateau, with four out of five people expressing strong or moderate fears about threats to their personal privacy.

The second major finding shows the complexity of thinking about privacy. Americans are very pragmatic about the collection and use of information. They want to receive the benefits that information services provide.

As we looked at the findings, we could see that what people were saying is that they really wanted themselves to look at each particular context to decide what the proper privacy trade-off is. I think really, that's the major finding of the survey: that we have to be careful not to make sweeping statements about, or sweeping changes to, the current web of privacy protections and information practices.

Instead, we really ought to look painstakingly at the various specific situations and the possible threats, keeping in mind this strong degree of consumer pragmatism. In fact, a final conclusion by the Louis Harris organization is that new restrictions on the use of consumer information could be regarded by them as worse than any disadvantages from the current system.

Another important lesson that we learned from the survey is that we have to do a lot better job [of] explaining how and why information processes work the way they do. We particularly found this out in a direct-marketing context. When asked about direct marketing in a way that emphasized only the benefits to businesses using personal information, the question ending with the words, "...and they do this without your permission. Is that OK?" most people found, as you might suspect, the practice to be unacceptable.

But when we explained in a follow-up question how direct marketing works and why it works the way it does, two-thirds of the public found acceptable the use of personal information for direct-marketing purposes. And the acceptance jumped to almost 90 percent if people understood that no personal financial information was involved, and they had an opportunity to opt-out.

The lesson for us was made clear by Dr. Westin's analysis of the respondents, which really broke the population into three groups: the fundamentalists, the pragmatists and the unconcerned.

The pragmatists are in the majority, and when aligned with either of those two other groups they constitute a 75-percent majority. That tells us that it is critical that information practices be communicated fully to consumers, explaining the rationale, the consumer benefits and any consumer controls or choices, such as opt-out. Because in any particular context the pragmatists, who are the majority of American people, will accept practices if they understand them; [if] they know why they work the way they do; if they see a basic sense of fairness in them; and if they can see what's in it for them - if they have some kind of benefit that they perceive.

Well, where are we now? Hopefully we have moved the privacy issue toward more reasoned dialogue with some factual information. Our pledge has been to publish the findings, communicate them, engage in discussions and reviews. We're going to update some of the survey questions annually.

We've established an Office of Consumer Affairs in Equifax ... that looks at all of our services. We'll continue the practice of our privacy audits. Our objective is to take a leadership role to assure superior information standards. ...

At the same time, we've instituted a number of actions in coordination with Associated Credit Bureaus and the other two credit-reporting systems, TRW and TransUnion. We're working to provide better service to consumers: an easier disclosure process; a no-charge policy if the consumer has been declined, ever - even no charge if the consumer has been declined not based on our report but based on a report by the other two systems. And they are doing the same thing.

Mechanisms are being developed to deliver re-verified information to all three systems. A lot of task forces have been working to develop new forms and standards, task forces with credit grantors and credit-reporting agencies. A copy is given to the consumer at disclosure and after reverification.

We're certainly working on the "unauthorized access" issue, requiring an inspection of and a contract with every prospective customer. And we also need to do - and it's just money here - more intensive spot-checking and auditing of usage, especially by certain categories of customers where any abuses have been reported in the past.

In the marketing area, ... we have begun, in fact, over the past year to notify all those people we come in contact with about marketing practices and about their opportunity to opt-out.

...Let me [state] some objectives and then I'll finish. ... We're going to develop new disclosure procedures. We think we need to communicate to consumers - in addition to educating them - what their responsibilities are. You know, consumers who apply for a benefit should really be consistent in the use of their name, identification and address - which is important for maximum accuracy. [audience noises, some laughter] If you want accuracy, ... you're going to have to ask consumers to use information consistently. If you don't want accuracy, then you don't need to ask them that. OK?

We're viewing consumers more and more as our customers. We do think there's a delicate balance between information and privacy that has to be evaluated continuously in every context. And we're committed to looking at the human side of our relationship with consumers as well as with the technological side of moving forward with new services. So thank you very much. [applause]


HOFFMAN: Thank you. ... [We'll take] questions for both of these speakers before the debate.

GOLDMAN: ... I don't think this is really about information benefits, or the benefits of information services. If you just give people the opportunity to consent to the use of the information then you can sell them the benefits of the service. I can't quite figure out how that works. If you tell people that they can either opt-out or opt-in for the different services, then they can make a decision about whether or not they think that the service is valuable to them - instead of worrying about all this balancing and charts and graphs and matrices.

BAKER: Well, I think, Janlori, that you're talking about secondary use. ... My slide was really to try to portray a broader picture of privacy and information uses that was not a secondary-use issue. I do agree that secondary use is a special circumstance. It's going to be debated shortly, and I've got a comment on that later that I'd like to make, because I do feel that the wooden, absolutist kind of approach on secondary use should give way to some consideration of a number of issues, some of which I put in the "balance" part of that matrix or slide.

HOFFMAN: [A written question] from Eric Siegel from Privacy Journal. ... to John Baker: "Why did Equifax feel the need to develop a separate set of fair-information practices, and how does yours differ from that developed in the '70s?"

BAKER: We found that the Fair Credit Reporting Act, which is the main act that regulates most of our information activities, really did not go far enough and did not do the things that we felt we should do as a responsible information provider. The examples that I have given in my comments, or was going to give in my comments - that have to do with offering free credit reports to consumers, setting up [an] Office of Consumer Affairs across our corporation, conducting privacy audits, doing surveys of consumer attitudes; - those are things that, in addition to the statements that you have in the booklet about our fair-information practices, are necessary to go beyond what just the bare requirements of the law are.

HOFFMAN: ... Next question is to John Baker also: "Do people truly understand opt-out practices? Wouldn't an opt-in system protect individuals who would otherwise not know what information's being presented?" [loud applause]

BAKER: Yeah, people really don't understand opt-out fully. They understand it better than they used to. ... From our focus-group work, people really are pretty educated about how direct marketing works. They're very intelligent about it. But they don't ... always know where to write to opt-out, and they're not sure, and a lot of them don't want to opt-out of everything.

They want to opt-out of some things; they want to opt-in to other things. Just a pure opt-in, with no other use of information, is not economically feasible. [I could show you a chart of] how much it costs to do that, and you can see that direct marketing ... information would just fold. [moans and applause] I know that would break your heart.

One service that we did develop - and you may not like this either - that you've read about, is called "buyer's market," where a person fills out a questionnaire. They say what they want to opt-in to and what they want to opt-out of, and they get ... coupons and bonuses, which is kind of like the information bargain, back from those companies that they've indicated they have an interest in hearing from.

That may be a wave of the future. I'm not sure. But I do think that sending out a notice to every consumer, most of which will be thrown out - which is really kind of a wooden practice - will not result in very much of a response, because consumers respond at a rate of 1 to 2 percent on mailings, or 1 to 5 percent. So when you talk about opting-in, you really need to think about how the world works in terms of how you look at your mail and what you do with it when you get it.

HOFFMAN: ... One final question for Janlori: "In one of yesterday's tutorials, Paul Bernstein stated that the ACLU [American Civil Liberties Union] has failed to act in the interest of citizen privacy. In what areas do you see the ACLU activities most needing improvement?" This is from Harry Goodman.

GOLDMAN: I assume when you say the ACLU has failed to act, you mean everyone other than the Privacy and Technology Project. I would hope that that's true. It's a hard question. I could say that we could always do more, and that would be true. I would say that I would like to do more; that would be true. I would say that I wish we could hire ten people to work on this issue full-time. What we have to do - as any organization has to do but it's particularly true for us - is set priorities and decide each year, and every couple of years, and each Congress, what we're going to work on.

If the priorities that we've set are not the priorities that people think that we should be setting, join the ACLU. Write me a letter. ... To sum up, we could always do more..., certainly in the privacy area. The Privacy and Technology Project is primarily a public-policy project that looks at legislation and public-policy issues. In terms of litigation, which we don't do out of the project, we rely on our affiliates, which we have in most states around the country and in our national office in New York .... We're able to provide support to them. But they really run their own show in each state.


HOFFMAN: Let me now move on to the debate. ... We're going to have a debate, as stated in the program, on, "Should individuals have absolute control over secondary use of their personal information?" Or, to put it another way, "Resolved: No organization shall make secondary use of personal information without the individual's affirmative consent."

This debate tackles the issue of secondary sharing of personal information. It pits against each other two advocates with long experience in consumer-privacy issues.

Should a person have absolute control over the re-use of data submitted for one purpose, or should some exceptions be made to facilitate service delivery in fields such as medicine, credit and so forth ...

On the pro side, saying that no organization shall make secondary use of personal information without the individual's affirmative consent, we have Marc Rotenberg .... Marc is the Director of the Washington office of Computer Professionals for Social Responsibility (CPSR). He testifies frequently in Congress on issues involving computers and civil liberties. Marc was formerly counsel to the Subcommittee on Technology and Law of the Senate Judiciary Committee, and helped draft the recent federal laws on privacy and computer security. He was also an expert witness in the caller-ID proceedings in Maryland and the District of Columbia.

Arguing the other side of the issue will be Alan Westin. Alan is Professor of Public Law and Government at Columbia University. His 1967 book, Privacy and Freedom, is considered by most experts to be the leading scholarly work on privacy issues in the new high-technology society. Most recently, Alan was the academic advisor to the 1990 Louis Harris National Survey of Public Attitudes Toward Privacy, sponsored, as you just heard, by Equifax.

... Without further ado, let me turn the floor over to Marc Rotenberg.

ROTENBERG: We are all privacy fundamentalists about some aspects of our personal lives. It may not be all aspects. It may not even be most aspects. But it is our ability to assert a fundamental privacy right that protects us as individuals. Sixty years ago Justice Brandeis wrote that the makers of our Constitution undertook to secure conditions favorable for the pursuit of happiness. They recognized the significance of our spiritual nature, of our feelings and of our intellect. They knew that only part of the pain, pleasure and satisfaction of life are to be found in material things. They sought to protect Americans and their beliefs, their thoughts and their sensations.

Without the right of privacy, there could be no public life.

Without the opportunity to form smaller communities within the larger community, diversity would collapse and dissent would be crushed. It is the most fragile freedom, better defined not by the eloquent expressions associated with the First Amendment [but rather by] the quiet contemplation that precedes the articulation of personal belief.

When we enter public life, we make judgments about how to disclose personal information based on the choices that are presented. A credit agency may well be entitled to know our annual income before deciding to grant us credit. But what could be the basis for allowing the credit agency to make use of this information, provided for the purpose of obtaining credit, to develop a direct-marketing product without the person's permission? Does mere possession constitute ownership?

John Baker spoke of balancing, and rightfully appeals to our deeply rooted sense of justice, to resolve difficult problems through fairness. But there can be no balance between individuals and institutions when the currency of privacy is presumptively taken from us. We would all come to such an auction too poor to reclaim that which once was ours.

Some of us may freely consent to the secondary use, though many of us may object. But - don't all of us share a fundamental belief that we ought, at least, to be consulted about how personal information is used that we entrust to institutions? And, there is nothing in this statement that precludes technological development or discourages innovative business practices. It merely asks that we be given the right - to the extent that our names are sold or our personal information is disclosed to strangers - to decide whether we freely consent to that activity.

It is not sufficient to provide a notice and allow the consumer to opt-out. Such a measure places an onerous burden on consumers in the information age. We should not be required to tell companies that they may not do what they should not do.

Yes, personal information has commercial value and is used to promote business, both large and small. But this observation merely restates the question and does not begin to address how the commercial value should be allocated. Shouldn't the individual have an interest in the commercial value, and shouldn't the individual come to the bargaining table in full possession of his or her identity?

And yes, on balance, most of us are not that concerned about information privacy. Most of us list names and numbers in the phone book. Most of us do not object to the transfer of mailing lists from one company to another. But the right to privacy is not simply a ratification of a majority practice.

For, if liberty is to mean anything, it must be the recognition of claims that individuals have against larger communities and the practices of those larger communities. You may not need the protection that an unlisted number provides, or the assurance that information given for one purpose will not be used for another, but other people do.

We must fight to protect the right of privacy. Not simply those rights that we ourselves exercise, but those that we understand that others may need. If we fail to do this, our technologies may continue to evolve [and] our material needs may continue to be satisfied, but the character of political life, our public world and our private well-being will diminish greatly, and the promise of enrichment that progress offers will gradually subside. Protect privacy. Change the default. [applause]

WESTIN: I'd like to present four reasons why I do not agree with the proposition that was stated in the debate: that absolute individual control of secondary use is the principle around which we should design some new privacy principles for the '90s.

First of all, whether the individual should or shouldn't have absolute control seems to me to depend first on the nature of the secondary use. If we're evaluating people to deny them or give them access to rights and benefits and opportunities, then every bit of privacy, due process and equality safeguards ought to be attached to any secondary use. That means the secondary use in that case has to be governed by a full array of privacy principles.

But in a wide variety of areas there are non-evaluative uses that are made in a secondary application of information. For example, much social research, business research and organizational-effectiveness research depends on making secondary use of information given for an original purpose - even with identifiers on it, for longitudinal research that has to continue over time. Because in that case no evaluation is being made of the individual, it seems to me that the individual, in the exact privacy sense, is not being affected or diminished by that kind of research application.

So I think first of all we have to look at what kind of secondary use we're talking about, and not treat secondary use as a hobgoblin generic term.

Secondly, whether the individual should control or not seems to me to depend on what is the fair use of the information in each one of the organizational sectors in which we all participate with information: health, education, Social Security, insurance, employment and so forth. Each of these has developed over decades - if not centuries - notions of what are fair and legitimate uses within that zone. We live by those.

Most of us are very comfortable with them. When they're not right, society should change them. But the notion that, in advance, you're going to require every person and organization to specify every future potential use within that zone of information service and exchange, seems to me not only impractical, but a mindless and numbing bureaucratic enterprise that would make life in organized society extraordinarily difficult.

What's critical, in other words, is that society decides what are the fair uses. Not some kind of individual, nihilistic veto that controls the activity inside each organizational domain.

Third, I think individuals cannot control secondary uses that involve those compulsory disclosures of information that are vital to social and democratic processes and at the heart of the First Amendment - a very important part of the Constitution - that complements, in any good privacy theory, freedom of information and media access and disclosure of information needed for social, economic and political decision-making. [Such information is] what individuals must, as public information about themselves, see disclosed, whether they're comfortable with it or not.

Now this is a tricky area, and I want to be very clear I'm not saying that I'm comfortable with the way in which public-record information is currently made available under First Amendment or freedom-of-information concepts. I think we've gone too far in allowing information that we give for drivers licenses or land records or other kinds of uses to be used, without our knowledge and consent, for purposes that many people find not acceptable.

I believe we will, in the next few years, write some standards for access to public-record information that will try to respect individual choice. But I suggest to you that the whole domain of public information about things that are of concern to the whole society is an absolute underpinning of a First Amendment-based constitutional democracy.

Finally and fourth, it seems to me that whether secondary use should be permitted or not without the individual veto depends a great deal on whether the secondary user can and actually does provide adequate control over potential misuse of that personal data. In many cases I feel that what's been involved in the direct-marketing debate, for example, is a public concern that the information is not adequately protected against potential misusers, or that it is going to be reached by people with advanced data-security techniques defeating rather weak techniques of data encryption or security.

Let me close by suggesting that there's a delicate line between creative anticipation of technological threats and strong privacy advocacy on the one hand and a kind of galloping paranoia and elitist social engineering on the other hand. I try very hard to walk that line. I hope you all do, too. [applause]

HOFFMAN: Marc, three minutes for rebuttal.

ROTENBERG: Professor Westin's opening statement makes good sense from the social-research perspective. It describes accurately what currently takes place, and suggests to us what may well lie ahead. But it tells us very little about the underlying interests at stake - about the right of privacy, what it means and how it is to be protected. And if there's a critical flaw in the analysis, it must surely be that, if there is a right, here, it is a right possessed by individuals against the society. ... It is a right shared by all individuals within the society - to protect their liberty and to protect their freedom. It cannot be based upon a social determination about what is pragmatic, about what is generally done. That is not a recipe for liberty; it is a recipe for a different kind of society. [applause]

Professor Westin suggests that there are First Amendment issues at stake. There are indeed First Amendment issues at stake whenever the government attempts to restrict the publication of information. But what First Amendment issue is at stake in the wrongful appropriation or misuse of personal information? Is a First Amendment issue at stake if someone chooses to use and wrongfully publish someone else's proprietary information? Someone else's business plan? Can the person who steals this information hide behind the First Amendment and point to the mere fact of publication and not the type of acquisition?

Finally, social engineering, technological paranoia and what are we to make of the future? That's not what this discussion is about. There is not a problem here with the technology. Indeed it was Ithiel de Sola Pool who said in Technologies of Freedom, "Technology provides us with opportunity. It is for us to decide what opportunities to make."

Nor is it the situation that I put forward a recommendation that would suggest the government get in the midst of every information stream in this country. It is precisely, in fact, the opposite point.

To the extent that commercial transactions take place with personal information, give individuals the right to control with other parties the value of that information. The marketplace can work out those negotiations. And it may well be, as Janlori Goldman suggested earlier, that we will need to establish a baseline to protect individuals against unconscionable transactions that they should not enter into. But let's take it as a starting point, that these negotiations are between private parties, fully protected by the liberty - the right - of privacy. [applause]

WESTIN: An interesting part of any debate is to decide for yourself exactly where the proponents disagree. In that interest, let me select one example and give you my sense of how the approach I would take would be applied - and you can contrast it and see some similarities, I'm sure, between [it and] what Marc said.

My own sense is that, by the year 2000, we will use only consensual databases for marketing, and the consumer will be compensated for the use of their information. I think we've left the era in which consumer profiles could be gathered by direct marketers as a free good, like the once-free air and free water that economists wrote about. That's because, as Janlori mentioned, I do believe that there is a changing nature of the information bargain in the society. And it's that we as consumers have very valuable information - our personal attributes and transactional characteristics. We should be able to be compensated for those in a real way, and we should decide how those are shared. So in an objective sense, or as a final objective, I don't really sense that ... there's that much difference between what the goal would be as I've stated it, and perhaps the way Marc stated it.

The difference, I think, is how we get there. I sense in Marc's approach a desire for immediate regulatory intervention that would force the business community and users of information to adhere to the opt-in principle - affirmative consent before any personal information can be used. My own sense is that there is already underway a proper process by which the business community is beginning to seek - through direct questioning with the consumer - the kind of opting-out and opting-in mechanisms that make sense in particular fields.

Many organizations that we have direct contact with, for example, ask us whether we will allow them to use our information for magazine subscriptions or other additional uses, or for organizational affiliations, and we can decide whether we agree or disagree.

The problem is that there are many important bodies of consumer information in which there is not a direct relationship between the holder of that information and the consumer. The credit bureau is the perfect example. It does not have a direct relationship with the consumer. The consumer has a direct relationship with a credit grantor who reports the information. That's a complicated situation, and my view is that we'll probably develop some concept of joint ownership or shared ownership that will allow individuals to be contacted directly by the credit bureau and to achieve a process of agreement in the various uses of information.

For me, the key issue is, "How do we get from a point in 1991 to what I think will be a very different organization of information use in the future?"

As a privacy advocate, I want to see us get there in a way that doesn't, in an extraordinary way, disrupt my interest as a consumer in access to information or my interest as the citizen in the full participation in a First Amendment-type of society. [applause]


HOFFMAN: ... [A question] first to Alan Westin and then Marc can respond. ... Alan, you mentioned you saw this process evolving. You said quite properly we were getting to the point where, in the words of one of our questioners out in the audience (Ken Applegate from the L.A. County Municipal Courts) we're basically setting up effectively a market-type royalty system, perhaps.

Now if that's the case, let me put out a straw man here ... - that the technology is outrunning this. By the time - I'll intentionally be overdramatic - by the time you lawyers get anything done in this and sort out all the nuances, the technology will have outrun you and the ball game will be over, and there will be no privacy left. What do you say to this?

WESTIN: I guess as a non-technologist I have more faith in the technology than you do. [laughter]

It seems to me that the technology is superb at treating different people differently, and allowing options for people to either agree to various choices or not to agree to various choices. We can audit systems of large databases so that we make sure - at that end, at least, the high-end of mainframe systems and large databases - that we apply the "responsible keeper" kinds of controls that allow society's rules to be administered. I would tremble for privacy in a clerk-based society in which the answer to every effort to protect privacy would be, "But we can't do it," "It would take too long," "It would cost too much."

The interesting thing about information technology is that it's a far more powerful instrument for carrying out the social policies that we choose, and have the will to impose, than the manual society was. It comes back to the point Marc made. How we use the technology is really up to us. It's not determining these kinds of issues.

ROTENBERG: Fifteen years ago the United States was at a high-water mark of privacy protection. We passed the Privacy Act [1974], and many countries across the world looked to us for guidance and inspiration. For 15 years we've turned our back, and we live with the consequences today.

We have a serious problem, no doubt. But it is a problem that can be solved through a joint effort of policymakers [and] technologists who share a fundamental commitment to protect privacy. I have no doubt that the problem can be solved. But it can only be solved once we make the commitment to address it.

HOFFMAN: [To stay on schedule]... I am going to first give Marc and then Alan their final two-minute statements. Marc, you first.

ROTENBERG: This is a lot of up and down for a debate. I agree with Alan, that we need to be looking ahead for how to protect privacy in the next decade. But I disagree with Alan about the urgency of individuals asserting their personal interest in protecting the right of privacy.

If there was one lesson that was learned from the [Lotus] Marketplace episode, it was that without individuals putting forward their concerns, there would have been no response. It was not a decision made in Washington. It was not a decision made by a single organization. It was a decision made by a group of people who looked at information - some of it conflicting, some of it corroborating - and reached a determination. It is a description about how a democratic society should operate.

We are going to confront many more debates in the years ahead like Lotus Marketplace. We should do it with the hope that we can promote new technology; that we can foster business opportunity. But we should never lose sight of the underlying political interest, which is the right of privacy which protects us all as individuals. [applause]

WESTIN: When the Equifax survey was done ... the breakdown of the public ... was drawn from taking the answers to five diverse questions and dividing the American population into those three categories. This produced what was called the "privacy dynamic," that is, a sense of how the public will make up its mind. And the centerpiece of that was that the public will look to see whether its conception of what is fair-information practices or a proper balance between privacy and disclosure or privacy and protective surveillance in the society is what they see.

My feeling is that the process we're going through now - a conference like this and the media attention and the interest-group attention - is extraordinarily healthy. It's going to raise these issues for the public and ultimately, I think, it's going to be a public choice.

All of us can say to the public what we think the right values and the right technology protections are, but basically this remains a democratic society and the decision is going to come from a public which truly does want the advantages of a knowledge society, of wide consumer choice, of the ability to customize products, services, of government and business to respond to individual difference and individual choice.

Individualism cuts both ways here. If we want the advantages of tailoring products, services and societal values and distribution to the individual, we need knowledge. If, on the other hand, we don't protect the individual against the power imbalances that too much knowledge to organizations or to government provide, we also lose a vital part of individual freedom.

That delicate balance, it seems to me, is what we're talking about here, and I return to the theme - that the essence is to keep this balance between the paranoia on the one hand and the devoted intelligence to pursue privacy on the other. Delicate though it is, it seems to me it's real. After we get done with a certain amount of ventilation of anger and hostility to technology or the system or marketing and so forth, I have great confidence that everybody here and the public will make the right kind of choices. [applause]

GOLDMAN: ... [commenting on her one-minute time limit] When I talked about control, this is not what I had in mind.

The first thing I wanted to mention about Alan Westin's comment is that when he talked about social research I wanted to point up what happened with the 1990 Census. Here's a situation where there's a tremendous collection of personal, sensitive information, but it's not personally identifiable. It goes into a database. You can do all the research you want with it. You're not supposed to be able to personally identify people in that database. But a very large percentage of people didn't believe that was true and didn't want to answer the census - probably the highest non-return rate that they've seen.

The other question is about misuse. Let's just let [information] out once and let's make sure that after we make this initial secondary disclosure we don't let it happen again because there may be misuse.

I'm going to be speaking in a couple of months at a conference on genetic testing. Here's a situation where, in the process of receiving a job [or] getting insurance, you're going to have to submit to a genetic test that can chart your tendency to develop certain kinds of illnesses or diseases in the future. What are we talking about when we're talking about potential for misuse with that kind of information?

The other issue is that, if in the year 2000 - which is very soon - we are going to have these consensual databases where people are going to be receiving a profit, it doesn't seem that far away from now. I'm not exactly sure how we're going to get from here to there, but I would certainly hope that we make that retroactive. [applause and some laughter]

HOFFMAN: I want her as my negotiator.

GOLDMAN: There's one question that you handed me. ... Somebody asked about the ACLU being able to accept contributions directly to the ACLU Privacy and Technology Project. ... [Yes], we can do that.

HOFFMAN: OK. John Baker.

BAKER: First I have to say that I'm very impressed with the debate. I do think that a prohibition of secondary use should not be the 28th Amendment, assuming we have a 27th.

Second, although the 27th frankly did sound good, ... the secondary or compatible-use principle is the only one of the 1973 Privacy Principles that's been troubling, and that's because it's stated in such a meat-cleaver way. It seems to me we're trying to use secondary use, particularly in this discussion, to prevent direct marketing. Maybe that's OK, but I think we need a lot better understanding of how consumers feel about that, how the process works and why it's important to them - and what happens if you do that.

The notion of consent, I think, is too often a red herring. It involves a wooden form of notice. I think different types of secondary use require different types of notices, different types of consent mechanisms.

The issues that I would propose need to be used to decide how much consent is appropriate come down to what I talked about before, [which] is the sensitivity of the data. Did it really come from the consumer? Has it been used before? Was it in the public domain already? How critical is it to the decision process? And how critical is it to the consumer? How is the consumer impacted by the use of that information? We look forward to participating in that dialogue in the future. [applause]
