CFP'92 - Computers in the Workplace : Elysium or Panoptical?

Friday, March 20, 1992

Chair: Alan F. Westin, Columbia University

Panel: Gary Marx, MIT
Kristina Zahorik, U.S. Senate Labor and Human Resources Subcommittee on Employment and Productivity
Willis H. Ware, RAND Corporation

PETER DENNING: I'd like now to introduce Alan Westin, who is the chair of this session and is going to tell you how the session is organized.

WESTIN: Our session deals next with computers in the workplace. As the chair, I want to present some historical and sociopolitical observations about what I think is the profound transformation taking place in the American workplace, and therefore how perilous a moment this is: some major opportunities for applying technology are arriving just when the workplace is in such a state of transformation and change. After that, as a way of opening up the topic, we'll have three presentations, very nicely divided between, first, a sociologist; second, a legislative analyst with a regulatory thrust; and finally, a computer scientist's and technologist's approach to the issues. So if we are trying to identify the proverbial large elephant, we are coming at it from a variety of disciplines to try to explore the parts and see if we can put the whole together.

I think that it is very important to understand that when information technology enters an area of social activity, the best fit is when the rules and the norms in that area are well-defined, publicly understood, and in which there are very crystallized values about standards and processes. When that happens, information technology can come in. It can be finely tuned to those prevailing values and norms, and there can be the most comfortable fit between technology and societal interests. On the other hand, when you have an area that is undergoing profound change, conflict of values, and tensions in terms of competing interests, it becomes a dangerous time for technological applications, because either the technological applications will be used to enforce the older standards that are in disarray and breaking up, or they may be used to prefer one or another of the competing interests before there has been a social consensus on just how that area ought to perform in American society. And I'd like to suggest to you that that's exactly the situation in which we find ourselves as we think about computers in the workplace in the next decade.

It helps to understand what I mean by the transformation of the workplace and how this is a very perilous setting for technological applications. Anyone familiar with American employment history and law knows that we have traversed at least two eras to get to the 1990s. First, from about the 1890s until the 1950s, we managed what could be called the era of employer prerogative in American law and practice. Employers were free to hire, administer, and fire at will. They could set any standards they wished in terms of selecting people for employment, and employ any tests they wished to verify or to gauge people's ideas, orientations, lifestyles, or fitness for work as the employer defined it. On the job, there was no concept of any limitation on the employer when it came to watching, controlling, or keeping records on employees. Obviously, this was before the era in which any kind of equality standards defined opportunities for equal treatment on the part of diverse groups in the population. Employment, the law said, was at will: because the employee was free to quit whenever the employee wanted to, the employer was free to fire whenever the employer wanted to. It was an axiom of American employment law that courts would not inquire into the reason why a person was discharged from private employment. The employer could do it for good reason, bad reason, or no reason at all, as the famous court construction of employment at will put it. And this was the system that, by and large, prevailed with only minor modifications as far as government employment was concerned, throughout this long 60-year era of the 1890s into the 1950s.

The second era that we passed through is the shift into what I would call the sociomediated concept of employment, where, from a number of critical public policy vantage points, American society decided to limit the total employer prerogative concept. Equal employment opportunity standards were written, guaranteeing access to jobs on a non-discriminatory basis for a steadily enlarging set of protected categories: race, religion, sex, nationality, disability or handicap, etc. In addition, statutes were passed in this era of the 1960s into the early 1980s giving employees, in most states, the right to see what was in their personnel records and to have access to the record-keeping function. The Fair Credit Reporting Act, enacted in 1970 and put into effect in 1971, provided some rights for individuals to know if a report had been drawn on them, to see what information had been used by an employer not to hire them, and to have various rights of challenge and contest.

We also began to define protection for whistle blowers if they reported what they thought was illegal behavior on the part of their employer -- protection against being fired or disciplined for trying to adhere to government regulations or to laws governing the conduct of the employment relationship. As a result of these kinds of new public policies limiting the classic employer prerogative, most American employers went out of the business of being as intrusive and as discriminatory as they had been in the first epoch of employer prerogative. If you looked at the practices of most large private corporations, and the practices, obviously, of government agencies, what you found was that in the period of the 1960s until the mid-1980s, a lot of what had been the standard privacy-intrusive kind of activity was discarded, either as not being necessary or functional for getting employees to pursue productive work relationships, or as not the kind of thing that the public was comfortable with. So the great majority of employers in this period retreated or walked away from what had been the most intrusive or the most controlling practices of the earlier era.

Something happened in the late 1980s, and it's this transformation that has to be seen as the dominant thrust of the 1990s. For a number of reasons, employers have been invited back -- and many have been forced, or have joyfully moved -- into a new intervention in the private lives and activities of their employees. The first reason is the threat of legal liability. There are lawsuits now that hold an employer liable for negligent hiring -- that is, for failing to investigate and ascertain the criminal history record of the person applying for the job, especially where there is a criminal history of some kind of violent behavior or sex-related crimes that could affect the employee's relationships with the public, customers, or fellow employees.

The government made a major thrust on employers to get into the drug testing business in the late 1980s. Public concern about drugs in the workplace, crime, and losses as a result of drug abuse in the workplace pushed large numbers of employers into adopting drug testing programs, which enjoyed very high public opinion support. Whenever the public is asked whether they approve, or do not approve, of drug testing at the workplace for people whose jobs affect public safety, public attitudes are overwhelmingly in favor of drug testing programs.

Health care costs have become so major at workplaces that there is tremendous employer concern to control those kinds of costs, since in a global competitive world, astronomical health costs can put a tax on American competitiveness. This is of direct concern to the survival and viability of many enterprises. As a result, since we are not a society that seems to be willing to enact national health insurance, employers are being driven to try to select applicants and to administer their work forces in ways that will, in a major sense, control their expenditures on health utilization, disabilities, worker's compensation, and other health-related costs. These and other kinds of trends that I could mention are driving employers back into a desire to learn more about job applicants, to control behavior on and off the job once people are hired, and to seek sources of information about their employees in ways that, while not quite going back to the earlier era of employer prerogative, resemble it in terms of employer thrust in favor of the collection and utilization of very sensitive and personal information.

I think there are four areas in terms of technological application and this changing workplace where we are going to be seeing enormous privacy conflicts played out in the 1990s. In mentioning these four, I think it's also important to note that they're unfolding at a time when we are reshaping the very nature of work and authority relationships in our society. It's clear that the traditional authoritarian structures of work in America have broken down. We are now, virtually as a whole society, committed to experimentation and decentralized work structures in new employer/employee relationships with cooperative participation and not rigid authoritarianism. We are also concerned about new relationships between work and family, and dealing with harmful stress at workplaces and its effect on the physical and mental health of employees in society, etc. The very structures and relationships of the workplace are undergoing dramatic change, wholly apart from these privacy issues, but it's in those settings in which the privacy issues unfold.

What are the kinds of issues that I think are central and ones which many of our speakers today will be discussing? First is applicant selection itself. There is strong public support in all the surveys for verifying job-relevant information when applicants give their qualifications for jobs. There are now many, many database-oriented kinds of files which can be consulted to verify information. In the new amendments to the Fair Credit Reporting Act that are presently being developed in Congress, there will be important new rights for job applicants to be informed in advance that verification of their qualifications and of employment background will be done. If the Act is passed, they will be entitled to receive a copy of that report and respond to it before adverse action is taken by the employer. This is a very important addition to due process, if not privacy rights, on the part of applicants for employment, long overdue. I think that it is something that industry and consumer advocates and privacy advocates see as an important addition to the employee rights dimension.

There is strong opposition in all the survey data to employers looking into personal lifestyles and off-the-job civic and political activities in making job selection decisions. For example, about 18 states in the last several years have passed statutes that forbid employers to make off-the-job smoking or the use of other legal substances such as alcohol a criterion for denying employment. I think that reflects a strong sense that the public is not ready to make public-health-oriented kinds of concerns a basis for employers excluding people from the workplace. That's very different from forbidding smoking at the workplace, but it deals with whether off-the-job smoking, and testing people for off-the-job smoking, can be made a criterion for applicant selection.

Just one other point about applicant selection. The statute that comes into effect July 26 of this year -- the Americans with Disabilities Act -- is probably the most sweeping privacy protection (though not called that) that we've had at workplaces in the past several decades. It will require all employers covered by the statute to define the essential functions of every job and to prove that they have selected people only on the basis of those essential functions. The ADA also forbids the giving of any medical test unless it is given to all applicants equally, and that has a very important effect in preventing the selective application of medical or mental tests as an employer practice.

I think the most dramatic issue, looking into the future in terms of selecting applicants, is the potential for genetic screening of job applicants. Recent surveys done by Gallup show that about 75 to 90% of the American public oppose employers using genetic screening to find out what potential health hazards job applicants carry. That's a cheery note for me on which to begin the decade, but I think there will be many efforts to portray the use of genetic screening as something that is necessary for employers to control health costs and to have the right information with which to place employees even if they do hire them. I'm not at all confident that that kind of concerted public opinion at the early end of the curve about genetic testing will hold. I think it's going to take a lot of vigilance to make sure that as the Human Genome Project develops, and as all the reasons why genetic screening is an important tool for employers, for productivity, etc. are stated, we hold to that kind of limitation. My own feeling is that this could easily produce a two-class society: people who genetically seem to be mostly healthy, and people relegated to substandard employment because they would be seen as health risks or as high-cost employees. So that is one major cluster of issues, dealing with applicant selection.

Second is administering work activity. Here one of the central issues, which will be discussed later, deals with the new work machine -- the video display terminal and related office systems technology. The issue is what is and is not appropriate monitoring of work, especially of clerical workers in data entry, or customer service operators using terminals and telephones. It seems to me that we have a sharp collision of legitimate interests here. Employers need to be sure that their customer service operators are courteous; that they follow proper business rules in handling customers; that they comply with legal rules as to how they treat complaints; and that they provide services properly where there is regulation, as there is for many, many types of telephone transactions. In the old days, when you counted widgets, employers could supervise people physically.

In addition, consumers want to be treated courteously. They do not want to be rudely and improperly treated; therefore, any business that wants to survive in a service-oriented environment is going to have to be very concerned about the quality of service that is delivered through the telecommunications and database-oriented interface with the consumer.

On the other hand, there can be such an oppressive kind of electronic sweatshop monitoring done on video display terminal systems that we create a highly stressful and intrusive means of employer surveillance. Since 85% of workplaces in the United States are nonunion, it's not typical to have a labor-management environment for resolution of this issue. So it will either have to be a market-based decision about what jobs employees will or will not accept -- a very slender reed in an era of heavy unemployment and anxiety about job security -- or we're going to have to have some kind of regulatory or rulemaking process that deals with those employers who are abusing the capacity to monitor, and makes sure employers use that capacity in an intelligent way. I think we'll be working on that kind of balance throughout the 1990s.

A third area that I see developing is the whole area of new voice and data communications at work. E-mail systems pose a fascinating set of problems. This is a system paid for by the employer, which is supposed to be used for work and not personal reasons. It is an instrument supplied for accomplishing the work of the employer. On the other hand, it is an instrument that people use for a variety of expressions, not always loyal to the employer, or deferential in the way that they converse about what is going on in the workplace. It's not entirely clear what kind of supervisory role the employer ought to have over E-mail systems. Clearly, if crimes are being committed over the E-mail system -- such as distributing drugs -- there is a probable-cause type of reason for the employer, on a warrant-like, limited basis, to monitor use of the E-mail system in order to deal with illegal activity. On the other hand, I think that any employer who monitors the content of E-mail in order to find out who is loyal to which faction of management, or to try to control people's freedom of expression inside the work relationship, is perverting a very important new means of communication.

Let me give a second example of the kind of problems that we will be wrestling with. Many of you are familiar with something called call detail accounting -- the capacity of the carrier to give the employer a list of every telephone number that is called from an office telephone extension. As a means of controlling improper calls, it is a tool that MCI, for example, has been touting with ads that show a very accounting-oriented employer. The big slug says, "We will tell you about every call -- every single call made from your premises." Well, that's a very dangerous kind of tool for employers to have, if they can trace every call that every employee makes from the telephones on the premises. European data protection approaches have been, I think, very creative here. They have suggested that we ought to suppress the last four digits of that call detail accounting, so that all you get is the exchange, but not the actual telephone number called. There are many ways we can attempt to set the rules by which a technology that provides for cost containment and for analysis of telecommunications use can itself be kept within proper bounds.
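[The digit-suppression idea Westin describes can be sketched in a few lines of Python. This fragment is purely illustrative -- the function name and the number format are assumptions of this sketch, not anything from the panel.]

```python
def mask_number(number: str, masked_digits: int = 4) -> str:
    """Suppress the trailing digits of a dialed number, keeping only the
    leading digits (area code and exchange), in the spirit of the European
    call-detail-accounting approach described above."""
    digits = [c for c in number if c.isdigit()]
    keep = max(len(digits) - masked_digits, 0)
    masked = digits[:keep] + ["X"] * (len(digits) - keep)
    # Re-insert the masked digits into the original formatting
    out, i = [], 0
    for c in number:
        if c.isdigit():
            out.append(masked[i])
            i += 1
        else:
            out.append(c)
    return "".join(out)

print(mask_number("212-555-1234"))  # -> 212-555-XXXX
```

[The employer still gets enough for cost analysis -- which exchange, how many calls, how long -- but not the identity of the party called, which is exactly the balance Westin is pointing at.]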

Fourth and finally, let me give an example of an emerging application of technology that I think poses some very powerful issues. About two or three dozen companies that I am aware of are building something called health surveillance data banks at the workplace. They put in exposure data as to where employees are working and what chemicals and substances they are exposed to; health utilization data on health benefit programs, worker's compensation, and disability claims by the employee; personnel information on performance appraisals, absenteeism, etc.; and finally, epidemiological data about the effects of these substances in medical studies. By putting these components together, they have what is seen as a very important new tool to enhance the health and safety of employees at the workplace. It could be used in very positive ways, to reduce health risks at workplaces that have dangerous substances and processes. On the other hand, it can be a way of pushing out employees who exhibit adverse health effects; it can intrude into many intimate aspects of life, as medical data is collected from medical practitioners, or from employees who are forced to produce the data for the company. And it provides a very attractive database, again, for third-party users in government and elsewhere who will at various times seek access to these medical records for a variety of civil and criminal purposes. So we have an example of a potentially important breakthrough in health protection in the workplace, but one that also poses, classically, many hazards to the balance between private and public.
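[As a rough illustration of why such a data bank is both powerful and sensitive, here is a hypothetical Python sketch of the record linkage Westin describes. Every class name, field, and substance here is invented for illustration; no actual company system is being depicted.]

```python
from dataclasses import dataclass, field

# Hypothetical schema for a workplace health surveillance data bank,
# joining the four components described above in one record per employee.
@dataclass
class EmployeeHealthRecord:
    employee_id: str
    exposures: list = field(default_factory=list)   # substances at the work site
    claims: list = field(default_factory=list)      # benefits, comp, disability claims
    personnel: dict = field(default_factory=dict)   # appraisals, absenteeism, etc.

def elevated_risk(record: EmployeeHealthRecord, epidemiology: dict) -> list:
    """Cross-reference workplace exposures against substances with
    documented adverse effects in the epidemiological literature."""
    return [s for s in record.exposures if s in epidemiology]

rec = EmployeeHealthRecord("E-1001", exposures=["benzene", "toner dust"])
print(elevated_risk(rec, {"benzene": "associated adverse effects"}))  # -> ['benzene']
```

[The same join that supports prevention supports exclusion: the query above is one step away from producing a list of "high-cost" employees, which is precisely the hazard Westin flags.]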

Let me close this overview by suggesting that we have clearly passed the era in which it is correct or right to say that there is no right to privacy for employees in the workplace. Courts, legislators, and social commentators agree that there must be reasonable expectations of privacy even on the employer's premises, even in the record systems that the employer generates, even when the employer has the responsibility to prevent crime and misconduct in the workplaces. I think that it will be the work of this decade to apply the definition of reasonable expectation of privacy in ways that will allow the new tools of technology to be picked up and enhanced, but without the kind of deep and powerful threats to these emerging employee privacy rights that I think are so central to the way in which we are all going to want to live our lives at work in this century.

The first of our commentators is going to be Gary Marx, a professor of Sociology at MIT who is particularly interested in the tools for surveillance and the capacity for control using new technological tools in a variety of places, workplaces being one of them. Gary? (applause)

MARX: Thank you, Alan, for that balanced, informative and optimistic point of view. I might add, consistent with our colleagues from yesterday, that I don't speak for MIT, I sometimes speak in spite of MIT. It has even been said that MIT stands for "Marx Isn't There." (laughter) Those of you who know MIT know that that refers to Karl, as well as Groucho.

Alan asked me to indicate first what it is that's new (Oh, that's terrific -- the monitor is working for me, it wasn't on for him. That's OK, we all know about authority, hierarchy, and power relations! laughter) In the ten minutes that I was allotted, I wanted to do five things and I'll try not to talk too fast. I'll try not to talk more than ten minutes, maybe I'll take twelve.

Anyway, I first want to set work monitoring in a broader kind of context. It doesn't really stand alone; it's a part of deep-lying changes that are altering the boundaries between the public and the private, and between the self and society. I want to indicate why legislative action is particularly important in the United States, given the relative weakness of labor unions. I want to mention some technical fallacies that characterize the world view of those who advocate unrestrained monitoring. Then I want to suggest some broad principles that I think ought to underlie the kind of judicious and decent policies that Professor Westin suggested. Finally, I have a policy proposal. The summary of this is contained in the materials that you were given. I am also drawing from a lot of other material that I would be glad to give out or send to anyone.

To locate this, first of all, there are broader changes that we are all aware of, whether video or audio surveillance, electronic location monitoring, satellite surveillance, dossiers, night vision technology, or drug testing. I have a list about 25 pages long. I call these extractive technologies. I think they have unique elements, and I think they in some ways parallel what you see in a maximum security prison, and that the techniques of the prison are diffusing into the broader society. I ask whether we are not becoming a maximum security society. Such a society is transparent and porous; information leakage is rampant. The barriers and boundaries -- be they distance, darkness, time, walls, windows, even skin -- that have been fundamental to our conceptions of liberty, privacy, and individuality are hemorrhaging. Actions, thoughts, feelings, pasts, even futures are increasingly visible. The line between the public and private is weakened; observation seems constant; more goes on permanent record, whether we will this or not, whether we know it or not; different kinds of data are merged; and so on.

A few people can monitor a great many. People become partners in their own monitoring, which is perverse, but at the same time democratic. Surveillance systems are increasingly triggered when a person uses the telephone or computer, enters or leaves a controlled area, or takes a magnetically-marked item through a checkpoint. There's a focus on engineering behavior, prevention, or soft control. In the case of work monitoring specifically, certain attributes alter the social contract: the monitoring can be done remotely; it is invisible; it is potentially omnipresent rather than episodic; it covers more areas; its product can be stored, accessed, and analyzed; and it is done increasingly and disproportionately by machines. It isn't personal, it isn't place-specific, and, interestingly, it represents a kind of democratization of surveillance. It goes across occupations -- it isn't just clerical personnel, but lawyers, architects, and people in university settings. Those with small minds who think they're scientific do quantitative searches of databases to see how many references there are to a particular person as an assessment of that person's broad contributions.

I think there is a danger of an almost unseen surveillance creed in which we unreflectively back into a cowardly new world. In this world, deceptively easy technical solutions are offered for tough social and political problems. This reality was brought home to me when I wrote a satirical newspaper article, which is also contained in the material given to you. I published it on April Fools' Day, April 1, and it described an imaginary restroom trip policy. I wrote it in the bureaucratic jargon of a company memo. The policy gave workers a weekly restroom trip credit, RTC, of forty trips. Access was controlled by a computer-linked voice print recognition system. The stalls had timed tissue roll retraction, flushing, and door opening capabilities, which were automatically activated after three and a half minutes. (laughter) There was capability for automatic urinalysis to permit drug testing without the demeaning presence of an observer. I wrote the article as an extreme exaggeration of trends that I found disturbing, such as US companies that electronically counted time in the restroom and gave employees demerits when the established time limit was exceeded -- three demerits were grounds for dismissal. Imagine my surprise when I learned that there is a Japanese company that markets a toilet stall that can, in fact, automatically test for drugs in urine. And in Europe, there are some toilet doors that do automatically open after a lapsed period of time. I was even more surprised -- and I realized how rapidly our culture has changed and how weakened our expectations regarding privacy and technology are -- when perhaps half of the readers were so conditioned by contemporary electronic intrusions that they thought the memo was genuine. We have come so far, so fast, that people were ready to accept this outlandish, imaginary example as real, given the kinds of pressures and needs that Professor Westin referred to.
I also was asked by a number of people -- and it disturbed me -- where the company was, and some companies wrote to ask where they could purchase the monitoring equipment. (laughter)

As a true conservative and in the spirit of Alexis de Tocqueville, I prefer to see social order emerge out of a balance of interests among strong associations, accountable to government and to each other, rather than through legislation. Unfortunately, in the United States, for historical reasons, the power of labor is relatively weak and it is growing weaker. I think we really need laws to compensate for the weaknesses of the social structure. It's instructive to contrast Europe, and particularly Germany, where there are strong labor and employer associations -- where what happens in industry can happen only as a result of negotiation between labor and management. We have that to a much lesser extent here.

Let me turn quickly to some of the technofallacies. What I want to deal with here is cultural beliefs. I must say in listening to the stimulating discussions in the last couple of days, I often find my mind wandering over the kind of details presented by academics, at least social science academics. I think they try to get too much involved in the policy game, and they are maybe in the wrong game. Academics, social scientists, have a noble contribution to make. It's OK to deal with squishy soft things like cultural beliefs or behavior. We don't have to be lawyers or economists. I'm explicitly playing that role in talking about cultural beliefs.

I have identified a number of what I call tarnished silver bullet technofallacies of the information age. Before we blithely adopt technical innovations, we have to think about the broader cultural climate, the rationale of action, and the empirical and value assumptions on which these are based. There is always a web of passive assumptions that undergird our actions, and we have to tease these out. I won't list them, but these are all quotes:

  • "Turn the technologies loose and let the benefits flow."

  • "Monitoring is for the worker's own good."

  • "Do away with human interface."

Turning to the case of work monitoring, I'll just briefly mention six fallacies. These are partly my values and partly my summary of the social science research literature -- if the two get entangled a little bit, I apologize for that.

First is the fallacy of assuming that technology is only a means of increasing productivity and profits and improving service. It can also be a means to enhance job satisfaction for workers. Europe differs very much from the United States here. There is a wonderful video made by California Newsreel about how, in Scandinavia, the introduction of a computer-aided printing technology was designed not only with concerns about productivity, but also with a concern for how the job could be enhanced and made more creative for the worker.

There is a fallacy of assuming that personal information on workers or customers that a company can collect is just another commodity, like raw materials, to be combined, reused, or sold as the company sees fit, without informing or obtaining consent of the subjects.

There is a fallacy of implied consent in free choice: it is assumed that in choosing to work for an employer and being told about the employer's practices, the employee consents to those practices. How many of you have children who are in college, or are about to graduate, or recently graduated from college? OK, a lot. How many of you know that in the graduating class of 1992, 25% will be unemployed? So I ask you, what does free choice mean in these kinds of settings?

A fourth fallacy is the fallacy that machine-generated facts speak for themselves and are necessarily more valid, reliable, and neutral than human-generated facts.

The fifth fallacy is the fallacy of confusing quantity with quality, and what can be easily measured with what is important. Of course, there is a broad tilt in American society toward quantification.

The sixth fallacy is the assumption that people are best controlled through deception and the creation of uncertainty by not telling them when they are monitored or when they will be monitored.

Let me suggest, again drawing from Alan Westin's work and the work of many people in this audience, some principles that I think ought to underlie these actions. Let me also suggest that the exciting thing about all these questions is that we need extremes to bring the middle together, but in fact we're not dealing with issues which one can easily choose between. There is a story that is told about a young couple who take a train for the first time. They go to the station where there is an old conductor, and they ask the conductor, "Is the train going to be on time?" The conductor waits a long time, looks up and down the track, strokes his beard, and says, "Well, that depends." And the couple then says, "Well, what does it depend on?" He waits an even longer time, he looks up, he looks down, and he says, "Well, that depends, too."

Let me suggest what the monitoring ought to depend on. I don't remember the exact number -- whether I have 12, 15, or 17 -- because it varies by which talk I am giving. These are principles that ought to govern the work monitoring context. The first five come from the 1973 code of the US Department of Health, Education, and Welfare: a principle of the informed subject, a principle of data inspection, a principle of consistent usage, a principle of correction, and a principle of relevance. And I would add to those a principle of co-determination, so that in the work context people subject to an information extraction technology have some involvement in setting the conditions. There should be a principle of minimization, a principle of validity, a principle of timeliness, a principle of data security and confidentiality, a principle of joint ownership of transactional data, a principle of human review, a principle of redress, a safety net or equity principle, and finally a principle of consistency, such that broad ideals, rather than the specific characteristics of a technology, determine privacy protection.

Let me conclude with a modest policy proposal, although I earlier made a plea for social scientists not necessarily having to do that. I would suggest that in the conventional view, monitoring is seen -- and in a sense, I am changing hats now from being a critic to being an advocate -- only as a way to extend managerial control. It's not seen as a way to democratize the workplace, and in the American context, where technology is generally designed and used primarily in the interest of management, it is likely to increase workplace inequality. Given this imbalance, an important public policy concern ought to be not increasing workplace inequality. The advocates of monitoring often eloquently point to its benefits; there are at least seven. It may increase productivity. It can lead to greater accountability and deterrence. It can mean just deserts, a situation where people are rewarded, punished, or counselled depending on their behavior. It can protect the consumer, and it can deter lawsuits. It can protect employees from unfair accusations, and it can mean job improvement as a result of feedback.

Now let me suggest that if management believes this, and I do not doubt the pillars of our society, then it would seem very reasonable to apply the same monitoring ideology and technologies to managers and higher level executives. In fact, the case for monitoring them is much stronger than for monitoring those lower in the hierarchy (applause) -- just play it out. If the former are performing inadequately or illegally, much greater damage can be done. Employees could stand to lose their jobs and stockholders their investments if the company fails, not to mention the diminution of service quality for consumers, and liability issues. We might even adopt a principle that holds that the more central the function of the position, and the greater the costs from its performing poorly, the greater should be the degree of monitoring. If management is sometimes incapable of watching itself, as is certainly the case in some sectors, given recent business scandals in banking, insurance, and defense contracting, then why not have monitoring units made up of workers, stockholders, consumers, even government regulators who use the latest technical developments to carefully monitor managers? Imagine what would be accomplished if a full audio and visual record of all the behavior of senior executives and managers were available, as well as any of their entries into company computers. Now of course we have to be humane and compassionate. If weaknesses in performance are found, procedures are violated, or quotas are not met, they need not be fired, but through counselling and retraining (laughter), an effort would be made to deal with the problem. Great things might be accomplished with respect to productivity, profits, customer service, conformity, etc. The credibility of management advocates of monitoring would greatly increase to the extent that they would be willing to submit themselves to these same technologies.
You might recall the surreal example of President Reagan and his Cabinet giving urine samples and holding them up. That isn't quite what I had in mind, but it is kind of a democratization.

Let me conclude by telling you that I was raised in Hollywood, although, as the case with many of us, I grew up in Berkeley (think about it, OK?). (laughter) One of my most vivid childhood memories was seeing the film, "The Wizard of Oz." I was terrified, as we all were if you saw it as a child, by the power of the wizard. The fact that he was unseen made it possible to conjure up images of a truly ferocious entity who might be anywhere and remotely cause anything to happen. The lightning and thunder that he controlled, his deep and authoritative commands, were very intimidating. As you may recall, at the end of the film, a little dog, Toto, pulls the curtain away and the wicked wizard is revealed to be an elderly frail man. And at once we hear him say, "Pay no attention to the little man behind the curtain with the microphone in his hand -- the Great Oz has spoken." I think if the United States is to remain a decent and productive society in which technology is put in the service of its citizens, we must pay attention to the men and women behind the electronic curtain, and not only those in front of it. Thank you. (applause)

WESTIN: Our next speaker is Kristina Zahorik, who is a legislative aide to the Senate Labor and Human Resources Subcommittee on Employment and Productivity. She has been the lead staff person responsible for a bill presently pending in Congress, called the Privacy for Consumers and Workers Act. She is not only going to talk about that, but in general give some observations about how she sees Congress dealing or not dealing with the general area of employee privacy.

ZAHORIK: I'm not so sure how much insight I can bring to you, considering I am coming from Congress and we're in an election year, but nonetheless I'll do my best. You all know what's going on in the computer field, probably better than I do. I don't profess to be an expert with my own computer -- I'll leave that to you all. I know how to turn it on and work it, and that's about it. I do know, though, that the increasing computerization of the workplace and the advances that have been made in technological capabilities have really exacerbated the tension between the employer's right to control and supervise, and the employees' right to privacy, autonomy and dignity.

As electronic monitoring has expanded, so have complaints about its intrusiveness and the pressures that it creates. We started to hear this, at least in the subcommittee that I'm on (Employment and Productivity). We started to get phone calls, reports came to us, more so on the abusive nature of things that are going on. One study that was done in the early 1980s by "9 to 5" (National Association of Working Women) cited that employees who are monitored by computers are 19% more likely to describe their jobs as stressful. The stress that these employees experience should not be overlooked. According to another report that was issued by the ACLU, workplace stress costs this country $50 billion a year, and that's a cost that we cannot continue to afford. Additionally 9 to 5 and other organizations have documented countless cases where employers have abused electronic monitoring systems, including bugging personal calls or listening to private conversations between employees. This is particularly disturbing since it is already illegal; however, how is an employee to know that this is happening to her, if she doesn't even know that the employer engages in monitoring to begin with? She finds out when she is fired, or terminated from her job.

In many ways, abusive monitoring acts as an electronic whip; we've heard this before about the drive and fast pace of the workplace in the growing service industry. Monitored employees, whether in telephone conversations with the public, or in producing work with computers, must carry out repetitive duties that require rigorous attention to detail, executed under the stress of constant supervision and demand for faster output. Unrestrained surveillance of workers has turned many modern offices into the electronic sweatshops that you read about. Current monitoring practices also operate as a form of de facto discrimination, because women are primarily employed in the types of jobs that are subject to monitoring, such as clerical workers, telephone operators, and customer service representatives. The conservative estimate is that about six million employees are currently monitored. This figure doesn't include, however, professional, technical, and managerial workers. I must say I do like Gary's idea of monitoring those who are higher up on the corporate ladder. If you include these types of workers with those who are monitored right now, we would add about 2 million additional monitored employees. Moreover, as the workplace becomes more computerized and service-oriented, the number of those electronically monitored will increase. It's estimated that about 50 million Americans right now use computers at work, and by the year 2000, it is estimated that nearly everyone will use a computer.

There currently is a void of legislative solutions to really stop or deter this type of abusive practice. Some argue that this should be left to labor-management relations, but as Dr. Westin indicated, only about 16% of our employees are unionized, so does this mean that we leave the rest of the employees out in the cold and leave them on their own? If, as Dr. Westin pointed out earlier, you say, "Well, you can just walk away from your job," try telling this to a single mother who has children and is trying to make it on her own. There is really not a choice there for her to make, whether to stay on the job and continue under the abusive practices, or look for a new job, especially given today's economy. Others would argue that we should allow business to change voluntarily; however, history clearly indicates that this isn't the best way to gain compliance. In fact, business has rejected almost every piece of worker protection legislation, such as the minimum wage and worker safety protection under OSHA, arguing that it would be bad for business, or it would cost jobs. We have found historically that this just doesn't happen. This is not to paint all businesses as bad or with the same brush; however, the government study conducted by the Office of Technology Assessment that our moderator, Dr. Westin, worked on found that only one-third of the companies studied applied fair work standards to their electronic monitoring evaluations.

It is a sad irony that while the FBI is required by law to obtain a court order to bug a conversation, even in cases of national security, employers are permitted to spy at will on their employees and the public. That is a very chilling, chilling thought. Many of you probably didn't know that if you call the IRS for information, there is often someone listening in on the phone conversation. We had somebody call our office who happened to have called the IRS to inquire how to get tax forms for previous years and gave their name and address. Well, guess what happened? A knock on the door from the IRS. There is a delicate balance, however, between the demands for technological change and for citizen protection. As a nation, we have supported laws that protect us from invasive spying and prying into our private lives everywhere except in the workplace. The United States stands alone with South Africa in failing to protect workers' rights.

As I mentioned, Congress does recognize, somewhat, the need to address the new technologies, and in 1987 it did request the OTA to complete a study on the use of new technology in the workplace and its effects on privacy, civil liberties, and the quality of working life. One finding: it reported that the intensity and continuity of computer-based monitoring raised questions about employee privacy, fairness, and quality of work life. As an indirect result, legislation was introduced in Congress, the Privacy for Consumers and Workers Act, to try to address the excessive abuses and to protect employee and consumer privacy. The Privacy for Consumers and Workers Act as introduced would require employers to provide employees with prior written notice of the forms of electronic monitoring that they will be subject to and the frequency of monitoring; how to interpret the records and printouts of statistics on the monitoring, and how production standards are based on those statistics; and also what kind of personal data on the employee will be kept and what the personal data will be used for. In addition, employers would be required to notify their employees when monitoring is taking place. The employer could not use the data that was gathered as the sole basis for disciplinary action unless (and sometimes people forget this part) the employees are allowed to review the personal data within a reasonable amount of time after the data is obtained.

The House legislation (H.R.1218) has already been amended at the subcommittee level, and is expected to move this spring to full committee markup. There have been changes made to the bill and some compromises. The Senate legislation, S.516, which I have jurisdiction over, is awaiting markup before the Subcommittee on Employment and Productivity. I suspect that we will move forward on it sometime this spring as well. The legislation does not prohibit electronic monitoring -- it doesn't say that it is bad, or it is wrong. It is simply a notification bill -- a right to know. We believe that employees should have a right to know what is being collected about them and how it is being used. It doesn't say that monitoring should not be used, but it does say that electronic monitoring should not be abused. Employees should not be forced to give up their freedom or dignity, or sacrifice their health, when they go to work. The Privacy for Consumers and Workers Act, as I said, is not perfect. Clarifications and improvements will be made to the bill, but it is a needed step in the right direction, we believe, toward protecting fundamental privacy rights where currently there are none. Thank you. (applause)

WESTIN: I can assure you that our next speaker will not start out by saying that he does not know too much about computers, since he is one of the eminent computer scientists in the United States -- Willis Ware of the RAND Corporation. I have known Willis for many years, in one capacity as vice chairman of the Privacy Protection Study Commission, a post that he held and served in with great distinction. I've always found that as new technologies unfold, Willis's observations about the issues that they pose and how to deal with them have been a major part of the national conversation. Willis.

WARE: I'm cast in a role this morning. I am to be the resident in-house guru for technology/security/privacy for a corporation that is serious about exploiting modern technology, but also serious about avoiding abuse of it. I thought, though, in view of what I was going to tell this corporation, that I would be wiser to be not an in-house guru, but rather an outside consultant who had been called in to advise the CEO and his board of directors, perhaps. Now, of course, in adopting this other role, I will have done my homework and know what the CEO's annual earnings are, and I'll set my consulting rate accordingly (laughter) because I want to be sure to have his attention. (laughter)

Now I would start out by telling them some things that many of you will know. I'll be repetitious here for a brief bit about technology and its role in the scheme of things. I would start out by pointing out to the CEO and his top managers, his board, that generally speaking, technology comes with two sides: one is the advantages and payoffs that it offers to society, to society's organizations and to society's members. The other side, of course, is the negative consequences to the same: to society, to individuals, to organizations, to international relations, to the country, etc. In this case, the organization of interest is your company, Mr. CEO, and in this case, the negative consequences are potentially to your employees, your customers, and your clients. I would point out to you, Mr. CEO, that technology can become downright obnoxious! And it can get into that state without very much trouble, especially if organizations run with it, and especially if organizations run with it from a profit or revenue-driven frame of mind.

Speaking specifically about information technology, there is no reason to think that it is going to be any different. Why should it be different? The fact of the matter is that it is probably worse, simply because information in its proper role is so pervasive through everything that individuals, organizations, companies, and society do. You've already seen, or I wouldn't be talking to you, Mr. CEO, some of the exploitation of information technology that seems to you to be abuse and you want to avoid. So we have a sense of the problems that it is creating. I'll note for you that some of those problems are downright carelessness, or thoughtlessness, or oversights on the part of lawmakers, of organizational managers, of the organizations per se, or perhaps of communities of organizations. I will also point out to you that sometimes a problem arises because managers become overzealous in doing the job that they see their job to be, and trying to perform in a fashion that they believe their top management expects. I'll also point out to you that sometimes the problems arise because top managements are motivated for tighter control of the work force, or higher productivity of the work force, or closer oversight of the work force. Again, the problems will arise, because such things are done without consideration of the larger issues, mainly the impact on society or individuals. And some problems even seem to just happen because of a series of events. I would put DNA typing probably in that category. Science found out how to type and examine the human genome. They found out how to relate genetic flaws to various kinds of physiological and emotional aberrations. That was science on the march and that event happened to occur, and the consequences are what we worry about in the commercial workplace.

Occasionally you know that the problems that beset you are physical -- you have heard your employees complain about long hours sitting at a terminal, or physical discomfort from sitting in perhaps inappropriate ergonomic settings. But I will say to you with the utmost emphasis that most of your problems will be related to information use, and notably to the use of information about people, or information derived from observations about people, whether such information is used for oversight, monitoring, or control, or as some part of a determination in regard to a right, a privilege or an entitlement.

You have heard, as I have heard, all the sorts of things that can arise, so I won't tick them off for you. Let me observe the role of technology in all of this. It's the facilitator, it's the driver -- it is not the problem. Technology facilitates a lot of actions and a lot of events, but it is not the central issue. The central issue is information use -- information use in the context of what society will tolerate in the use and exploitation of information on our people, or what the individual will tolerate in the workplace, or outside of the workplace. When will society decide that some uses are for the better good of society, no matter what consequence it might have for particular individuals? How will we -- the country -- and how will you -- the corporation -- decide which are appropriate uses of information, when and where will you make those decisions, and who will enforce them? So let me just underline that one with emphasis, Mr. CEO. The central issue is a matter of public policy, or corporate policy, or if you like, social policy. Determine that, tell the technologist how you wish information to be used, how you wish it to be controlled, how it will be disseminated, how it will be revealed, and to whom. Then count on those technologists, those information technologists, to put in place the system features and the controls to implement and conform with those policies. Do not ask that technologist to make those decisions in your behalf. It's not his professional responsibility, his job obligation, or his singular role in your organization. To be sure, he is a player in the decision, but he is not the maker of the decision -- that is your job.

Who makes policy? In your corporation, it will be you, CEO and board. In the world, Congress is one player for policy; regulatory agencies are another; activist groups sometimes force de facto policy. Organizations, as in your case, will formulate their own. Trade unions sometimes negotiate details of policy, and policy can arise in a vacuum. If there is no policy, that is of itself a policy. When no one is attending to an issue, the commonplace procedures of practice, as they happen to develop, become, through evolution, the policy. Let me caution you, CEO, don't let the technologists make de facto policy for you by ignoring what they might be doing. There is more than one computing center, buried several layers deep in the hierarchy of an organization, that has made corporate policy, and the top of the organization didn't even know that it was happening. And please do not accuse me, CEO, of having created a problem for you. Your own managers have created the problems, in their drive to do better the jobs that they think you want done. The worst that I might have done is to tell them about technology. Or I might have helped them implement something, but if I did, it was with the expectation that they were asking for that with your implicit or your explicit concurrence.

So, my bottom line to you goes like this: don't be deluded by the argument that technologists can fix the problems that you are asking me to speak about, and that you are concerned about. Decide at the top level -- which means you, CEO; you, Board -- what information policies will be, what information policies you wish to have in place, taking account of law, taking account of contract arrangements, taking account of traits and characteristics of people, taking account of organizational goals, taking account of organizational attitudes, taking account of your feelings of the organization and its social responsibility, taking account of the consideration you have told me you wish to exhibit for your employees and clients. With all of that before you on the table, decide what the corporate policy is to be. Then promulgate it and instruct your managers -- and in particular instruct your computer people -- to implement that policy with appropriate safeguards and procedures. In the case of your computer systems, sometimes fixes will be procedural, sometimes physical, sometimes personnel and various other mechanisms. To put it in one line, tell your computer people to do the security job right and to do it in the large global context, not in the small. Be prepared to pay the bill, because it will cost you something!

I will say to you that unless the dissemination of information and its use can be controlled, then adhering to even the best of corporate policies will be difficult and probably impossible; hence my emphasis on what is called, in the field, computer security. Importantly, keep this whole exercise in your full sight. Keep it in sight of top management to assure that the implementation fulfills what you expected of it, what you formulated in your policy direction. Keep this in full sight to assure that there are no unnoticed or unintended side effects or anomalies. Keep it in sight so that no loopholes get by your attention. Keep it in sight and make sure that there are appropriate feedback mechanisms to detect and repair shortfalls and to oversee problems.

I say to you lastly -- don't think you can walk away from this problem. Don't think that you can do it once and forget it. Things will change, new effects will appear, new uses for information will be suggested, new technology will come wandering by. Managers will become overzealous; managers will behave in ways that they think you want them to behave. Laws will change, managers may move off into directions which are treacherous for the policy view of the corporation. You cannot walk away from this once having done it and forget it forever more. You must watch the information policy issues, the information utilization policy issues, every bit as carefully as you watch the behavior of your organization in all other aspects of conduct of business.

That's what I would say to the CEO, and if the reception were good, I might, on the fly, double or triple my consulting rate. (laughter and applause)

WESTIN: We'll be following the standard procedure for recognizing people for questions. I'll ask panel members, since all of us have had a chance to speak, to keep our responses as brief and non-additionally speechy as possible. I'll move across the three microphones in a standard manner, starting here at my left.

MIKE GODWIN: Hi, I'm Mike Godwin with EFF. One of the things that concerns me about employers' policy with regards to the use of electronic mail systems and computing power in working environments is that it is often a case of an employer who typically has invested a lot of capital in setting up the computer system and electronic mail and is deeply concerned about what he or she perceives as a frivolous or playful or non-work-directed use of the system.

Yet, you see policies and implementation of monitoring to insure that the uses of these systems are not non-job-related, not frivolous, etc. One of the things that I have noticed, however, in some businesses is that in general, the people who explore the playful uses, the frivolous uses -- planning parties with electronic mail, using electronic mail for gossip, getting on Usenet user groups and talking about their hobbies -- always end up being the people that are capable of most effectively using the tools for business purposes. One of the things that I would like to have the panel comment on is whether there is a role for the playful, or perhaps the non-direct, frivolous use of electronic mail and electronic conferencing.

ZAHORIK: Well, I do know that there have been some cases out in California with respect to E-mail, which I am sure many of you probably know about. There is an assumption of privacy to some degree, given the fact that you usually have to come up with your own password or codeword to access your computer, and, in addition, another one that you come up with yourself to access the E-mail system. Therefore, if an employer is monitoring the E-mail, is that in fact legal? We can think of this in terms of telephone conversations. We have said that if it is a personal phone call, once the employer recognizes it as such, he or she is to hang up and not listen. I think there is some question in the courts right now as to whether or not there is a reasonable expectation of privacy in terms of the E-mail system, and I would hope that there is. I think that still needs to be debated.

WESTIN: OK. Right here, please.

MARX: I think you have identified one of the really interesting ironies or paradoxes of social organization, and that is that creativity, innovation, and advancement often involve rule breaking, essentially. We think of progress as breaking the rules, and there is sort of good rule breaking and bad rule breaking. I think what you are referring to is clearly the kind of good, experimental rule breaking, and the issue of cracking nuts with sledgehammers is, in a sense, one of proportionality. Most of the arguments that management puts forth for monitoring, I think, in fact, are legal. It's a question of, in Justice Potter Stewart's brilliant, wonderful, wise words, "realizing that just because you have a legal right to do something, that doesn't mean that it is the right thing to do!"

WARE: Let me just add another point of view. One of the problems of E-mail is that we don't know what to think of it -- we don't know how to characterize it. A lot of people say, oh, it's first class mail, or like an office memo, but it really isn't. It has quite a different set of attributes, and if one wants to put it in its proper place, we in the field don't really know what its proper place is. The answer to the specific question, though, is like the story that was told earlier -- it all depends. In an R&D organization like the one I live in, of course there is enormous freedom. I couldn't run a committee without my E-mail connectivity. In a tightly controlled production shop, it is a different scene. My personal view, if I were running an organization, would be to be moderately tolerant about the issue, and to say, OK, some level of E-mail activity I essentially give to you as a freebie to play with. My only request is that you tell me if you find any good things to do with it on my behalf. The other part of your obligation is to stick to corporate business and conduct business through E-mail in the large, but sure, play on the fringe a little bit. Now in today's world that probably is a relatively rare attitude, but I suspect that it is a workable one, and in the long run I suspect that it will be the acceptable one.

MAYA BERNSTEIN: My name is Maya Bernstein, and I work on information policy in the Office of Management and Budget. I guess my question is mainly for Willis Ware, but for any other members of the panel that care to comment. I am wondering, since you say that technologists should not make decisions about policy, how will informed policy get made if, at least, some policymakers are not also technologists and some technologists are not also informed about policy? Shouldn't technologists consider what they are creating, before they create it? Or should we let technologists basically create anything they can think up? Should technologists, on the other side, create anything that they are asked to create by management, even if they think that it is immoral, illegal, or just bad policy?

WARE: Well, first of all, I'm not a believer in the technology mandate. I don't believe that just because we can do it, we should do it. So that is the first observation. Secondly, I am a believer in pinning the tail where it belongs, and so if someone is going to make a decision, then that person should be accountable for that decision and bear appropriate responsibility, be it corporate responsibility, or legal responsibility, or whatever. On the other hand, the typical management of corporations is not well informed about the ins and outs and the intricacies, and so that's why I would insist that the technologist be a player in the decision, but not the maker of the decision.

MARX: A comment on your question. You asked whether technologists should serve only management. I think that's terribly revealing. What kind of a society do we have when the technology primarily comes from one particular interest group, even though it is a terribly important interest group? Why aren't we in a society in which you could ask: should the technology also serve the interests of low income people, physically handicapped people, the elderly, people who for various reasons are isolated? It's a nice way that language reveals something about the society.

KATHY KLEIMAN: I'm Kathy Kleiman. I'm a law student at Boston University, and my question is initially directed to Kristina Zahorik. A few years ago I worked in the information services department of a major Wall Street investment bank. One of my jobs was keeping track of the database that recorded all the telephone calls that were made on thousands of lines in that investment bank. The major purpose of the database was to compare our records against the records of AT&T and MCI, because we felt that their tracking was inaccurate. We wound up receiving about a million dollars back from AT&T for misbillings. Nonetheless, the database was created for all the calls that were made, out of all the locations in the country. I was concerned about that database, and there were no policies regarding the use of the information. It was very much left in the hands of my manager, who was a very middle level manager at that investment bank. His ethical standards were very high, and we only revealed information in cases of subpoenas for insider trading. I'm concerned about that information -- legal requirements of notice are terrific, but I'd like to know what the timetable is for laws about security levels on this kind of information, and also sunset laws specifying when this information has to be deleted. I'm concerned -- my manager has left, and I am concerned about the person who has control of that information now.

ZAHORIK: I think it's a very good concern to have. My feeling is that Congress -- I'm sure you all know -- moves rather slowly (laughter) to say the least, and the bill that we have has been around for a while, longer than I have actually been up on the Hill. I suspect that it will be there for a little bit longer, unfortunately. I think that part of the concern is that a lot of people are not aware -- Congress is not, at least -- that this is a pervasive problem. I don't think that it has been brought to the attention of many of the folks who are up there, to the level that it really needs to be. Oftentimes, Congress is reactive, not proactive. I think that in terms of legislation like ours, the intent, at least, is simple notification. We're having a difficult time with it, and there's a lot of misinformation about it. Folks in the industry, as a rule, don't like mandates, and therefore do anything that they can to stop what smells like a mandate. I guess my answer is that it is going to be a long time coming, unless folks like yourself take on advocacy roles and make sure that Congress moves and acts appropriately in terms of creating legislation that would deal with issues like sunset laws, who has access to the data, and what kind of data is being stored. Part of the problem, too, is educating the members up there.

MAARTIN VAN SWAAY: Maartin Van Swaay, Kansas State University. I've had two delightful days here, listening to people argue very persuasively for all sorts of things. The primary point that I've heard, and I presume that I have heard it right, is that we really ought to protect privacy as one of the fundamental freedoms of this country. I will admit to you that I have been a guest of this country for the past 29 years and I am still enjoying it. (laughter) And yes, I enjoy its freedoms. Something that intrigues me is that any rule that protects the freedom -- and let me take the key freedom -- any rule that protects privacy will protect its abuse. Any rule that limits the abuse of a new tool will also limit its use. We talk about rules that ought to have teeth in them. Well, rules with teeth amount to people who will be given the authority to wait for us in a dark alley with a baseball bat. The question I'd like to ask is whether rules will fix what concerns us. Whether we should not ask one level down, and ask if we want to protect freedom, should we not talk about trust? Thank you. (applause)

MARX: I think that's a great observation, but there is no one single solution to anything. We need all of the above, and one thing that I argue strongly for is the importance of developing a culture of civility, of developing manners, of increasing people's awareness of what it is that is at stake. You have both the hard and the squishy positions: part of the people want teeth -- they want to lock up the offenders, and I really don't want to be vulgar, but they really want to kick ass. Then other people say we really have to persuade, and I think clearly that they are both important. It gets a little scary when you start talking about socializing for values, because of the implication of whose values, but I do think there is an American civic religion and I think manners are fundamental. Great social theorists, like Durkheim and Weber, stressed those, in opposition to my namesake.

WESTIN: Let me make one observation. I tried to suggest at the beginning that we're in a transformation at the workplace. If I had to hold my breath for a law to be passed here, I think I'd get very worried. I think one of the more powerful forces that is at work is that the authoritarian management system has broken down, and that leading edge companies that are reorganizing themselves are, as I tried to suggest, having to, not because they want to -- because most of them think that Frederick Taylor had absolutely the right idea about how to manage work. It's broken down as a result of technology, world competition, and new ways of producing knowledge and goods. So the best hope, I feel, is that you simply cannot get good productivity by abusing people anymore. It doesn't have to do with the job security dimension. Again, if we had to depend upon the power of people to refuse consent or to refuse to do things at the workplace, again we'd have a slender reed. The real point is that in the competition for quality goods and quality services, you cannot drive people like cattle at the workplace. More and more, the managers of American enterprise are realizing that the old factory system will not work. They're going to have to civilize or do the things that have to do with mutuality and trust at the workplace, because that is the only way that quality goods are going to be produced.

ZAHORIK: To just comment on that, too. I think it's interesting that when we are talking about the workforce, there are any number of issues where people on the committee perceive differently who we are competing against. There are those who still hold onto the old mechanisms of manufacturing and believe that we are really competing against Third World economies and low-wage workers, etc., when, in fact, we should be competing on high skills and high wages. It's interesting that some of the technology has taken away some of the creativity and the flexibility that our workforce has, should have, and should continue to have, through some of the repetitive types of data gathering that are available now in the information age that we have.

LORRAYNE SCHAEFER: Lorrayne Schaefer of the Mitre Corporation. I have a question for Willis Ware. You said that technology facilitates monitoring, yet the information that is used is the major issue. What about computers that are taught to learn, such as computers that are taught to learn how to play chess, and based on their mistakes, they can improve. And we also have things on the other end of the spectrum called intrusion-detection systems, which are taught to learn what a person will probably do in any given session. For example, a computer programmer usually will look at certain files, and based on that information, if that programmer all of a sudden does something completely uncharacteristic of previous sessions, the computer will take evasive action, such as logging that person off. Do you have any comments on that?

WARE: Sure, there are circumstances where you will be very glad for what you described. The obvious one is an organization that deals with a lot of classified information, typically in the defense business. You do indeed worry a lot about who sees what or who does what to which. Under those circumstances, deviations in patterns of behavior are one tip-off that hanky panky may be underway. So there are appropriate circumstances for what you described. There are also a lot of circumstances that are thoroughly inappropriate.

HUGH DANIEL: Hello, my name is Hugh Daniel, and I'm a citizen of this country. (laughter) Someone has to represent the rest of us. My question, I think, is directed mostly at Alan Westin. I'm concerned that a lot of our solutions to these problems are getting overly complex, and you're not trying to deal with issues of who's responsible for what. Specifically, I'd like to ask a question: is this not a reasonable solution to the following problem? The problems that you mentioned in both your first and last issues of genetic screening, and susceptibility to different diseases -- the problems with those are largely currently based on who pays for the problem. If the tax structure in our nation were repaired to the point where people could afford to deal with their own medical bills, instead of having to hide them behind employer income, then the question of being genetically susceptible to a particular disease would be the direct responsibility of the citizen whose genes they are, and there would not be a disclosure problem.

WESTIN: I think I would say that's about 80% right, and that's why I commented in my opening remarks that if we had a national health insurance program, as some other countries do, that was not tied to the employer paying the costs of his or her workers, we'd have taken a large step toward progress. I'll tell you the other point that has to be taken into account. An employer who hires someone spends a great deal of money in training that person and in anticipating their useful life in the career that person makes in a corporation or government agency. If you follow the logic of genetic screening, there will be employers who will say it is not entirely a matter of how much money it is: shouldn't the employer have the right to make reasonable projections as to whether a person who has a genetic tendency will in fact fulfill the investment that employer makes in that person's career? I wish it were as simple, though it's plenty complicated, as it is on the financial payment end. I think that the employer arguments that they are entitled to the information for more than just the payment mechanism basis are the way the debate is going to unfold -- that it's good, in a sense, for the employer and society together to be able to predict these future capabilities of the individual. I don't subscribe to that, I'm just saying that that's an argument that I think will be made.

DANIEL: Since I'm still at the microphone, I would like to point out that having the US Government being responsible for knowing my genes and knowing what's safe for me is an even worse situation than having an employer know. Thank you.

JATON WEST: I'm Jaton West, and I'm with Mitre Corporation. One of the beauties of this entire conference has been the wide range of perspectives on issues. One of the things that I find interesting about this particular panel is that there is nobody from industry -- there's no employer who does monitoring in the workforce on this panel. I would like to speak to that issue, having experience in that area, and having experience in the social sciences and in telecommunications customer management. There are a number of success stories of monitoring the workplace, when it is done right. Our employees were paid to answer customer telephone trouble calls, create trouble reports, follow up, and get the troubles resolved. That's what their performance was measured on. We captured that data, we met with them once a month and said, "Here's where you are on the rating scale, based on your performance." They loved it. We posted the customer call abandoned rate, which was our banner. We posted that every hour, so they could get a sense of how we were doing. When we posted it once a day, the customer call abandoned rate was 12%, and that's bad. When we posted it every hour, we got it down to 7%, because everyone was involved in working this through. In terms of managers being monitored, I was a manager and you could bet your bippie that I was monitored. The customer call abandoned rate was a part of my performance appraisal. My VP was measured on that -- he was measured on the percentage of trouble reports closed.

WESTIN: Were his phone calls listened to?

WEST: They weren't listened to, but the product of this effort was measured. A lot of automated tools were used to measure his performance, such as, What is the network availability? All of these kinds of things were dealt with. In my organization, I had a 6% attrition rate. In a service industry with entry level employees and low salary jobs, that happens to be quite excellent, and this was in 1984 when people could have left anytime they wanted to because there were plenty of jobs. We were written up in the "Washington Consumer Checkbook" as having the best customer service of all the telcos. I would just like to see an employer's perspective on a panel like this -- somebody who does monitoring in the workplace. It's not all evil.

ZAHORIK: I think, at least in my remarks, that I did try to point out that in fact it's not all evil, and I don't think that we're trying to paint it as a good or bad issue. I think that the other panelists pointed out that it is how it is being used, and who has access to it. In fact, there are wonderful success stories and good stories where you have good management practices. Unfortunately, we can't legislate good management, and there are abuses that are out there, and horror stories. How do you address those? I think that, again, we're not saying that monitoring is bad, or technology is bad. It's just how it is being used.

WESTIN: I'm going to try to get the last two people in. Go ahead quickly, please.

MARX: Business is not here because of informed refusal. There was an effort, so your point is well taken.

GLENN TENNEY: Comment to the last speaker, just very fast. Telephone operators, like directory assistance, have a banner, for example, showing their responses over the previous hour. That's a sword over their heads -- they have to improve, or they'll lose their jobs. It may be a success from the company's point of view, but it may not be from the employees' point of view. Quickly, I think the panel didn't do justice to the fact that monitoring affects everyone outside the company, not just the employees. You just touched on it. My question is specifically, it seems, Dr. Ware, that what you're saying is that the real solution to the problem is a long-term vision of where we should be. A quick summary, and I know it is too, too trite, but does the panel believe that it is a correct view, and that the real long-term solution is to have a long-term view and a vision?

ZAHORIK: I think it would be nice to have a long-term perspective, but how is that going to address the things that are going on right now? Five years is a long time to wait for even the most simple piece of legislation that just asks employers to notify their employees. Our bill would also notify the consumer over the phone if the phone call is being monitored. It would be nice to have the long term vision, but I don't think that it is going to work that way.

TENNEY: We can have short term visions, apart from action, but part of it is the long term view ...

WESTIN: Mr. Congressman, down, boy! (laughter)

TENNEY: I'm dead serious.

WARE: Look, let's not be collectively naive. We aren't going to march into the future without making mistakes. Progress never occurs like that, so we're going to zig and zag, and we will evolve our way. There will be evolutionary dead ends and evolutionary disasters. So let's face it, some of the things that today are painted as offensive will be an evolutionary disaster and if things work right, it will go away. We will invent our countermechanism. The things that are working well that were characterized will gradually come into better balance -- society is going to shift. Your kids and my kids will grow up with a different set of expectations than we are discussing here. So, we are going to make mistakes as we chug ahead on this, and it's going to be particularly devilish in our racket because of the pervasiveness of information. Particularly so.

WESTIN: I think that if we can have the last speaker up we will have done what most sessions did not do -- have every single person that was waiting to be heard, heard, so you have the last word.

STEPHANIE PERRIN: Thank you. My name is Stephanie Perrin, and I'm with the Canadian Department of Communications. I apologize -- this is a squishy cultural value kind of question which is maybe not the best one for just before lunch (laughter), but it seems to me that our culture was founded on a religious and philosophical belief that people were created equal, had an infinite capacity for change, and had the power to be redeemed when they strayed and erred. I'm not saying that we do not believe in that now, but it does seem that we are being data driven with this employee profiling. It does seem that as soon as you get x number of hits on your employee profile, you're no longer a good employee, or a good person, whether that's genetic proclivity to disease, or a tendency to drink, or hang around with the wrong people, or spend too long on the phone -- so many strikes and you're out. Now I wonder if you would agree with that tendency to be data-driven, and if so, what does it take to change that sort of philosophical trend?

WARE: I'll agree at the drop of a hat with the stupidity of management (laughter), and what you characterize as data-driven is what I would like to call the accountant mentality thinking that it is saving money, when, in fact, it is spending more money to save the money. We see that everywhere, and that's one of the evolutionary disasters that's going to go away. I hope that we'll get to the point where managements have an understanding of what it costs to do something and what the payoff is to do something. The judgment will be based on that, not on the absolute preference of some profession, or some job assignment, or some division of the company. Now, you may say that's Fantasyland. Maybe so, but at least I can think about it.

WESTIN: Our time is up, so if you'd like to come up and talk, everyone would be glad to do so. (applause)
