CFP'93 - Articulated Tracking and the Political Economy of Privacy
by Philip E. Agre
Department of Communication D-003, University of California, San Diego
La Jolla, California 92093-0503
(Note: This position paper is based on the introduction to a much longer paper that I hope to circulate around the time of the conference. That paper will supply numerous references that are missing here.)
1. The Surveillance Model
The rapid development of computer technology has made it increasingly easy for private and public organizations to gather, store, and circulate personal data. A substantial literature has described some of the risks that these developments pose to individual privacy and other civil liberties. These concerns are normally modeled on pre-computer-era concerns about state surveillance of individuals and groups, such as that practiced by police agencies and their organized networks of informants, with or without high technology, particularly in late 19th century Europe and in the totalitarian societies of the 20th century - and to a lesser but still significant extent in our own society. "Surveillance" in this sense has become a whole cultural system, fixed in the civil imagination through works such as Orwell's "1984". Several elements of this cultural system call for examination:
- its visual metaphors, as in Orwell's "Big Brother is watching you";
- the assumption that this "watching" is nondisruptive and surreptitious (except perhaps when going astray or issuing a threat);
- its centralized orchestration by means of a bureaucracy with a unified set of "files"; and
- its identification with the state, and in particular with consciously planned-out malevolent aims of a specifically political nature.
These developments are the subject of a growing literature. Several authors have analyzed the civil liberties concerns raised by these technologies, sometimes offering policy remedies. Others have rethought the concept of privacy in light of these concerns and policy issues. Still others have questioned whether a right to privacy even exists in either a legal or philosophical sense. What is clear is that the proliferation of computerized information-collection has challenged the notion of an autonomous individual. Authors such as Haraway and Stone, for example, have described an expansion of our social selves to embrace technological elements and technologically mediated forms of self-presentation, so that any account of social agency must understand people as 'cyborgs'.
2. Articulated tracking
My own project is not an inquiry into the concept of privacy, nor into the details of particular technologies. Instead, I wish to argue that the explosion of sociotechnical schemes for the maintenance of personal information has made the surveillance model largely obsolete. In its place, I plan to sketch some sociological concepts for the analysis of particular technologies and their place in society. As with all social-scientific inquiry, detailed analyses will have to be performed separately in each domain of social life: the legal and medical systems, workplaces in particular industries, the market in personal information, state intelligence gathering, and so forth. Nonetheless, these disparate domains share their embedding in a common social order and a set of broadly analogous trends toward intensified use of information technologies. In virtue of these commonalities, the concepts I will describe appear generally useful in many of the necessary local analyses.
My central concept is "articulated tracking" (AT). Before proceeding, let me offer a provisional definition of this phrase and its component terms. "Tracking", roughly speaking, refers to any sociotechnical process through which particular information technologies maintain formal models of particular aspects or categories of human activity. "Articulation" is the process by which human activities are reorganized in order to facilitate tracking through their accountability to formal representations. "Articulated tracking", then, reflects the outcome of the social processes through which human activities and information technologies coevolve to maintain an increasingly intricate set of formal relationships between them.
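For technically minded readers, the following is a minimal, purely illustrative sketch in Python (all of the names and the event vocabulary are hypothetical assumptions of mine, not part of any actual scheme) of what it means for an information technology to "maintain a formal model" of an activity: the activity becomes visible to the system only insofar as it has been articulated into discrete, typed events drawn from a vocabulary the scheme defines in advance.

    from dataclasses import dataclass, field
    from datetime import datetime

    # A hypothetical formal model of one narrow aspect of human activity:
    # the activity is representable only as discrete, typed events.
    @dataclass
    class TrackedEvent:
        subject_id: str       # the identifier the scheme assigns to a person or artifact
        event_type: str       # one of a closed vocabulary defined by the scheme in advance
        timestamp: datetime
        attributes: dict = field(default_factory=dict)

    @dataclass
    class ActivityModel:
        # The "formal model" that tracking maintains: a growing record of
        # events conforming to a prescribed grammar of event types.
        grammar: set
        history: list = field(default_factory=list)

        def record(self, event: TrackedEvent) -> bool:
            # Articulation requirement: activity that does not fit the
            # grammar simply cannot be registered by the scheme at all.
            if event.event_type not in self.grammar:
                return False
            self.history.append(event)
            return True

The point of the sketch is only that the grammar comes first: the activity must already have been reorganized into units the model can name before any tracking can take place.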
Having introduced these concepts, I should hasten to anticipate a particular line of response, which would accuse me of a generalized opposition to technology or a failure to recognize the benefits of new technologies along with the risks. Notions such as "tracking" are so closely identified with the "surveillance" model I sketched above that analyses of them are often interpreted as conspiracy theories, as positing widespread malevolent intent, and so forth. But I wish specifically to reject the elements of the surveillance model that give rise to these interpretations:
- AT does not passively "watch" human activities; rather it actively maintains a relationship between those activities and formal models of them.
- AT, far from being surreptitious, is normally largely visible; and far from being nondisruptive, it regularly entails (or at least presupposes) a qualitative reorganization of the activity.
- AT is decentralized and heterogeneous; it is normally conducted within particular, local practices which involve people in the workings of larger social formations.
- AT endlessly crosses the boundaries between "state" and "private" organizations; these boundaries are generally quite permeable anyway.
3. Example - Automatic vehicle identification
Let us briefly consider an example, automatic vehicle identification (AVI) systems. These systems have been installed in a number of US states for the automatic collection of highway tolls, and are under development in several others. A motorist participating in an AVI program will install a "transponder" on his or her car, typically on the dashboard. When the car passes through a toll-collection station, a roadside transmitter will "ping" the transponder, record its unique identification number, and debit the corresponding account. Motorists can pay their tolls anonymously in advance by bringing the transponder and money to a special storefront, or they can arrange to be billed for their tolls, perhaps by registering a credit card number with the highway authority. Concerns about privacy are widespread at first, but as people "get used" to the system, they generally begin to pay on credit, trusting that the toll records can only be released to an outside party through a court order. I will not present a detailed case study of AVI systems, but a few simple observations will help mark out the range of issues I wish to analyze (a schematic sketch of the toll transaction follows the list):
- AVI is a social technology, not just a computer technology. Its component social technologies include "customer service", bill collection, roadway law enforcement, advertising, legal interpretation of the Fourth Amendment, and so forth. The social and physical technologies are inextricably intertwined.
- Its progress is not simply driven by technology. AVI systems are operated for profit by private firms, typically public utilities regulated by state Departments of Transportation. The transmitter and transponder technologies are designed through a complex interaction among these organizations that includes public hearings, press reports, university consultants, and so on.
- It requires automobile travel to be articulated into discrete elements corresponding to a formal model. In the case of AVI systems, most of the necessary articulation was already established by earlier technologies of toll-collection on limited-access highways. These highways, through their physical and legal structures, enforce a "grammar" of discrete, numbered "entrances" and "exits". Similarly, drivers must pass through discrete toll "gates" corresponding to discrete "lanes".
- It entails specialized languages that mediate the tracking process. The transmitters and transponders exchange digitally encoded radio signals of a prescribed form.
- It is fundamentally a phenomenon of administered rationality, not of physical technology narrowly construed. Technology can support it but does not itself entail it. It affords, for example through the collection of statistics, a variety of rational operations pertaining to traffic management and law enforcement.
- It is not foolproof. Indeed, it is irremediably problematic due to the fallibility of the hardware, the necessity of interpreting points of law, and the possibilities for hacking and other kinds of fraud. But this does not keep the system from "working" in some important sense.
- It can be, and is being, contested. Privacy advocates in California, for example, have lobbied for privacy protections in the physical hardware of the transponders (transmitting the ID code of the transponder and not of the car, for example) and in state legislation. Other possibilities for contestation are more or less obvious.
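To fix ideas, here is a comparably minimal sketch in Python of the kind of transaction the roadside equipment and the back-office accounting system might carry out together when a transponder passes a toll gate. The message fields, account structure, and function names are illustrative assumptions of mine, not a description of any deployed AVI system.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    # Hypothetical "prescribed form" of the radio exchange: the transponder
    # answers a roadside ping with nothing but its identification number.
    @dataclass
    class PingResponse:
        transponder_id: str
        gate_id: str          # the discrete "gate"/"lane" defined by the highway's grammar
        timestamp: datetime

    @dataclass
    class Account:
        transponder_id: str
        anonymous: bool                          # prepaid at a storefront, not linked to a name
        balance: float                           # account balance, in dollars
        billing_reference: Optional[str] = None  # e.g. a credit card on file, if not anonymous

    def process_toll(event: PingResponse, accounts: dict, toll: float, travel_log: list) -> None:
        # Debit the account keyed by the transponder ID and, for billed
        # accounts, retain a record of where and when the transponder was seen.
        account = accounts[event.transponder_id]
        account.balance -= toll
        if not account.anonymous:
            travel_log.append((event.transponder_id, event.gate_id, event.timestamp, toll))

The sketch is meant to underline the observations above: whether an itemized travel record accumulates at all is a matter of account structure and record-keeping policy, not of the radio hardware as such.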