
 

INTRODUCTION TO MODULE VI:

Self-Help Mechanisms:  Cryptography, Privacy-enhancing Technologies, and P3P

 

Five weeks ago, when the current series launched, we posed the central problem of “Privacy in Cyberspace.” Let us restate the problem:  whether "[t]he claim of individuals, groups, or institutions to determine for themselves how, when, and to what extent information about them is communicated to others"[1] should prevail over the interests of technologists, industry, or government to be able, unfettered, to capture, store, and utilize that information.  We have not answered the question, but the series has sought to illuminate some of the issues that underlie the problem.

 

In this module we address the potential of so-called “self-help” mechanisms for addressing privacy concerns in cyberspace.  These mechanisms run the technological gamut from cryptography to cookie cutters, and from anonymizers to an industry-proposed consumer “choice” approach called “Platform for Privacy Preferences,” or P3P.  The proposals originate from technologists, consumers, or industry, all of whom profess an interest in “protecting” users’ privacy in cyberspace.

 

All the proposals examined in this module share two basic assumptions.  First, these interventions presume that present technologies of internet routing and website information capture are sufficiently invasive of users’ privacy to warrant counter-measures.  Accordingly, proponents of these technologies advance measures that impede, undermine, block, or regulate surveillance by private parties (ISPs, employers, websites) or government.

 

Some of these self-help technologies spring from a mindset of self-defense.  If websites drop a cookie, then I’ll buy “cookie cruncher” software that eliminates them.  If I worry that my employer may install a spyware program that tracks all my mouseclicks, I’ll obtain counter-surveillance software that lets me know if and when that occurs.  If I worry that my internet game console is surreptitiously sending information back to the vendor, I’ll install a firewall to prevent it.  And if I worry about the government snooping into my email, I’ll use snoop-proof email, or an anonymous re-mailer, or encryption.  It is a world of spy-vs.-spy, if you will, where the surveillance technologists compete to defeat the privacy technologists, who in turn find some way around the data collectors.  Each side competes to defend informational turf, and to forestall the turf-encroaching technology of the opposition.

 

Other technologies (such as P3P, a developing industry standard) arise from a different source.  Proponents of industry self-regulation argue that there exists some “middle ground” between total control of information by a user and abdication by the cyberuser of any control over information routinely conveyed by surfing behavior.  Industry representatives believe that for net commerce to thrive, industry as a whole must “compromise” with consumer groups by restricting the types of private information that are collected and the uses to which they are put.  Such industry groups as TRUSTe and the Network Advertising Initiative (NAI) have formulated “principles” (not rules of law) designed to restrict certain types of behavior by their members, and thereby reassure consumers that many members of industry have no desire to ride roughshod over their legitimate privacy concerns.

 

Yet even these industry groups acknowledge that some members of the online business world don’t operate within the suggested principles.  Moreover, consumer groups complain that, even for businesses that belong to industry consortiums, the sanctions for violations are often more bark than bite.  Review the industry standards and consider for yourself whether they adequately respect webusers’ privacy.

 

A second assumption underlying these technologies reflects skepticism about the role of law in mediating the conflicting demands of the various interest groups.  The proliferation of measure and countermeasure suggests that law as presently constituted and interpreted is unfit (or useless) as a framework for protecting privacy in cyberspace.  An alternative interpretation might be that the law does reflect the current balance of power between consumers and industry, employers and employees, and governments and cyberusers.  Regardless, one should ask of all these technologies whether they are the appropriate mechanism for resolving these competing tensions.

 

For example, do the “self-help” measures listed by groups such as EFF and EPIC solve users’ online privacy dilemmas?  (Of course, these measures were drafted with the idea of ameliorating a bad situation, not with “solving” the privacy dilemma.)  The assignments also identify a number of industry efforts designed to come to terms with the online privacy concerns manifested by numerous privacy groups.  Do TRUSTe and the Network Advertising Initiative offer constructive alternatives to the position of many privacy groups that no data should be collected beyond transactional data, and that data should be destroyed as soon as possible? Which, if any, are examples of technologies that effectively balance liberty and security?

 

Consider cryptography.  What problems does encryption of one’s emails solve?  Encryption might ensure secrecy, but as Jeffrey Rosen points out, secrecy is a different issue from privacy.  Most people, when they send email, don’t object to snoops on the ground that some secret has been spilled, but on the ground that their privacy has been compromised.

 

Why is it, or why should it be, necessary for the ordinary emailer to adopt counter-surveillance measures such as cryptography merely to shield her privacy?  Why should it be necessary for the ordinary websurfer to learn the wide variety of countermeasures that an anti-webprofiling person could employ to defeat the data collection techniques of online merchants?  Perhaps ordinary law, such as requirements protecting ordinary people doing ordinary websurfing from data profiling, would strike a more appropriate compromise between the interests of commercial appropriation (or governmental surveillance) on the one hand, and the privacy of surfing behavior on the other.

 

One reason why the role of law should be discussed in connection with “self-help” mechanisms concerns the “leapfrog” nature of these security measures described above. Invasive technologies and technologies that protect privacy sometimes seem to operate in tandem.  When a technology develops that seems to threaten privacy (e.g., cookies), other users develop counter-measures (e.g., cookie eaters).  When cookie eaters become pervasive, merchants start using web bugs.  Should cyberspace users defer to the technologists to develop privacy protection devices?  Or, should legislatures and courts be involved in setting privacy protection measures?

 

Further questions to pose about self-help mechanisms.

Some of the self-help mechanisms may create their own privacy problems.  Ironically, encryption, ordinarily regarded as a strong privacy-enhancing device, might actually interfere with privacy in the future.  The Encryption Assignment raises this prospect.  Does encryption itself carry the potential of creating new privacy problems?

 

This Module also raises the question of whether legislatures and courts can resolve the conflicts over centralization of data in cyberspace.   Consider, for example, the post-September 11 privacy issues.  Are you optimistic about the possibility of constructing checks and balances against the political and commercial forces that seek to centralize data after September 11?

 

There is, of course, a basic puzzle not resolved by the post-September 11 legislation against terrorism.  If I’m really a terrorist, and have something to hide, I’ll (1) use encrypted messages, (2) sent through an anonymous re-mailer, (3) originating from a public source used by many people, and (4) incorporating steganography[2].  So, who are the people whose online activities are going to be captured by governmental surveillance?  Maybe not terrorists.  See “Terror's Confounding Online Trail,” New York Times, March 28, 2002,

http://www.nytimes.com/2002/03/28/technology/circuits/28TERR.html?pagewanted=2&ei=1&en=d7b0237f9318b34d&ex=1018333699
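For readers curious about how steganography (footnote 2) works in principle, here is a minimal, purely illustrative sketch that hides a short message in the least-significant bits of a byte buffer standing in for image data.  The function names are invented for illustration; real steganographic tools operate on actual image or audio formats.

```python
# Toy least-significant-bit (LSB) steganography.  A message is smuggled
# into the lowest bit of successive "cover" bytes, altering each carrier
# byte by at most 1 -- visually imperceptible if the bytes were pixels.

def hide(cover: bytes, message: bytes) -> bytes:
    """Embed each bit of `message` (MSB first) in the low bit of successive cover bytes."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for message")
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # clear low bit, set it to the message bit
    return bytes(out)

def reveal(stego: bytes, length: int) -> bytes:
    """Recover `length` hidden bytes by reassembling the carrier's low bits."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for carrier in stego[i * 8:(i + 1) * 8]:
            byte = (byte << 1) | (carrier & 1)
        out.append(byte)
    return bytes(out)

cover = bytes(range(256))             # stand-in for innocuous pixel data
stego = hide(cover, b"meet at noon")  # 12 bytes of message need 96 cover bytes
assert reveal(stego, 12) == b"meet at noon"
```

The point of the sketch is the asymmetry it illustrates: an observer who intercepts `stego` sees data that differs from an innocent file by at most one bit per byte, so there is nothing visibly encrypted to flag for surveillance.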

 

[1] Alan Westin, Privacy and Freedom (Atheneum 1967).

 

[2] Steganography involves hiding a message inside seemingly innocuous content, for example a microdot camouflaged within a larger pattern such as a photograph.