Why the Internet is Good
Community governance that works well.

Berkman Center Working Draft

This version:
regulation-19990326.html
Previous version:
regulation-19990128.html
Latest version:
see index.
Author:
Joseph Reagle, Resident Fellow
<reagle@mit.edu>
Berkman Center for Internet and Society
Harvard Law School
* On Sabbatical from W3C/MIT
Conformance:
Valid HTML 4.0! Valid CSS!

Status of this Document

This is the fourth (very stable) public draft of this document. Comments to the author are encouraged. Some of the ideas presented in this paper are based on conversations with and comments from Lawrence Lessig and Andrew McLaughlin. I thank Daniel Dardailler for his comments on the draft. This paper uses a fairly novel format that relies heavily upon quotations, maxims, and primary source documents (as well as first-hand experience) to communicate an understanding of Internet institutions and culture. Sources for quotations not provided within this document are listed in Appendix 1: Internet Quotations; quotations that capture Internet social norms or media perceptions – as described later – are highlighted.

Some of the concepts of this paper influenced the Economist article Regulating the Internet: The Consensus Machine, which I recommend if you're interested in a concise treatment of these issues in the context of standards and domain name governance.

This paper is an independent analysis and does not necessarily express the views of the W3C, MIT, Harvard, or the Berkman Center.

Copyright © 1998 Joseph M. Reagle Jr., All Rights Reserved. This document is best viewed with a W3C Stylesheet compliant application.

Abstract

Many think the Internet is a good thing because it is unregulatable. The Internet is good, but not because it cannot be regulated. As anywhere else, policies are voiced and implemented on the Internet. The true strength of the Internet is that, as an institution, it exhibits characteristics of policy formation that appeal to one's sense of liberty. This is not solely because of maxims like "The Net interprets censorship as damage and routes around it," or "No one knows you're a dog on the Internet." Free speech and privacy are laudable characteristics of the early Internet; however, they are neither absolute nor guaranteed forevermore. In fact, mechanisms of identifying oneself and controlling content can be useful as well as invasive. Instead, what makes the Internet a "good thing" is its anarchical characteristics of policy formation – decentralization, consensus, and openness – which real world social structures have striven for, some with more success than others. I examine these characteristics in the context of popular Internet quotations (and anarchist principles) that act as the belief system of the Internet.

Outline

Thesis: Mechanisms of Internet governance have resolved most of the Internet's technical and social problems well. Not only should real world governments tread lightly upon the Net, but they might learn something from it.

  1. Introduction to Internet Regulation
  2. Governance
    1. Governance Rationale
    2. Governance Model
  3. Internet Policy by Real World Governance
  4. Native Internet Policy Formation
    1. Open Participation
    2. No Kings, but Elders?
    3. Consensus and Competitive Scaling
    4. Implementation and Enforcement
      1. Limitation of Scope
      2. Funded Mandates and Lack of Fiat
      3. Uniform Enforcement
      4. Descriptive Policy
      5. Policy Deprecation
      6. Metrics
  5. A Best Effort (Conclusion)

Appendices

  1. Internet Quotation Index
  2. Characteristics of Policy Formation
  3. Case Study: Why ICANN is Frightening
  4. Case Study: Spam
  5. Case Study: PICS

1 Introduction to Internet Regulation

"Some jerk infected the Internet with an outright lie. It shows how easy it is to do and how credulous people are." Kurt Vonnegut.

"Unlike a virus, which is encoded in DNA molecules, a meme is nothing more than a pattern of information, one that happens to have evolved a form which induces people to repeat that pattern. Typical memes include individual slogans, ideas, catch phrases, melodies, icons, inventions, and fashions. It may sound a bit sinister, this idea that people are hosts for mind-altering strings of symbols, but in fact this is what human culture is all about." Glenn Grant [Gran98].

Social norms, perceptions, and expectations regulate behavior. When one thinks of the Internet, one thinks of a decentralized, far flung, heterogeneous, and unregulatable space. However, there are strongly held social norms that regulate (affect) the behavior of Net users. Designers and users of Cyberspace have created and captured these norms in pithy maxims – it seems appropriate that memes regulate a land of ideas. I claim that the maxims discussed below act as a particular type of social norm, regulation by meme:

1. "The Net interprets censorship as damage and routes around it." John Gilmore.

2. "In Cyberspace, the First Amendment is a local ordinance." John Perry Barlow.

As pointed out by Lessig [Less98a], there are in total four things that regulate cyberspace: laws (by government sanction and force), social norms (by expectation, encouragement, or embarrassment), markets (by price and availability), and architecture (what the technology permits, favors, dissuades, or prohibits). Interestingly, the social norm captured in the quotations above seems to side with its symbiotic partner, the decentralized Internet architecture, and with the laissez-faire market in order to challenge the power of law. Social norms, markets, and architectures do not always oppose law; however, in this instance, these memes challenge the authority of the government to control speech.

Another meme, which was not intended to attack regulation by law, humorously characterizes the anonymous nature of the Net (something that governments fear) and was born of the famous New Yorker cartoon with the caption:

3. "On the Internet, Nobody Knows You’re a Dog." Peter Steiner (Cartoonist at The New Yorker.) [picture]

This Triumvirate of quotes serves as a belief system of the Net. It challenges governments' ability to regulate identity and speech. In the past two years, this set of beliefs has become so powerful as to become generalized:

"The single unifying force is what we don't want government running things." Joe Simms (ICANN counsel.)

"The private sector should lead. Governments should avoid undue restrictions on electronic commerce." Ira Magaziner, (Former point man for electronic commerce within the Clinton Administration.)

While these latter quotations are not as pithy as the Triumvirate, they have become pervasive. However, regulation by norm is unstable: it cannot stand in opposition to law, divorced from architecture and markets. If the architecture or the market moves in a different direction, norms often follow. Unfortunately, the concept of privacy in the US has been subject to this pressure: as companies collect ever more information through advances in technology, people's expectation of privacy (the social norm) declines. This is also why I believe PICS, a mechanism for labeling content for filtering, was attacked by believers in the Triumvirate, and why paper titles such as "The Architectures of Mandated Access Controls" [LR98] frighten me: even well-intended technical advances in selecting/filtering content, promoting identity, and policy analysis challenge present day social norms of unregulatability!

I believe the weakness of beneficial social norms divorced from law, architecture, or markets is why Lessig has called for us "to understand government's role not as some unnecessary appendage ... but as an institution that makes possible a certain perspective on social life.... we will only do it well when we have abandoned this indulgent anti-governmentalism." [Less98c] To restate this position in the context of this paper: even where governance and regulation by law stand in opposition to the interests of citizens, those citizens cannot afford to rely upon architecture and social norms to protect their interests forever – particularly when one does not view the market through rose-colored glasses. It follows that the right thing to do is to engage governance and law such that they also reflect the interests of the citizens.

I agree with this position to a point. We cannot rely upon three quotations and a set of RFCs (IETF technical standards) to further civil interests indefinitely. However, I believe that a cautious attitude towards real world governments is well founded and that there is more to the Internet than RFCs and quips. The Internet is not good because of the three maxims, but because of how those maxims came to be. What is good about the Internet is the very process by which one builds and selects communities, and how their policies are proposed, tested, and implemented.

In this paper I examine the Internet as an instrument of policy formation, predicated on:

"Architecture is politics." Kapor, Mitchell. (EFF)

and my contemporaries' subsequent analyses [Less98*, Reid*, Boyle, GK96]. If architecture is politics and code is law, what similarities and differences characterize the formation of Internet policy?

I first present a model of governance as a means of policy formation, deployment, and enforcement in the context of anarchist principles. (I refer to the Anarchist FAQ because of its content and because it is an exemplar of the principles I speak of.) I then cast real world attempts at Internet governance in this model. However, the focus of this paper is §4, which describes the institutional characteristics of the Internet as an instrument of governance and policy formation. I argue that the Internet is good because it supports mechanisms of discourse, consensus, and community that are often better than what we have in the real world. I argue that the advances made by the Internet are predicated on open participation and contributions; that the policies that emerge are intrinsically conservative in their scope, but rigorous and uniform in their application; that mechanisms of authority shed power rather than hoard it; and that the Internet has no gods, but leaders advanced by their technical merit rather than telegenic appeal. I conclude by recommending that Internet governance questions that do not benefit from the rigor of these characteristics should simulate them when possible. Finally, the case study appendices apply and test the arguments of this paper against real world Internet governance problems.

2 Governance

"Our identities have no bodies, so, unlike you, we cannot obtain order by physical coercion. We believe that from ethics, enlightened self-interest, and the commonwealth, our governance will emerge." John Perry Barlow (EFF).

I wish to introduce a simple model of governance for the purposes of this paper. My analysis is set in an anarchical context: I do not assume traditional governance is always a useful thing. Rather, a proposal for governance must justify itself in the context of the question, "why cannot people act for themselves and their own local communities?"

2.1 Governance Rationale

My model has two answers to the question above. Both answers address a scenario in which individual preferences (values, beliefs, and local policies) need to be combined to form a policy of broad scope. These answers apply equally to the governance of community and individual behavior:

collective choice: a single policy must be set that cannot help but affect all those within the scope of the community. Varied preference values must be aggregated such that the single value is acceptable and considered legitimate by the community.

efficiency: even when individuals can act upon their individual preferences (values), an aggregate value set by a central institution may be more efficient or rational over a longer time frame for the individuals of the community. In such cases, it is useful for a central authority to establish a single value rather than (1) requiring each individual to undertake the transaction cost of negotiation, or (2) permitting market failures and short term irrationality.

Examples of collective choice are related to problems of scarce resources and public goods. Scarce resources are those for which demand exceeds a limited supply. For instance, there may only be so much bandwidth on a given network, so mechanisms of allocating portions of the bandwidth might be needed. Public goods are characterized by being non-rivalrous (one's consumption does not interfere with another's) and non-exclusive (one cannot exclude others). For instance, on a public email list everyone has the capability to produce and consume information without interfering with the ability of others; furthermore, it is difficult to preclude "lurkers" and "flamers" from benefiting from good posts without making their own worthwhile contributions. This may explain why many moderated lists tend to be of a higher quality than unmoderated ones.

An example of regulation motivated by efficiency is online privacy. The individual cost of understanding the varied and obscure privacy practices of every Web site is so high as to lead to a market failure: users will not be able to make informed and rational choices. Consequently, one could argue that a central governance mechanism should set privacy practices, or at least standardize disclosure upon uniform principles and terms.

2.2 Governance Model

The model of policy formation focuses on the flow and stages of its creation. Individual preferences are aggregated to yield a single policy. That policy is then deployed, implemented, and enforced. Policy is traditionally defined as the method by which an institution is administered. An Internet policy is an expressed goal about the use and operation of the Internet. Note that when I speak of Internet policy I speak broadly: of informal and formal rules and expectations of conduct, behavior, and requirements over people, institutions, processes, and protocols.

One of the difficulties of speaking of policy arises from its inherent ambiguity. For instance, legislators might pass a law, the executive branch may enforce certain terms and let others lie dormant, and courts will issue rulings qualifying the law further. Is the actual statute written by the legislators the policy, or is the emergent interaction of all these entities the real policy? I use "policy" in the emergent sense, and I use "regulation" to denote the formal expression of a governing institution.

Table 1: Flow and Stages of Policy Formation

FLOW | STAGES
UPSTREAM (preferences, (representative) democracy, deliberative polling, boycotts, etc.):
expressed: values, information, and preferences are manifested by individuals within the community.
aggregated: multiple sets of preferences are combined to set a common value for the community.
DOWNSTREAM (policies, rules, regulation, etc.):
deployed: policy is then communicated to the community.
implemented: mechanisms of changing (or sustaining) behavior required by the policy are put in place.
enforced: those mechanisms are employed (by some force) to incent the deployed policy.
Centralized <--> Distributed <--> Decentralized

Furthermore, upstream and downstream flows can be achieved through centralized, distributed, or decentralized mechanisms. [GK96] Centralized mechanisms require an authoritative and exclusive set of entities (often one) to provide a service. With distributed mechanisms, services can be delegated from an authority to a lower level. For instance, the *.edu domain name authority delegates the naming of *.mit.edu to MIT. Decentralized mechanisms require no single authority for the provision of a service. For instance, a common way of making Web protocols arbitrarily extensible is to provide a simple extensibility field within the protocol. Thus, if someone later wishes to say, "the resource at the URL 'http://w3.org/DSIG' defines the digital signatures extension," no central authority is needed to permit this action. Anyone can place any valid URL within that field. This approach differs from that of some protocols, which require extensions or additions to be made at a central registry. (See Appendix 3.)
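To make this concrete, here is a minimal sketch (in Python) of such an extensibility field; the "Protocol-Extension" header name is hypothetical, and only the http://w3.org/DSIG URL is taken from the example above:

    # A sketch of decentralized extensibility: the extension is named by a
    # URI its author minted on a server she controls, so no central
    # registry need be consulted.
    def build_request(path, extension_uris):
        """Build an HTTP-style request advertising extensions by URI."""
        lines = [f"GET {path} HTTP/1.1", "Host: example.org"]
        for uri in extension_uris:
            # The URI both names the extension and points to its definition.
            lines.append(f"Protocol-Extension: {uri}")
        return "\r\n".join(lines) + "\r\n\r\n"

    print(build_request("/report", ["http://w3.org/DSIG"]))

Anyone who wishes to define a new extension simply mints a new URI; no registry transaction stands between the idea and its deployment.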

In the real world, the downstream promulgation of policies is often associated with coercive enforcement mechanisms. Governance need not be coercive, meaning it need not threaten by force of violence, imprisonment, or theft. Upstream centralization (often achieved through hierarchy) is useful in aggregating preferences and setting a common value. If all participants agree to abide by that value, no downstream coercion is needed to enforce it. Participation is incentive enough – many technical standards are motivated by this reason. Granted, individuals will not always abide by every policy, and coercive downstream mechanisms might be the only way to implement them.

Finally, one should realize that one tool of implementation can serve two policies. For instance, Web cookies were introduced as a way to permit a site to remember a user over multiple requests for Web pages. (This permits something like a shopping basket that stores selected items as the user browses.) The same tool also served an unintended but related policy: profiling users across different sites.
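The dual use is easy to see in a sketch of the mechanism (in Python; the token scheme and names are illustrative, not any particular site's practice):

    # A sketch of the cookie mechanism: the site issues an opaque token on
    # the first response; the browser returns it with later requests,
    # letting the site recognize the user across pages.
    import uuid

    baskets = {}  # server-side state keyed by the cookie's token

    def first_response():
        """First visit: issue a cookie so later requests can be linked."""
        token = uuid.uuid4().hex
        baskets[token] = []
        return {"Set-Cookie": f"session={token}"}

    def later_request(headers, item):
        """Later visit: the returned cookie identifies the same user."""
        token = headers["Cookie"].split("=", 1)[1]
        baskets[token].append(item)  # e.g., add an item to the basket

    headers = {"Cookie": first_response()["Set-Cookie"]}
    later_request(headers, "a book")
    # The same linkage, exercised by a third party whose content appears
    # on many sites, is what enables cross-site profiling.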

3 Internet Policy by Real World Governance

"The Internet is a shallow and unreliable electronic repository of dirty pictures, inaccurate rumors, bad spelling and worse grammar, inhabited largely by people with no demonstrable social skills." Chronicle of Higher Education.

"The Internet, of course, is more than a place to find pictures of people having sex with dogs." Philip Elmer-Dewitt. (Time)

In the real world, the balance achieved by modern democratic institutions is the rule of the majority while protecting the rights of minorities. The degree to which citizens of a community are satisfied with the process of governance [Table 1] relates to (1) their satisfaction with the process's ability to yield rational, global policies akin to their own; and (2) the integrity and trustworthiness of the process when the results are less than satisfactory to a minority.

The process in the real world is not perfect, but people – by and large – are satisfied with these democratic principles. In fact, mechanisms of real world democratic governance are sometimes offered as models for Internet governance. Before examining mechanisms of cyberspace policy formation, I want to briefly explore real world attempts at Internet regulation; they will serve as a useful contrast in subsequent analysis.

Generally, the scope of Internet regulation seems to fall within four categories:

  1. scarce resources and public goods: bandwidth allocation and the quality of communal spaces.
  2. efficiency: anti-fraud regulation.
  3. interoperability: open standards (IETF, W3C, etc.), open source (Linux, Apache, Mozilla, etc.), and protocol and name registration (IANA, ICANN, etc.).
  4. behavior: prohibitions on obscene speech.

It is interesting to note that the range of activities (1-3) falls within my rationale for governance: collective choice and efficiency. Behavior (4) is something governments have trouble controlling on-line. This is partly because one of the strongest rationales for traditional governance is the prevention of physical harm. Real world governments use coercive physical methods (restraint and jail) to prevent coercive physical abuses (attacks on someone's person). On the Internet, it is not possible to physically reach out and harm someone's person. If there are no swords, Boyle's sovereign has a limited reach indeed: "If the king's writ reaches only as far as the king's sword, then much of the content on the Net might be presumed to be free from the regulation of any particular sovereign." [Boyle] Consequently, governments lack one of their strongest rationales as well as their method of enforcement!

As demonstrated by Lessig [Less98b], regulation is implemented by the following:

  1. laws: cryptographic software is legally regulated as a controlled munition.
  2. markets: the common business practice of user profiling by Web sites affects user privacy.
  3. nature/code: the initial inability to control content on the Internet benefits free speech.
  4. norms: people attempt to portray the Internet as a paradigm of virtue, or as a den of obscenity.

The US Constitution is an adept instrument for constraining direct legal regulation: "Congress shall make no law ...." However, modern regulation is often indirect: it sets incentives and disincentives for others (usually the market) to implement and enforce policies more effectively than the government ever could. Reidenberg suggests that governments should shift the "focus of government action away from direct regulation and towards indirect influence"; I find this trend frightening because he makes an assumption that I am unwilling to make: "The shift can, nevertheless, still preserve strong attributes of public oversight." [Reid97, 588] The US Constitution is poorly equipped to constrain indirect regulation.

These, then, are the principal methods by which real world governments would like to regulate the Internet. Let us now turn to the methods the Internet has developed to regulate itself.

4 Native Internet Policy Formation

My goal for this paper is to examine how policy is born of Internet institutions and community. I describe ten characteristics related to the establishment of policy according to the model I explained earlier. For my examples, I shall use the creation of voluntary technical standards and the governance of on-line communities. One thing to be aware of in the context of my model as applied to the Internet is that one can be a member of many Internet communities, each with a fairly well defined scope. Two related observations about the Internet are that:

  1. the Internet permits people to voluntarily belong to selected communities.
  2. in the past, the coarseness of the real world often made us lump disparate preferences together for reasons of efficiency; today, technology permits greater discrimination and choice.

Online, you need not be bound by the same policies of commerce, content, or privacy as your real world neighbor. You can choose your own community, which may be a community of one, and choose policies to your liking. This characteristic itself is one of the greatest strengths of the Internet.

4.1 Open Participation

"Just as the strength of the Internet is chaos, so the strength of our liberty depends upon the chaos and cacophony of the unfettered speech the First Amendment protects." Judge Dalzell (CDA panel).

"And, just to state the obvious, anarchy does not mean chaos nor do anarchists seek to create chaos or disorder. Instead, we wish to create a society based upon individual freedom and voluntary co-operation. In other words, order from the bottom up, not disorder imposed from the top down by authorities."  [A.1.1 What does "anarchy" mean? Anarchist FAQ]

Broadly speaking, anyone can participate at some level in Internet communities. An individual's voice may be heard based on the appeal, quality, and/or quantity of her contributions to a mailing list, MUD, Web site, technical working group (IETF), or software community (e.g., open source). In fact, simply browsing the Web is an expression of your interests.

Elsewhere, I've labeled this cultural perception – that every individual is a potential contributor – the perception of the citizen engineer. The unwillingness to participate as such results in one being labeled a "lurker"; if your participation is actually offensive and uninformed you might be labeled "clueless" – a designation most wish to avoid. This is not to state that every inhabitant of cyberspace needs a computer science degree. Many functions, like sending emails, posting Web pages, or writing a Frequently Asked Questions (FAQ) document, are within reach of anyone able to browse the Web. The important characteristic is the desire to communicate with and contribute to a community; the citizen is a builder of the communities she inhabits. The efforts of the Free and Open Source software communities are good examples of individual participation and cooperation serving the interests of a larger community.

Interestingly, one of the dangers of the mainstreaming of the Internet is the threat to concepts of individual participation and responsibility, and to the belief that a community is what the individual participants make of it. The most disappointing thing I heard in 1997 was at a meeting in Washington, D.C., regarding children and adult content on the Internet. The US Administration asked the "Internet community" what it was going to do about the problem. AOL and Disney said, we "the Internet" will do X. The danger is that we (the people) will lose the ability to represent ourselves through our actions, and that government or corporate proxies will step forward to represent us counter to our own interests. For instance, I recall a regulator arguing that the W3C was indeed a proxy for the Internet community and that the W3C must do X, Y, and Z. He had little understanding that the W3C had no ability to rule by fiat (see §4.4.2).

These observations lead to the understanding that there have been few formal Internet institutions that real world governments could coerce, because institutions of Internet policy are themselves voluntary, decentralized, and non-coercive! There are few choke points others can grab hold of, and few mechanisms for delegating the coercive implementation of external policies. In this context, one of the few partially successful strategies is to propagate policies through market mechanisms, since governments can affect companies.

Furthermore, governance need not be defined as the exertion of power by a central authority. Governance is the act of affecting behavior, which need not be carried out by a formal or legal institution (see §1). I am on a mailing list with a rule that if you send old news clippings to the list and someone challenges you, you have to go find "new bits" (information that will be novel to the community) within a couple of days. I've broken the rule and abided by the penalty because I feel this social governance is just and well intended.

4.2 No Kings, but Elders?

"We reject kings, presidents and voting. We believe in rough consensus and running code." David Clark (MIT).

"Anarchists maintain that anarchy, the absence of rulers, is a viable form of social system and so work for the maximization of individual liberty and social equality." [A.1.1 What does "anarchy" mean? Anarchist FAQ]

"Congress will pass a law restricting public comment on the Internet to individuals who have spent a minimum of one hour actually accomplishing a specific task while on line." Andrew Grove (Intel).

The first meme above was created by Clark at a 1992 IETF meeting and is now informally known as the IETF Credo. This maxim cannot be read as stating that Internet culture has no authorities. Individuals of respect and standing play an important role in the aggregation of individual preferences and the development of consensus within the community. Internet rulers can be most likened to Elders: those who through merit, contributions, and experience became or built institutions that affect the Net. Elders are citizen engineers who built wonderful things. Examples of Elders include Tim Berners-Lee ("Father of the Web"), the late Jon Postel (IETF RFC Editor and IANA Director), Linus Torvalds (creator of Linux), and Larry Wall (creator of Perl). Amusingly, Guido van Rossum, the creator of Python, is often respectfully referred to as the BDFL (Benevolent Dictator for Life). (In Web years, it need not take much time to establish oneself as an Elder, nor to discredit oneself amongst one's peers.)

While I will not extensively compare the power of the Elders and its legitimacy to political institutions of executive authority, it is safe to say that this power differs from that of a king or president. It is predicated on merit, experience, and ability within a well-defined domain – technical in these instances. Authority over a domain derives from being a significant contributor to – or even creator of – that domain. They have accomplished a significant and worthwhile task.

However, as the Internet matures, the ability of trusted personalities to continue as the sole basis of an institution is likely to decline. Many of the registration duties formerly supervised by Postel at IANA are now being transferred to ICANN. This transference of trusted technical authority to an untrusted – but potentially open and "representative" – legal institution will not be an easy one.

As a different example, Garfinkel in The Web’s Unelected Government acknowledged Berners-Lee as a respected elder, but challenged the authority of the World Wide Web Consortium that he created. Garfinkel stated that while "Almost everyone involved with the Web has tremendous respect for Berners-Lee...." critics say, "the group has become a significant maker of public policy—and ought to start acting like one. They argue that the W3C should open its membership and meetings to broader, more democratic participation." [Garf98]

Regardless, as the Internet matures, one can expect to see more formal mechanisms of preference aggregation, non-expert authority, voting, and partisanship.

4.3 Consensus and Competitive Scaling

"We reject kings, presidents and voting. We believe in rough consensus and running code." David Clark (MIT).

"The few anarchists who reject direct democracy within free associations generally support consensus in decision making. Consensus is based upon everyone on a group agreeing to a decision before it can be put into action. Thus, it is argued, consensus stops the majority ruling the minority and is more consistent with anarchist principles." [A.2.12 Is consensus an alternative to direct democracy?. Anarchist FAQ]

Clark's maxim proposes community consensus as an alternative to kings, presidents, and formal voting. Of course, the key questions are: what does consensus mean, what are its alternatives, and what is the effect of this mechanism of deliberation on the resulting policy?

Berners-Lee has identified three different methods of reaching a conclusion within a W3C Working Group (WG). First, the WG Chair is the judge of consensus; this enables the group to move rapidly and filter the political from the technical. The Anarchist FAQ points out that such mechanisms preclude a majority from ruling the minority. To rephrase this sentiment, consensus mechanisms give a small minority veto power over the will of the majority. [Rea98 §3.3] Second, the alternative of requiring unanimity forces all avenues to be explored, allows abstention when the issue is merely a matter of taste, and ensures a united group in ongoing work. Third, formal voting means that a decision will always be made, but minorities might be abused and consensus lost. Finally, Berners-Lee argues that regardless of the method used, minority views must always be documented, because a minority within one group may represent the needs of a large constituency elsewhere. [Bern98]

However, this exposition still leaves unanswered the question of what consensus is. The W3C Process Document defines it as follows:

1.3 W3C's consensus policy

Integral to the W3C process is the notion of consensus. The W3C process requires those who are considering an issue to address all participants' views and objections and strive to resolve them. Consensus is established when substantial agreement has been reached by the participants. Substantial agreement means more than a simple majority, but not necessarily unanimity. In some circumstances, consensus is achieved when the minority no longer wishes to articulate its objections. When disagreement is strong, the opinions of the minority are recorded in appropriate documents alongside those of the majority.

Groups strive to reach consensus in order to provide a single solution acceptable to the market at large. If a group makes a decision that causes the market to fragment -- despite agreement by those participating in the decision -- the decision does not reflect a single market and therefore the group has failed to reach true consensus.  [W3C98]

The IETF Working Group Guidelines and Procedures also addresses the issue:

3.3. Session management

Working groups make decisions through a "rough consensus" process. IETF consensus does not require that all participants agree although this is, of course, preferred. In general the dominant view of the working group shall prevail. (However, it must be noted that "dominance" is not to be determined on the basis of volume or persistence, but rather a more general sense of agreement.) Consensus can be determined by balloting, humming, or any other means on which the Working Group agrees (by rough consensus, of course).

The challenge to managing working group sessions is to balance the need for open and fair consideration of the issues against the need to make forward progress. The working group, as a whole, has the final responsibility for striking this balance. The Chair has the responsibility for overseeing the process but may delegate direct process management to a formally-designated Facilitator. [RFC1603]

If one is still looking for a more concrete definition, one is unlikely to find it. The notable characteristic of consensus mechanisms as a means of preference aggregation is that they are flexible and informal, and that they work best in small communities, where differences can be easily documented, considered, and resolved. The method of reaching consensus alone does not scale well to large communities; however, when combined with the characteristics below, it often scales better than a ballot among a million people! This is because of competitive scaling: a small group of people get to produce their best work under consensus, and then compete, coordinate, cooperate with, and learn from other groups.

4.4 Implementation and Enforcement

"We reject kings, presidents and voting. We believe in rough consensus and running code." David Clark (MIT).

The test of implementation is the harshest standard any proposed policy faces. This test is central to my argument as to why the Internet is good. With the cacophony of ideas, proposals, and debates, and a lack of a central authority to cleave the good from the bad, how does one sort it all out? It sorts itself out. We need not delegate our values to a central authority – subject to tyrannical or partisan tendencies. The success of any policy is based simply on its adoption by the community.

Of course, the requirements and tests for implementation of technical policies and proposals (e.g., a new protocol) seem much more concrete and obvious than those for social policies – but they need not be. The concreteness is not predicated on the domain (technical versus social) but on a genuine need of those within the technical domain to have policies that are applied in a consistent, well founded, and uniform manner. The uneven, selective, and unfounded nature of some social policies (through law) is a bug, not a feature. It is the purpose of organizations such as the IETF and W3C to remove any such bugs from technical policies.

The descriptive and non-authoritative nature of these standards is captured by the IETF's humble use of the "Request for Comments" (RFC) designation and the W3C's "Recommendation."

The IETF states the following with respect to the advancement of its specifications:

1.2 The Internet Standards Process

Thus, a candidate specification must be implemented and tested for correct operation and interoperability by multiple independent parties and utilized in increasingly demanding environments, before it can be adopted as an Internet Standard. [RFC2026]

Interestingly, the W3C strongly resisted characterizing its Recommendations as standards for much of its early life:

6.3 The W3C Recommendation track

A Recommendation indicates that consensus has been reached within W3C about a specific topic and that a document - typically a specification - is appropriate for widespread use.... The result of the W3C Recommendation process may be submitted to a formal standards body for ratification, however this is not required or guaranteed. [W3C98]

The IETF process is interesting in that it is descriptive, not prescriptive. An IETF Standard is not a statement that all must abide by the technical specification – unlike much law and some of the standards of government-sanctioned standards bodies. Rather, it is a descriptive statement that (1) the policies specified by the document are desirable and (2) the quality is high enough to permit developers to create independent implementations.

Furthermore, the multi-level (proposed, draft, and standard) and descriptive nature of standards advancement at the IETF can help resolve legal as well as technical issues. For example, software patents can interfere with the goals of standards organizations, namely the wide deployment of accessible and open technologies. It is not uncommon for intellectual property disputes to threaten this goal. Standards organizations must choose between (1) the advancement of standards based on proprietary technology, (2) arbitrating claims and licensing terms, (3) taking it upon themselves to dispute the intellectual property claims, or (4) allowing a standard to falter. Most standards organizations wisely avoid alternatives (2) and (3). The IETF process punts on those alternatives in an interesting way. The IETF standards track is primarily concerned with the robustness of the specification and its resulting implementations. If patented parts of a specification are implemented by companies licensing a patent, so be it. If companies dispute the claims and implement the technology, so be it. The IETF need not make legal or market determinations; it simply asks if there are implementations and documents related contention; it reflects the determinations of others based on the level of implementation.

4.4.1 Limitation of Scope

"Be conservative in what you do, be liberal in what you accept from others." Jon Postel. (IETF) IETF networking protocol design maxim. [source: USC]

"The great creative work of a federal agency must be done in the first decade of its existence if it is to be done at all. After that it is likely to become a prisoner of bureaucracy...."   Justice William O. Douglas [Doug74].

Since a proposal at the IETF or W3C ultimately requires an implementation that demonstrates community interest and tests its formulation, engineers are often conservative in what they specify. This has a natural tendency to force a Working Group to be extremely rigorous in defining and enforcing the scope of its activity. In order to conserve time and effort, activities that exceed their chartered scope (with respect to resources, time, or subject matter) need to be explicitly reexamined. Consequently, there is a strong cultural resistance to the institutional temptation to expand one's purview, scope, and authority.

4.4.2 Funded Mandates and Lack of Fiat

"Good ideas are not adopted automatically. They must be driven into practice with courageous patience." Admiral Rickover.

The implementation and operational use of a technical policy demonstrate an interest in and an ability to deploy the policy at large. Political institutions are capable of developing policies which do not have the necessary support or resources for implementation. One of the interesting characteristics of Internet policy is that it is intrinsically conservative; if one wishes to make a change, one must drive it into practice with "courageous patience." It would be meaningless for any Internet authority to arbitrarily state, "we will change the IP numbering scheme tomorrow." A change to the IP numbering scheme needs to be recognized, addressed, and deployed by the community at large, with continued support for those who cannot immediately abide by the change (backwards compatibility). Consequently, a policy intrinsically requires enough community support to ensure its own deployment, implementation, and enforcement. As shown in §3, one of the strategies of real world governance is to set policies and require others to fund, implement, and enforce them.

4.4.3 Uniform Enforcement

"How many of you have broken no laws this month? That's the kind of society I want to build ... -- with physics and mathematics, not with laws...." John Gilmore (EFF).

Many Internet policies are voluntarily adopted out of an individual desire to partake of community benefits, such as camaraderie, discussion, and cooperation (interoperable protocols). One of the greatest dangers of political/legal policy implementation is the capability of selective enforcement. This means that regulations are established in such a non-rigorous way that nearly everyone is breaking a law; authorities then enforce them selectively and capriciously. This behavior as applied to minorities is common throughout history. However, can Internet policies be selectively enforced against a minority? Not easily, since Internet policies are primarily adopted voluntarily. In this model, the adoption of a standard is based on the desire to realize benefits and mitigate detriments – as perceived by the individual and the larger community. It makes little sense for an authority to state that one particular implementation is an exception to a requirement within a specification.

4.4.4 Descriptive Policy

"Your click-stream is your vote." Joseph Reagle (W3C).

Because Internet policies are often descriptive, they tend to reflect reality rather than unrealized goals. Policies which diverge from the will of Internet users are – in a sense – moot. For instance, prohibitions on certain types of speech, including prohibitions on potentially obscene materials, are largely ignored. In this context, it is interesting to note that selective enforcement does help enforce laws in light of this ineffectualness: it takes only the prosecution of a few violators of a regulation to effect a policy. However, as useful as this deterrence might be in implementing a policy, it is an exercise of centralized and coercive authority, subject to abuses as described above.

4.4.5 Policy and Institutional Deprecation

"You can't take something off the Internet - it's like taking pee out of a pool." NewsRadio.

"... I told FDR over and over again that every agency he created should be abolished in ten years. And since he might not be around to dissolve it, he should insert in the basic charter of the agency a provision for its termination. Roosevelt would always roar with delight at that suggestion, and of course never did do anything about it."   Justice Douglas [Doug74].

It is useful for a policy that is no longer in operation to be stricken from the books; this simplifies the understanding one must have of one's regulatory environment. One need not recall laws of conduct from the 19th century in order to live a 21st century life. Furthermore, those same laws cannot then be applied selectively and counter to the interests of the disfavored.

The meme about pee in a pool is an accurate comment about the Internet. For example, risqué photographs and videos gravitate to the Net and stay there forevermore – as demonstrated by the efforts of a popular radio psychologist and a television actress to have nude photos and movies removed from Web sites. However, there is an interesting counterexample: IETF Internet-Drafts.

An Internet-Draft that is published as an RFC, or that has remained unchanged in the Internet-Drafts directory for more than six months without being recommended by the IESG for publication as an RFC, is simply removed from the Internet-Drafts directory. At any time, an Internet-Draft may be replaced by a more recent version of the same specification, restarting the six-month timeout period. [RFC2026]

A Draft has no formal standing; it is merely a document provided for informal review and communication, and consequently should not "be referenced by any paper, [or] report ... [and] are subject to change or removal at any time." [RFC2026] Thus, if one types the following URI into one's browser six months from now, the draft will no longer be there.

"http://www.ietf.org/internet-drafts/draft-ietf-dnssec-ddi-06.txt"

This philosophy is counter to W3C policy – developed after the IETF's – as expressed by Tim Berners-Lee's Sameness Axiom for URIs: "a URI will repeatably refer to 'the same' thing." The W3C policy is that anything published by the Consortium must persist. The deprecation or obsolescence of a document is achieved not by removing the document, but by providing a link that is dynamically changed to point to the latest version of the document, or by metadata, an icon, or a background color indicating that the document's status has changed (e.g., historical). And of course, one can always find old Internet-Draft repositories and mirrors elsewhere, just not at the original IETF site.

Regardless, my point is that the technical policy mechanisms of the Internet have formal deprecation mechanisms. An IETF RFC reference includes pointers to newer versions or documents. A newer version of a specification often declares how it differs from past versions and which elements of previous versions have been deprecated or obsoleted (e.g., HTML 4.0). Technical policies do expire, even if historical copies continue to persist forever.

However, even expired policies sometimes linger. Like a W3C document, an application built to a specification should always continue working the way it once did. Fortunately, technical policies are often purposefully designed to be backwards compatible (newer applications built to the latest policy can work with previous applications) and forwards compatible (older applications can work with newer versions, meaning they will easily migrate or be able to safely ignore new additions/extensions). If only we had a similar mechanism for law! What one can say of the Internet is that, because its very purpose is communication, it is relatively straightforward to understand which technical policies and protocols one is acting under.
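The forwards-compatible half of this design reduces to a simple rule – ignore what you do not understand – as in this minimal sketch (in Python; the field names are hypothetical):

    # An "older" parser that knows only three fields. It keeps working
    # against newer records by ignoring, rather than rejecting, additions.
    KNOWN_FIELDS = {"version", "author", "title"}

    def parse_record(record):
        """Keep the fields this version understands; skip the rest."""
        unknown = set(record) - KNOWN_FIELDS
        if unknown:
            # Fields added by a newer version of the specification.
            print("ignoring unknown fields:", sorted(unknown))
        return {k: v for k, v in record.items() if k in KNOWN_FIELDS}

    # A record produced under a newer version of the specification:
    print(parse_record({"version": 2, "title": "a draft", "signature": "..."}))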

4.4.6 Metrics

The scope and requirements of a technical undertaking are often well defined, and the criteria of success for an effort are sometimes explicitly specified. Regardless of whether this formal step is taken, once a policy is implemented it is relatively easy to determine whether a technical policy has been successful. For instance, version 1.1 of the HTTP protocol for exchanging Web information was intended to be more efficient, and there are readily available metrics for determining efficiency. [NGBLL97] One does not need a panel of adversarial experts arguing before a court or camera to demonstrate whether the policy is successful: it is plainly evident.
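One such readily available metric is simply elapsed time. A rough sketch (in Python; the host and paths are placeholders, and this illustrates the idea rather than the methodology of [NGBLL97]) comparing a persistent HTTP/1.1 connection against a fresh connection per request:

    # Time the same requests over one reused connection and over a new
    # connection per request; HTTP/1.1 persistent connections should win.
    import http.client
    import time

    HOST = "www.example.org"           # placeholder host
    PATHS = ["/", "/a.css", "/b.png"]  # placeholder resources

    def fetch(reuse):
        start = time.monotonic()
        conn = http.client.HTTPConnection(HOST) if reuse else None
        for path in PATHS:
            if not reuse:
                conn = http.client.HTTPConnection(HOST)
            conn.request("GET", path)
            conn.getresponse().read()  # drain so the connection can be reused
            if not reuse:
                conn.close()
        if reuse:
            conn.close()
        return time.monotonic() - start

    print(f"persistent: {fetch(True):.2f}s, per-request: {fetch(False):.2f}s")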

5 A Best Effort (Conclusion)

"Best effort delivery is also known as highly reliable delivery. It is somewhat unique that the qualifying adjective highly weakens the definition of reliable in this context." [RFC1301]

It is foolish to assume that all the world's political problems could be solved Internet style. It is even disingenuous to describe the concepts discussed in this paper as "Internet style." These concepts were discussed in the context of legal and anarchist thought before the Internet existed [see Appendix 2]. What is exciting is that in no place are these principles as integral to a mass culture as on the Internet. This is because of the nature of those who developed and used the early Internet, but also because the Internet is explicitly a tool for communication and community, free of physical coercion. Many of the characteristics I describe derive from the accessibility and freedom of information and speech, as well as from the transparency the network inherently supports.

If the (technical) robustness and rigor described in this paper are not inherent to the domain of an Internet policy (political, legal), they should be simulated within the specification of the policy itself or by the institutions that create such policies. For instance:

  1. Policies should be adopted on the basis of technical merit; policies should not discriminate on the basis of expressive content.
  2. Consensus positions and recommendations should be accompanied by minority opinions and dissenting views, if any. The consensus position or recommendation should address and respond to minority concerns.
  3. Activities and policies should be rigorous in defining and enforcing the scope of their activity. Where appropriate, sunset clauses, expiration dates, and expectations regarding the revisiting of a policy or activity should be expressly stated.
  4. Criteria of success should be specified within a proposed policy and used as a basis for future criticism and improvement.
  5. Proposed policies must be shown to be in the best interests of the Internet community and should demonstrate strong evidence that such policies can be implemented. Proposed policies should be tested on a smaller scale. The implementation and operational use of a technical policy demonstrates an interest and ability to deploy the policy at large.
  6. Policies must be applied in a consistent, well founded, and uniform manner. Policies should be designed so as to minimize the risk of selective enforcement or abuse.
  7. Institutional processes should adhere to the principles of openness, transparency, decentralization, bottom-up coordination and constructive competition among small groups and communities.

I hope these principles will guide the new generation of Internet political institutions.

Finally, I've been somewhat unfair in my treatment of these characteristics, painting a very rosy picture of the Internet. I only briefly mentioned the detriments of the consensus mechanism and its susceptibility to "minority veto" and to organizations that can out-spend, out-propose, or out-code competitors. I hardly touched on the legitimacy of IETF old boy networks in the context of accountability or responsiveness to users. I waved my hands about how the uniform enforcement of technical policies seems guaranteed and ubiquitous when in fact market battles are fought over the selective support of technical standards to garner market lead and lock-in. There are exceptions to every rule. However, the mechanisms of Internet governance I describe in this paper have served the Internet well. In my eyes, the humble best efforts of my fellow citizen engineers have served me better than any politician of the real world.

Bibliography

[Bern98] Berners-Lee. Web Future. (Talk before the W3C Advisory Committee).   http://www.w3.org/Member/1998/11/Talks/tbl-2/

[Bern96] Berners-Lee. Axioms of Web architecture: URIs (19 Dec 96)

[Boyle] Boyle. Foucault In Cyberspace:Surveillance, Sovereignty, and Hard-Wired Censors, 66 U. Cin. L. Rev. 177 (1997)

[Doug74] Douglas. Go East, Young Man : The Early Years; The Autobiography of William O. Douglas. Random House, New York, 1974, at p. 297.

[EFMN98] Elkin, Flood, McKay, and Neal (eds.) An Anarchist FAQ. Version 7.4 - 17/12/98.

[FC96] Frank and Cook. The Winner-Take-All Society : Why the Few at the Top Get So Much More Than the Rest of Us. 1996.

[Garf98] Garfinkel. The Web’s Unelected Government. Technology Review. November/December 1998

[GK96] Gillett and Kapor. The Self-governing Internet: Coordination by Design. Coordination and Administration of the Internet. Workshop at Kennedy School of Government, Harvard University

[Gran98] Grant. Memes: Introduction. alt.memetics Resource Page.

[LR98] Lessig and Resnick. The Architectures of Mandated Access Controls.

[Less98a] Lessig. The Laws of Cyberspace.

[Less98b] Lessig. What Things Regulate Speech

[Less98c] Lessig. Governance and the DNS Process.

[Less98d] Lessig. The Spam Wars. The Industry Standard. December 31, 1998   

[NGBLL97] Nielsen, Gettys, Baird-Smith, Prud'hommeaux, Lie, and Lilley. Network Performance Effects of HTTP/1.1, CSS1, and PNG. W3C NOTE 24-June 1997.

[Post97] Post. Governing Cyberspace. Wayne Law Review, Fall 1997.

[Rea98] Reagle. Eskimo Snow and Scottish Rain: Legal Considerations of Schema Design. Berkman Center Working Draft. W3C Note 10-December-1999.

[RW98] Reagle and Weitzner. Statement on the Intent and Use of PICS: Using PICS Well. W3C NOTE 01-June-1998.

[Reid97] Reidenberg. Governing Networks and Rule-Making in Cyberspace, 45 EMORY L. J. 911 (1996) reprinted in  BORDERS IN CYBERSPACE, Brian Kahin and Charles Nesson, eds. (MIT Press: 1997)

[Reid98] Reidenberg. Lex Informatica: The Formulation of Information Policy Rules through Technology, 76 TEXAS L. REV. 553 (1998)

[Resn98] PICS and Intellectual Freedom FAQ. Paul Resnick. W3C Web site.

[RFC1301] Armstrong, Freier and Marzullo. RFC1301 -- Multicast Transport Protocol. February 1992.

[RFC1603]  Huizer and Crocker. RFC1603  -- IETF Working Group Guidelines and Procedures. March 1994. (Informational).

[RFC2026] Bradner. RFC2026 -- The Internet Standards Process -- Revision 3. October 1996. (Best Common Practice #9)

[W3C98] Jacobs (ed.). World Wide Web Consortium Process Document. http://www.w3.org/Consortium/Process/Process-19981112


Appendix 1: Internet Quotations

See the Internet Quotation List.

Appendix 2: Characteristics of Policy Formation

Internet Characteristic | Benefit | Detriment | Real world or legal complement
open participation | marketplace of ideas | high noise-to-signal ratio | independent publishing, zines
rough consensus | cohesion | minority power | town hall meetings
elders; no kings, presidents, nor voting | meritocracy and expert authority | non-representative and no direct accountability | -
running code; rigorous test of implementation | promotion of good ideas; marketplace of ideas | n/a | common law; UCC standardizes upon tested common law
lack of coercion | all policy is legitimate; lack of ability to abuse physical power | requires cooperation; can't enforce laws upon minorities (including criminals) | -
limitation of scope | constrains expansion of power; promotes realization of goals | n/a | -
funded mandates and lack of fiat | change requires work, which demonstrates support and formulation | n/a | -
uniform enforcement | constrains abuse of power; promotes realization of goals | makes it difficult to selectively enforce crimes which are difficult to enforce uniformly (deterrence) | -
veridical policy | inherently democratic; all policy is legitimately based on user preferences | can't enforce laws upon minorities (including criminals) | -
policy deprecation | constrains abuse of power and knowledge required; forward/backward compatibility | can be inefficient if issues are formally churned without need for reconsideration | legal sunset clauses
cooperative and competitive scaling | marketplace of ideas (MoI) | subject to dangerous externalities where someone can get the "jump" on everyone else | common law and federalism
decentralized; lack of hierarchical authority | (MoI); no chokepoint for external regulation | disorganized, few guarantees | federalism
documenting dissenting opinion | (MoI); easy to revisit or have a third party examine past decisions | n/a (might misrepresent the state of consensus if a minority is vehemently loud) | -

Appendix 3: Case Study: Why ICANN is Frightening

A number of critical Internet resources are managed centrally. These resources include the management and assignment of domain names, protocol parameters, and IP addresses. When centralization is required, the Internet approach is to distribute as much of the management as possible to lower levels. For instance, IP numbers in the range of 18.*.*.* are controlled by MIT. This permits information to be routed to computers on MIT's network. Furthermore, human friendly names (domain names) are often associated with IP numbers. Again, MIT would assign the address lcs.mit.edu to the Lab for Computer Science (LCS). LCS can then assign other names prefixed to the LCS domain, such as supertech.lcs.mit.edu. While this method is fairly robust and scales well, a root authority is required to manage the initial disposition of top-level resources. For instance, someone needed to say that all of 18.*.*.* should go to MIT, or that there is an *.edu domain, which then delegates *.mit.edu to MIT.
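This delegation chain is visible from any networked machine; an illustrative lookup using Python's standard library (the names were accurate as of this writing):

    # Resolve names whose answers flow through the delegation chain: the
    # *.edu authority delegates mit.edu to MIT, which in turn manages
    # names such as lcs.mit.edu itself.
    import socket

    for name in ["mit.edu", "lcs.mit.edu"]:
        try:
            print(name, "->", socket.gethostbyname(name))
        except socket.gaierror:
            print(name, "-> no longer resolves")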

Consequently, the Internet is faced with a need for governance that I characterized as collective choice in §2. A single policy must be set that cannot help but affect all those within the scope of the community. This authority was delegated from the US government (under the National Science Foundation and Commerce Department) to two organizations. The InterNIC, run by Network Solutions (a private company), allocated domain names under the COM, NET, ORG, EDU, GOV, and MIL domains. The Internet Assigned Numbers Authority (IANA), a pseudo-organization directed by the late Internet elder Jon Postel at the USC Information Sciences Institute, assigned IP and protocol numbers.

Presently, these services are being transferred to a new organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The purpose of this move is to divorce the funding and maintenance of these services from the US government. The move has been very contentious – the manner of the transition more so than the transition itself. I will not rehearse the contentions (see the related news articles at Wired or [Less98c]), but I do wish to examine how this new organization might be described in the analytical framework presented by this paper.

Internet Characteristic | ICANN
open participation | Unclear, but not likely.
rough consensus | There will be formal voting by the Board. Supporting organizations may adopt consensus mechanisms if they choose.
elders; no kings, presidents, nor voting | There is a corporate board.
running code; rigorous test of implementation | Untested ideas and policies might be promulgated.
lack of coercion | ICANN could be strongly tempted to base resource allocation policies on non-technical criteria (e.g., legal, political, or financial purposes).
limitation of scope | There is little evidence that ICANN will not accrue additional powers and broaden its scope during operation. It has little incentive to refrain from doing so.
funded mandates and lack of fiat | ICANN's allocation policies will not be checked by the need to expend resources and test ideas to implement those policies.
uniform enforcement | ICANN could easily enforce name and number allocation policies selectively.
veridical policy | Not the case.
policy deprecation | Again, there is little evidence that ICANN will not accrue additional powers and broaden its scope during operation. It has little incentive to refrain from doing so.
cooperative and competitive scaling | ICANN could support this, since it will be the root of a distributed policy mechanism.
decentralization; lack of hierarchical authority | ICANN will serve as an extremely tempting target to real world governments attempting to apply their own policies to the Internet.
documenting dissenting opinion | Unclear, but presently deliberations of the Board will be kept secret, though votes are public.

The point of this analysis is not to throw stones at ICANN. Rather, it is to point out that as we move towards the transference of trusted technical authority to an untrusted – but potentially "representative" – legal institution (see §3.2), we must be cognizant of the dangers. We should promote technical solutions that need not rely upon central registries; when this is not possible, the recommendations in the conclusion of this paper would serve ICANN well.

Appendix 4: Case Study: Spam and Network Effects

In order to test my argument about the "goodness" of Internet governance, one might ask: where has this model of governance failed? One might posit Internet spam (unsolicited commercial emails that clutter one's inbox) as a problem that Internet governance has not solved well. Interestingly, the reason it has not been solved to the satisfaction of most Internet users is explained by my model. My exposition has so far assumed that the characteristics of policy formation (consensus, uniform implementation, etc.) are distributed throughout the upstream and downstream flows of policy formation. Most importantly, I assume that the implementation of a policy requires community consensus and that an action by a minority cannot have a disproportionate effect on the global outcome. If my assumptions hold, I can argue that we may avoid mechanisms of "democratic" governance and rely upon the better mechanisms of anarchical policy formation. However, if my assumptions do not hold, one could argue we should look to democratic mechanisms that set a single policy for the whole community.

In the case of spam, only a very small community (the spammers) deploys the policy of sending spam. Oddly enough, most of the Internet accidentally implements that policy. Spammers often use open relays (other people's mail servers) to quickly propagate their messages and hide their own identity. As I described earlier, the tool that implements the policy of cooperation (relaying other people's email without question) can also be used to effect a different policy: sending spam!
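To make the mechanism concrete: an open relay is simply a mail server that answers "yes" when asked to carry mail from a stranger to a stranger. The following minimal sketch, assuming Python and purely hypothetical host names, probes for that behavior by asking a server to accept a recipient outside its own domain (a 250 reply to RCPT is the signature of an open relay):

    import smtplib

    # All host names below are hypothetical, for illustration only.
    def is_open_relay(host):
        """Return True if `host` agrees to relay mail for an outside party."""
        server = smtplib.SMTP(host, 25, timeout=30)
        try:
            server.helo("tester.example.net")
            server.mail("probe@tester.example.net")
            code, _ = server.rcpt("someone@elsewhere.example.org")
            return code == 250  # 250: recipient accepted; the server relays
        finally:
            server.quit()

    print(is_open_relay("mail.example.com"))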

The Open Relay Blocking System (ORBS) is a response to open relays. In effect, this community states: "if your policy is to maintain an open relay, we will bounce (not deliver) any email you send to our users." As described in [Less98d], the ORBS community placed MIT on just such a list for maintaining an open relay. Subsequently, Hewlett-Packard, a member of ORBS, began bouncing MIT email, and MIT threatened to block Hewlett-Packard in turn. This dispute between MIT and HP presents the same type of problem as spam itself: a small community can have a disproportionate impact on global policy. The intentional bouncing of email by even one institution is unacceptable to any end user who needs to reach a person at that institution. Consequently, this policy has a very strong network effect: those who are likely to be bounced face a significant penalty for not implementing the policy of turning off open relays. This network effect is the negative image of email's useful network effect – the more folks connected, the better.
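Blocking lists of this kind are commonly published over DNS itself: a cooperating mail server reverses the octets of a connecting peer's IP address and looks the result up under the list's zone, bouncing the mail if any record is returned. A minimal sketch, assuming Python and a hypothetical list zone blocklist.example.net:

    import socket

    def is_listed(ip, zone="blocklist.example.net"):  # hypothetical zone
        """Check an IPv4 address against a DNS-published blocking list."""
        reversed_octets = ".".join(reversed(ip.split(".")))
        try:
            socket.gethostbyname(reversed_octets + "." + zone)
            return True   # any answer at all means "listed"
        except socket.gaierror:
            return False  # no record: the address is not on the list

    print(is_listed("18.26.0.36"))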

One of my rationales for governance is that of collective choice, where a single policy is set that cannot help but affect all those within the scope of the community. However, in the case of spam, we have neither a group of people collectively making a choice, nor multiple communities competing, cooperating, or learning among their different and corralled policy proposals. Like many policy problems, this one can be characterized as a winner-take-all scenario [FC96], in which multiple interests compete for a single prize. For instance, hostile nations often escalate their arms development for the single prize of world domination. Or, competing law students interviewing for a single opening may escalate the swankiness of their business suits in order to gain a relative (though somewhat arbitrary) advantage over their peers. Such competition can be economically inefficient or even damaging to the community.

In our spam example, we are presented with minorities that can deploy a policy with a winner-take-all effect on implementation. Each competitor fights to set the global policy: one could imagine Hewlett-Packard and its allies competing against MIT and its allies, each vying to set the global relay policy. Fortunately, both parties stopped their dispute when ORBS was taken down for violating its host's network usage policy. Regardless, this type of problem is a reality of policy formation, and I wish to offer the following recommendations in the context of this paper.

  1. Internalize externalities. Ensure that the cost of an action by a party is borne by that party. MIT's threat to bounce Hewlett-Packard reflects the cost of Hewlett-Packard's policy back upon itself.
  2. Divorce the ability of one mechanism to implement different policies. For instance, one could try to configure open relays to discriminate between acceptable uses and non-acceptable uses, such as spam.
  3. Corral the effect of a policy to only those who contributed to making the policy (upstream flow); or,
  4. Open up the making of the policy to all of those affected.

Appendix 5: Case Study: The Test of PICS

Point 4 of the preceding case study, "Open up the making of the policy to all of those affected," returns me to where I began in this paper! It is reminiscent of Lessig's call for Internet governance [Less98c], and of the argument that W3C should be more "open" [Garf98]. Both of these authors have also been critics of the W3C's Platform for Internet Content Selection (PICS). PICS was intended [RW98] to empower users to control the type of content they see, as an alternative to government censorship. A feature of PICS is that there is not a single service for naming or describing content: anyone can create a system for describing Web content and label it. An appealing solution in the context of anarchist principles! However, critics fear that this mechanism, when deployed, would be used primarily by governments. Like the mail forwarding of the case study above, this single mechanism could serve two policies, and its implementation would then have an effect on a community greater than the one that created it. Such concerns are valid. (PICS-specific concerns are partially addressed by the W3C PICS and Intellectual Freedom FAQ [Resn98].) However, PICS might be a good example of governance that worked. Presently, the specification contains ambiguities that lead to unpredictable behavior in implementations, which makes labeling content difficult and error-prone. (See the Censoring the Internet with PICS email thread for a description of the problem.) The present combination of technical, economic, and political difficulties has prevented the widespread adoption of PICS. It has yet to pass the larger implementation test of an Internet policy.
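For the flavor of the mechanism: a PICS label is a short structured statement attached to content, for example through an HTML META tag. The sketch below is only illustrative – the rating service URL and its single "violence" category are hypothetical, and the PICS specification should be consulted for the normative label syntax:

    # A rough illustration of a PICS-1.1 label embedded in an HTML page.
    # The service URL and the "violence" category are hypothetical.
    label = ('(PICS-1.1 "http://ratings.example.org/v1.0" '
             'labels for "http://www.example.com/page.html" '
             'ratings (violence 0))')

    meta_tag = "<META http-equiv=\"PICS-Label\" content='%s'>" % label
    print(meta_tag)

Because anyone may operate such a rating service, the label vocabulary itself is not dictated by any central authority – which is precisely the property both praised and feared above.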