Privacy Manifesto


Preamble

The purpose of this manifesto is to encourage and guide development of tools that enhance and extend people's ability to protect and project their privacy in the online world. We have used such tools in the natural world for as long as we've had the privacy technologies called clothing and shelter, and social norms for signaling and respecting personal intentions around privacy. We are not yet close to having those in the online world, which most of us have inhabited for less than two decades and which will likely be with us for centuries—if not millennia—to come.

In the absence of those technologies and norms, it is easy for those with power to violate our personal privacy, and to rationalize those violations as well. In fact it is so easy to do both that violating privacy has become worse than rampant: it is normalized by a lucrative and easily rationalized business model that Shoshana Zuboff calls surveillance capitalism and a purpose that Brett Frischmann and Evan Selinger call "re-engineering humanity." In some countries, the extent of government surveillance comports with Orwell's worst fears. In other countries we are right to fear the same.

In these early days, when personal privacy tech and norms are still at an embryonic stage, there are two pro forma ways for potential violators to claim they respect personal privacy. One is with a company privacy policy, which became common starting in the 1970s. While these are required of companies doing business in our digital age, they are easy for a company to ignore or to change at any time and without notice. The other is with a one-sided statement of terms, proffered most commonly by "notice and consent" banners on websites and detailed in thousands of words of legalese. These terms are what Friedrich Kessler, in a landmark 1943 paper, called "contracts of adhesion." In the digital world, these kinds of contracts are typically proffered in take-it-or-leave-it ways, with no means for an individual to record their agreement or to audit the company's compliance with it.

These policies, terms, and conditions are also as numerous and varied as the websites we visit and the apps and services we use. To read and consider all the policies and terms we encounter online would be more than a full-time job, which is why nearly all of us accept them as a matter of course without reading them. (A 2012 Carnegie Mellon study says it would take 76 days per year just to read the privacy policies of the world's top 75 websites. Since terms tend to be at least as long as privacy policies, the actual time required to read both would be around double that.)

It is natural under these conditions for privacy advocates to look to government for new laws and regulations to relieve us from personal privacy violations by others. Necessary though law is (laws maintain civilization), there are two separate problems with looking to law alone to solve our privacy problems:

1. New laws risk putting the regulatory cart in front of the horse and reins of tech and norms. This is what we already have with the GDPR, which presumes maximum agency for corporate "data controllers" and "data processors" and little agency for the "natural persons" the regulation calls "data subjects." All privacy for the individual is by grace of companies and their policies and terms, to which individuals must agree, separately, one at a time, for all of them. And, while the CCPA in California gives people the right to get back the data horses that have been taken from personal barns, it doesn't support the development of ways for people to protect what's in their barns in the first place.

2. Existing law may already be enough, even while there is little tech to lock our personal data barns. As Steve Wilson says in "The last thing privacy needs is new laws," "existing privacy law can substantially deal with Big Data."

And none of this addresses the ability of governments everywhere to covertly harvest and process personal information.

The simple fact is that we need new tools — privacy tech, and standards supporting that tech — on our side. There is no other way to create privacy in the online world that begins to resemble what we have long enjoyed in the offline world.

Making the status quo less bad risks making it worse. Hence this manifesto.

Manifesto

  1. The digital world, connected by the Internet, is inhabited by human beings and not just by machines, governments and corporate entities. All of us have a right to be there, and to enjoy the same freedoms and forms of respect that we do in the physical world.
  2. Privacy is personal. Technically speaking, it's a root right. If you have a right to exist, you have a right to privacy.
  3. Privacy is also social and political. Shoshana Zuboff in The New York Times: "The lesson is that privacy is public — it is a collective good that is logically and morally inseparable from the values of human autonomy and self-determination upon which privacy depends and without which a democratic society is unimaginable." But that doesn't mean privacy is not personal.
  4. We each experience privacy as a state of possession, as personal as our body's organs, though far more vulnerable.
  5. To experience privacy is to also experience personal sovereignty, independence and agency.
  6. To control one's privacy is to selectively conceal, disclose or project information about one's self outward into the world — and to obtain respect from others for that.
  7. Privacy is no less a right than those to life, liberty and the pursuit of happiness. (The right to privacy is also recognized in Article 12 of the United Nations' Universal Declaration of Human Rights.)
  8. Our agency — the ability to act with effect in the world — depends on maintaining and managing our privacy. (We operate at full agency, for example, when we tie our shoes, ride a bike, write something down, drive a car, or participate in a conversation.)
  9. Privacy starts with what others don't know about us. To strangers we present first as human, but also as anonymous. (To be anonymous is to be nameless, not to be invisible.)
  10. Through anonymity, personal privacy is a public grace. It's why we don't wear a name badge when we walk down a city street. It helps all of us not to know private information about all the other people we each see or meet.
  11. Not knowing much about most other people is an economic and political grace as well as a social one.
  12. To get to know another person is to experience selective control of personal privacy by both parties. Friendship and intimacy are earned through selective and trusting disclosures of personal information that is essentially private.
  13. All social, economic and political graces arising from personal privacy require personal independence, sovereignty and agency over what others can learn about us, even though our control is far short of absolute.
  14. Having control over what we selectively disclose to others, in ways we can generally trust, allows social norms to grow around how personal privacy works. Though these norms differ by culture, they exist in all cultures.
  15. Like nature, the Internet came without privacy.
  16. The first privacy technologies we invented in the natural world were clothing and shelter. We did this when we first became human, dozens of millennia ago.
  17. The Internet we have today is barely more than two decades old, and we still lack the online equivalents of clothing and shelter. This is why most of us are still as naked and exposed on the Internet as we were in Eden. It's also why it has been easy for businesses and governments to exploit our exposed selves.
  18. It is now the norm — even in the presence of laws clearly forbidding it — for nearly every commercial website we visit to plant tracking beacons in our devices, so our lives can be examined and exploited by companies and governments that extract personal data and manipulate our lives for their purposes. This diminishes our agency and is an affront to our personal dignity.
  19. These problems must be solved with personal privacy tech and standards to support that tech. Privacy tech will create private spaces for ourselves online, and ways for signaling to others what is acceptable, and what is not, in respect to our privacy.
  20. Our privacy tech should support, among other activities, the ability to proffer terms to which others (be they individuals or organizations) can agree. This is simple freedom of contract, which has operated in human society offline for thousands of years, but is not yet normative in the online world. (A minimal sketch of what such a proffered agreement might look like appears after this list.)
  21. Government regulations and corporate privacy policies at most can encourage personal privacy tech. They can't invent or provide it.
  22. Standards are essential for personal privacy tech to operate at scale in the online world. This shouldn't be hard. The common protocols of the Net and the Web (TCP/IP, HTTP/S, IRC, FTP, et al.) give us a good base to build on, and good models for how scale can work for each of us.
  23. New laws and regulations for protecting personal privacy online (e.g. the GDPR and ePrivacy in the E.U. and A.B. 375—CCPA—in California) are being instituted in the absence of the personal privacy tech and norms we should have had first. Thus they put the regulatory cart in front of the technology horse. Worse, they all tend to rely on "notice and consent," a norm by which a site or service is always the first party, issuing a "notice" to which the individual must "consent." This requires that individuals must always be second parties to all agreements involving consent. Besides locking individuals into countless subordinate roles, each controlled by others, this offends the peer-to-peer nature of the Internet itself.
  24. Worse, because these laws and regulations are being developed in the absence of personal privacy tech and norms, they assume that human beings are mere "data subjects" (GDPR) or "consumers" (CCPA) with no personal agency beyond "choices" provided by others.
  25. At this early stage in the evolution of life online, the only records we have of our consent to notices online are cookies given by sites and their third parties to our browsers. These are assembled within our browsers into long DNA chains of personal information presented to every subsequent site we visit. While a consent cookie's main privacy purpose for a given site is to say whether or not the individual has consented to the site's notice, far more information from other cookies in that DNA chain is also being leaked to parties unknown to the individual (and in many cases to the site as well). This happens everywhere we go online, as a matter of course. As long as this system remains the status quo, we have no true personal privacy on the Web. (A small illustrative sketch of this record-by-cookie system also appears after this list.)
  26. Even if today's online privacy laws are enforced, none will give us privacy any more than laws against indecent exposure will give us clothing. We need privacy tech of our own.
  27. Technologies and services that address corporate demand for claiming "GDPR compliance" (mostly by obtaining "consents" through "this site uses cookies" notices) serve only to mask the site's intent to continue tracking people for marketing purposes. As of this writing (December 2021), applied "notice and consent" at most commercial websites facilitates obedience to the letter of the GDPR while violating its spirit.
  28. The good guidance of "Privacy by Design" for organizations also needs to apply to privacy tech for individuals.
  29. The United States Federal Trade Commission's fair information practice principles (FIPPs), which date back to this list of rights from a July 1973 U.S. Government report, also provide good guidance, as does EPIC.org:
      • There must be no personal data record-keeping systems whose very existence is secret.
      • There must be a way for a person to find out what information about the person is in a record and how it is used.
      • There must be a way for a person to prevent information about the person that was obtained for one purpose from being used or made available for other purposes without the person's consent.
      • There must be a way for a person to correct or amend a record of identifiable information about the person.
      • Any organization creating, maintaining, using, or disseminating records of identifiable personal data must assure the reliability of the data for their intended use and must take precautions to prevent misuses of the data.
      To those we add:
  30. There must be ways for individuals to secure and exercise all those rights, using standard and well-understood tools of their own.
  31. We do have some early forms of tech to work with, such as crypto, onion routing, PKI and VPNs. But those are too few, and (with the exception of VPNs) too hard for non-experts to use. None yet give us what clothing and shelter afford in the natural world: lots of ways, easily available to everyone, for concealing and exposing private spaces selectively, signaling how we want those private spaces respected, making clear what information we would like others to keep secret or to reveal (and to whom) — and for keeping track of agreements about all those things.
  32. The challenge then, for all tech developers, is to create personal privacy technologies, and means for establishing and enforcing norms based on those technologies.
  33. Those technologies need to be, at their base, free and open.
  34. When Archimedes said, "Give me a place to stand, and I can move the earth," he was talking about a place that did not exist in his time, but does in ours. That place is the Internet. TCP/IP, the free and open protocol at the Internet's base, is a fulcrum sturdy enough to make everyone an Archimedes, given the right levers. Our mission is to provide those levers.
  35. None of those levers can be imagined without standing on the side of the individual, and without personal privacy as the first consideration.
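
As a companion to item 20, here is a minimal sketch, in TypeScript, of what an individual-proffered terms record and the resulting agreement record might look like. Every name and field below is a hypothetical assumption for illustration, not an existing standard or anyone's shipping API.

```typescript
// Hypothetical sketch: machine-readable terms an individual proffers to a site,
// and the agreement record both parties keep so either side can audit it later.

interface TermsProffer {
  profferId: string;                       // unique id for this proffer
  proffererRole: "first-party-individual"; // the individual, not the site, is first party
  termsUrl: string;                        // where the human- and machine-readable terms live
  allowedUses: string[];                   // what the site may do with personal data
  prohibitedUses: string[];                // what the site may not do
  expires: string;                         // ISO 8601 date after which the proffer lapses
}

interface AgreementRecord {
  profferId: string;  // which proffer was accepted
  acceptedBy: string; // domain of the accepting site
  acceptedAt: string; // ISO 8601 timestamp of acceptance
}

// If the site's agent accepts the proffer, both sides store the same record.
function accept(proffer: TermsProffer, siteDomain: string): AgreementRecord {
  return {
    profferId: proffer.profferId,
    acceptedBy: siteDomain,
    acceptedAt: new Date().toISOString(),
  };
}

const myTerms: TermsProffer = {
  profferId: "2021-12-06-0001",
  proffererRole: "first-party-individual",
  termsUrl: "https://example.org/my-terms/v1",
  allowedUses: ["session-management"],
  prohibitedUses: ["third-party-tracking", "data-resale"],
  expires: "2022-12-06",
};

console.log(accept(myTerms, "example.com"));
```

The point of the sketch is the reversal of roles: the individual issues the terms and keeps a copy of the agreement, rather than clicking through a site's notice with no record of their own.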
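To make item 25 concrete, here is a small, self-contained TypeScript sketch that parses a Cookie header of the kind a browser sends with every request. The cookie names and values are invented for illustration; they do not refer to any particular consent-management platform.

```typescript
// Illustrative sketch: the only "record of consent" most sites keep is one cookie
// riding along with many other identifying cookies on every request.

function parseCookieHeader(header: string): Map<string, string> {
  const jar = new Map<string, string>();
  for (const pair of header.split("; ")) {
    const eq = pair.indexOf("=");
    if (eq > 0) jar.set(pair.slice(0, eq), pair.slice(eq + 1));
  }
  return jar;
}

// A made-up Cookie header, of the kind sent to a site (and often its third parties).
const header =
  "session_id=abc123; consent=accepted-2021-12-06; ad_uid=xyz789; analytics_id=42";

const jar = parseCookieHeader(header);
const consent = jar.get("consent") ?? "none recorded";
const otherIdentifiers = Array.from(jar.keys()).filter((name) => name !== "consent");

console.log(`Consent record: ${consent}`);
console.log(`Other identifiers riding along: ${otherIdentifiers.join(", ")}`);
```

The asymmetry the manifesto describes is visible even in this toy example: the consent record lives on the site's terms and in the site's format, while the individual holds no copy of their own.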

Calls to Action

As with all free and open source code, every word in this manifesto is provisional and subject to improvement. It is also dedicated to the public domain through Creative Commons' CC0. Members of ProjectVRM with editing powers can also work on this copy of the manifesto in this wiki, or contribute through the ProjectVRM mailing list.

Note: a version of this manifesto, current as of 5 July 2019, appeared on Medium.

— Doc Searls