GROUP ONE

From Identifying Difficult Problems in Cyberlaw
Jump to navigation Jump to search

Online Reputation[1]: Literature Review

The Current Situation

Perhaps the most seminal article on privacy is Samuel Warren and Louis Brandeis’ 1890 Harvard Law Review article The Right to Privacy. The authors wrote the article in response to the invention of new camera technologies that allowed for instantaneous photographs. Warren and Brandeis feared that such technologies would expose the private realm to the public, creating a situation in which “what is whispered in the closet shall be proclaimed from the house-tops.” Id. Their article argued that a tort remedy should be available to those whose privacy had been violated in this way. Id. at 219. Warren and Brandeis exemplified a major trend in legal thought that would gain greater prominence in the Internet Age: the call for formal legal and governmental structures to keep private and undesirable information about people from reaching the public.

The Internet has created a world of permanence, in which whispers in the closet are not only “proclaimed from the house-tops” but also never disappear. In Delete: The Virtue of Forgetting in the Digital Age, Viktor Mayer-Schönberger describes how the digital age upended society’s status quo, under which “we remembered what we somehow perceived as important enough to expend that extra bit of effort on, and forgot most of the rest.” [2]. Four technological advances (digitization of information, cheap storage, easy retrieval, and the global reach of the Internet) have made all memories permanent. [3] This change forces people to constantly confront their past, preventing them from growing and moving on.

The permanent past created by the Internet has increased the availability of information that can be used to judge people. Professor Lior Jacob Strahilevitz discusses the effect of this flood of information on reputation and exclusion in his article Reputation Nation: Law in an Era of Ubiquitous Personal Information. The ability of a Google search or a social networking site to deliver personal information allows people to factor online and real-world reputation into decision making in a way that was not feasible years ago. Strahilevitz discusses both the positive and negative ramifications of this information surplus, but other scholars have focused on its negative effects on offline lives. Professor Daniel Solove describes many of these concerns in The Future of Reputation: Gossip, Rumor, and Privacy on the Internet. Solove relates many stories of the negative ramifications of online information, such as that of a girl whose refusal to pick up her dog’s waste led to her pictures becoming famous online. A commenter on a blog that discussed the picture stated, “Right or wrong, the [I]nternet is a cruel historian.” Solove argues that the ability of information, especially information considered scandalous or interesting, to spread quickly and become permanent has distorted how we judge people. Many different evaluators (family, friends, employers, etc.) can judge others using personal information and gossip that many would consider unfair factors in such an evaluation. One inappropriate Facebook picture or malicious blog post can haunt someone for the rest of his or her life.

Solove notes that “disclosure can also be harmful because it makes a person a ‘prisoner of [her] recorded past.’” People grow and change, and disclosures of information from their past can inhibit their ability to reform their behavior, to have a second chance, or to alter their life’s direction. Moreover, when information is released publicly, it can be used in a host of unforeseeable ways, creating problems related to those caused by secondary use. P. 533. Lior Strahilevitz aptly observes that disclosure involves spreading information beyond existing networks of information flow. The harm of disclosure is not so much the elimination of secrecy as it is the spreading of information beyond expected boundaries. P. 535. Daniel J. Solove, A Taxonomy of Privacy, University of Pennsylvania Law Review, Vol. 154, No. 3 (Jan. 2006), pp. 477-564, http://www.jstor.org/stable/40041279.

A Stanford Law Review Symposium on cyberspace and privacy provides a number of instructive articles on legal and theoretical prescriptions for privacy problems on the Internet. Pamela Samuelson, Privacy As Intellectual Property?, Stanford Law Review, Vol. 52, No. 5, Symposium: Cyberspace and Privacy: A New Legal Paradigm? (May 2000), pp. 1125-1173, http://www.jstor.org/stable/1229511 (suggesting that information privacy law needs to impose minimum standards of commercial morality on firms engaged in the processing of personal data and proposing that certain default licensing rules of trade secrecy law may be adapted to protect personal information in cyberspace); A. Michael Froomkin, The Death of Privacy?, id., pp. 1461-1543, http://www.jstor.org/stable/1229519 (discussing privacy-destroying technologies and leading attempts to craft legal responses to the assault on privacy, including self-regulation, privacy-enhancing technologies, data-protection law, and property-rights-based solutions); Jessica Litman, Information Privacy/Information Property, id., pp. 1283-1313, http://www.jstor.org/stable/1229515 (exploring property and common law solutions to protecting information privacy); Julie E. Cohen, Examined Lives: Informational Privacy and the Subject as Object, id., pp. 1373-1438, http://www.jstor.org/stable/1229517 (arguing that the debate about data privacy protection should be grounded in an appreciation of the conditions necessary for individuals to develop and exercise autonomy in fact, and that meaningful autonomy requires a degree of freedom from monitoring, scrutiny, and categorization by others); Jonathan Zittrain, What the Publisher Can Teach the Patient: Intellectual Property and Privacy in an Era of Trusted Privication, id., pp. 1201-1250, http://www.jstor.org/stable/1229513 (discussing how “trusted systems” technologies might allow more thorough mass distribution of data while allowing publishers to retain unprecedented control over their wares).

Solutions

Many academics have proposed ways to deal with the flood of personal information and gossip that would limit its effect on everyday life. The ensuing categorizations represent the best fit, though most proposals contain elements of several categories.

Tort

Warren and Brandeis’ seminal suggestion of a tort remedy for those whose privacy had been invaded has been seen as a potential solution to the issue of unwanted information on the Internet.

Daniel Solove argues that the law should expand its recognition of duties of privacy. This expansion would allow the law to see the placing of certain information online as a violation of privacy. Solove also proposes that only the initial entity who posts gossip online, not subsequent posters, should be liable.

Daniel J. Solove also develops justifications for protections against the disclosure of private information. Daniel J. Solove, The Virtues of Knowing Less: Justifying Privacy Protections against Disclosure, Duke Law Journal, Vol. 53, No. 3 (Dec. 2003), pp. 967-1065, http://www.jstor.org/stable/1373222. He argues that speech of private concern is less valuable than speech of public concern and that the propriety of disclosures depends upon their purpose, not merely on the type of information disclosed. He also responds to the argument that gossip is valuable because it helps educate us about human nature, arguing that the value of concealing one’s past can, in many circumstances, outweigh the benefits of disclosure. He further argues that privacy protects against certain rational judgments that society may want to prohibit (such as employment decisions based on genetic information).

Lior Jacob Strahilevitz proposes a revised standard for distinguishing between public and private facts in privacy torts, using insights gleaned from the empirical literature on social networks and information dissemination. Lior Jacob Strahilevitz, A Social Networks Theory of Privacy, The University of Chicago Law Review, Vol. 72, No. 3 (Summer 2005), pp. 919-988, http://www.jstor.org/stable/4495516. He argues that judges can use this research to determine whether an individual had a reasonable expectation of privacy in a particular fact that he has shared with one or more persons. He argues that if the plaintiff’s previously private information would have been widely disseminated regardless of the defendant’s actions in a particular case, the information in question was public; if not, tort law ought to deem the information private. Essentially, he argues that an individual has a reasonable expectation of privacy where there is a low risk that the information will spread beyond the individual’s social network.

Norms

James Grimmelmann in Saving Facebook stresses that any solution to privacy concerns on Facebook must comport with how users perceive their social environment. Solutions such as restrictions on certain uses of social media or strong technical controls on who can see what information will not work, because they do not comport with the nuances of social relationships and the desire of young people to subvert attempts to control their behavior. Conversely, solutions that take into account what Facebook users’ expectations of privacy are and how they interact with their online friends will have a greater chance of success.

Jonathan Zittrain proposes the concept of “reputation bankruptcy”, in which an individual would be able to wipe out the entirety of one’s online reputation, both the good and bad aspects of it:

“Like personal financial bankruptcy, or the way in which a state often seals a juvenile criminal record and gives a child a “fresh start” as an adult, we ought to consider how to implement the idea of a second or third chance into our digital spaces. People ought to be able to express a choice to de-emphasize if not entirely delete older information that has been generated about them by and through various systems: political preferences, activities, youthful likes and dislikes.”

Zittrain suggests that the lack of selectivity would help prevent excessive use of this tool. Zittrain’s proposal is compatible with all categories depending upon how he further defines it.

Frank Pasquale, in Rankings, Reductionism, and Responsibility and in Asterisk Revisited: Debating a Right of Reply on Search Results, proposes government regulation of search results that would give individuals a right of reply: an asterisk appended to an undesirable search result would lead to the individual’s own statement responding to it. Our model includes a similar right of reply at the phase when the status of renewed data is contested; that is, when a content creator renews data that implicates one or more parties over their objection. The model allows the party to opt for an asterisk that can provide context for the objectionable image.

Lauren Gelman in Privacy, Free Speech, and “Blurry-Edged” Online Networks argues for a norms-based approach to privacy online. She describes the online world as having “blurry edges” in the sense that information online that is intended for a select group of people can be accessed by the entire online community. She proposes a technological solution in which information an individual posts online can include tags stating that individual’s preferences regarding privacy and how the information may be used. Gelman asserts that online users will likely accept this request if it is presented immediately upon accessing the information.
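Gelman’s tag proposal amounts to machine-readable metadata attached to each post. A minimal sketch follows; the field names and permission categories are hypothetical illustrations of the idea, not anything specified in her article:

```python
# Hypothetical privacy-preference tags attached to a piece of posted
# content, in the spirit of Gelman's proposal. All field names here are
# illustrative assumptions.

POST = {
    "content": "photo_beach_2010.jpg",
    "privacy_tags": {
        "intended_audience": "friends",   # the audience the poster had in mind
        "allow_reposting": False,         # may others redistribute it?
        "allow_commercial_use": False,    # may firms make use of it?
    },
}

def use_permitted(post, requested_use):
    """Check a requested use against the poster's stated preferences."""
    tags = post["privacy_tags"]
    if requested_use == "repost":
        return tags["allow_reposting"]
    if requested_use == "commercial":
        return tags["allow_commercial_use"]
    # Uses the poster did not explicitly restrict default to permitted.
    return True
```

On Gelman’s account, a reader shown these preferences at the moment of access would likely honor them; the code merely makes the request explicit and checkable.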

Technological

Viktor Mayer-Schönberger’s solution shares Grimmelmann’s focus on user expectations, as Mayer-Schönberger seeks to establish in the online world our natural expectation of forgetting. He proposes applying expiration dates to information, whereby creators set expiration dates for their data. Mayer-Schönberger allows for government involvement in determining when expiration dates could be changed or who should be involved in setting the date. Also, this system allows for no gradual decay or modification; expiration dates can only be extended by social mandate. [4]
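A rough sketch of how such expiration dates might look in code; the class and method names are assumptions for illustration. The key property, per the proposal, is that an item simply stops being served once its date passes, unless the date is deliberately extended:

```python
from datetime import date, timedelta

# Illustrative sketch of Mayer-Schönberger-style expiration dates.
# The creator fixes a lifetime at posting time; there is no gradual
# decay, only a hard cutoff that an explicit act can push back.

class Item:
    def __init__(self, content, lifetime_days):
        self.content = content
        self.expires = date.today() + timedelta(days=lifetime_days)

    def visible(self, on_date=None):
        """The item is served only until its expiration date."""
        on_date = on_date or date.today()
        return on_date <= self.expires

    def extend(self, extra_days):
        """Extension requires a deliberate act, not automatic renewal."""
        self.expires += timedelta(days=extra_days)

item = Item("status update", lifetime_days=365)
```

Who may call `extend`, and under what conditions, is exactly the policy question Mayer-Schönberger leaves to social mandate.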

Regulatory

The European Union is considering an online “right to be forgotten.” Users would be able to tell websites to remove all personal information the website holds about them; a website that did not comply could face official sanctions. On November 4, 2010, the European Commission announced that it would propose new legislation in 2011 to strengthen EU data protection and unveiled a series of proposals as part of its overall strategy to protect personal data. Vice-President Viviane Reding, EU Commissioner for Justice, Fundamental Rights and Citizenship, called the protection of personal data a fundamental right. With respect to personal data online, the European Commission stated that “[p]eople should be able to give their informed consent to the processing of their personal data . . . and should have the ‘right to be forgotten’ when their data is no longer needed or they want their data to be deleted.”

http://europa.eu/rapid/pressReleasesAction.do?reference=IP/10/1462&format=HTML&aged=0&language=EN&guiLanguage=fr

http://www.telegraph.co.uk/technology/internet/8112702/EU-proposes-online-right-to-be-forgotten.html

http://redtape.msnbc.com/2010/11/eu-to-create-right-to-be-forgotten-online.html

Cass Sunstein in his book On Rumors suggests the creation of a notice and take down procedure for falsehoods similar to the Digital Millennium Copyright Act. [5] Sunstein describes this as a “right” rather than a norm, demonstrating the need for government action. Also, this proposed notice and takedown system only deals with falsehoods, not with embarrassing information that is true.

Orin Kerr suggests conditioning a website’s immunity under section 230 of the Communications Decency Act on whether the site owner prevented search engine bots from indexing its content. If a search engine did pick up the content, the website would be liable for the content the search engine displays. Under this suggestion, the objectionable content would still exist; it simply could not be found through a search engine.
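The mechanism this proposal relies on already exists: under the long-standing Robots Exclusion Protocol, a site owner can direct compliant crawlers away with a robots.txt file at the site root. A site wishing to keep Kerr-style immunity would serve something like:

```
# robots.txt, served at the site root (e.g., http://example.com/robots.txt)
# Tells all compliant search engine crawlers not to index any page.
User-agent: *
Disallow: /
```

Note that robots.txt is honored voluntarily by crawlers; it does not technically prevent access, which is one reason Kerr’s proposal pairs it with a legal incentive.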

Market

Joseph Bonneau and Sören Preibusch of the University of Cambridge conducted the first thorough analysis of the market for privacy practices and policies in online social networks. Joseph Bonneau and Sören Preibusch, The Privacy Jungle: On the Market for Data Protection in Social Networks, Economics of Information Security and Privacy 2010, pp. 121-167, DOI: 10.1007/978-1-4419-6967-5_8, http://preibusch.de/publications/social_networks/privacy_jungle_dataset.htm

Bonneau and Preibusch conclude that “the market for privacy in social networks is dysfunctional in that there is significant variation in sites’ privacy controls, data collection requirements, and legal privacy policies, but this is not effectively conveyed to users.” They also find that “privacy is rarely used as a selling point, even then only as auxiliary, nondecisive feature.” Bonneau and Preibusch describe “a privacy communication game, where the economically rational choice for a site operator is to make privacy control available to evade criticism from privacy fundamentalists, while hiding the privacy control interface and privacy policy to maximize sign-up numbers and encourage data sharing from the pragmatic majority of users.” This dynamic is in tension with our model insofar as it makes privacy policies even more inaccessible to users; our proposal seeks to market privacy as a selling point based on user preferences and principles of reputation management, while protecting the ability of site operators to use member data.

Property

Paul M. Schwartz has proposed a model of propertized personal information that would help fashion a market respecting individual privacy and help maintain a democratic order. His proposal includes five elements: “limitations on an individual's right to alienate personal information; default rules that force disclosure of the terms of trade; a right of exit for participants in the market; the establishment of damages to deter market abuses; and institutions to police the personal information market and punish privacy violations.” Paul M. Schwartz, Property, Privacy, and Personal Data, Harvard Law Review, Vol. 117, No. 7 (May 2004), pp. 2056-2128, http://www.jstor.org/stable/4093335.


Our Approach

Many of these proposals share two characteristics: significant government intervention and a focus on dramatic short-term reputational harm. Regarding intervention, many of the aforementioned scholars devote a portion of their articles and books to the legal problems with their proposals, chiefly First Amendment challenges. Eugene Volokh, in his article Freedom of Speech and Information Privacy: The Troubling Implications of a Right to Stop People From Speaking About You, argues that broad information privacy regulations would not be permissible under the First Amendment and that allowing such regulations could create negative legal and social ramifications. The aforementioned proposals, many of which involve the government restricting speech, would likely face significant First Amendment roadblocks.

Some of these proposals focus on highly objectionable material that would warrant liability or a complete erasure of one’s online reputation. Yet these approaches do not address the embarrassing but legal and mundane, such as a drunken picture or an inappropriate Facebook group that an individual once posted on without concern at the time. Such activities likely do not rise to a level warranting litigation or complete erasure of an online identity, but they could negatively affect one’s professional and personal relationships.

Our project departs from these proposals in both its focus and its implementation. We have catalogued the reputational harms one can suffer online into three types: short-term, medium-term, and long-term. Short-term harms are exemplified by the Duke sex PowerPoint scandal, in which an incendiary piece of information creates immediate and dramatic ramifications for one’s reputation. Long-term harms affect reputation built over many years, such as one’s eBay rating. Medium-term harms are not necessarily recognized as harmful immediately, but their ramifications may surface when one establishes a relationship or looks for a job. For example, a college freshman may have a drunken picture of himself on Facebook that is not an issue now but could become one when he looks for a job several years later. Our proposal addresses these medium-term harms.

Our proposal relies on a combination of market- and norms-based approaches. Jonathan Zittrain creates a classification system for such approaches to online issues in his article The Fourth Quadrant. This system uses two measures: how “generative” an approach is and how “singular” it is. The former judges the separation, if any, between those who create and implement the approach and those who are affected by it. The “singular” qualification evaluates whether the approach is the only one available or one of many. An undertaking that is created and operated by the entire community and is the only one of its kind in the online realm falls into what Zittrain terms the “fourth quadrant.” Our system will be created and implemented by the social networks’ and blogs’ entire communities, both their operators and their users. The government will not implement or enforce anything. Since our system is optional for websites, it will initially be one of many approaches to dealing with medium-term harms to reputation. However, our hope is that its advantages and benefits will allow it to become a popular system and, therefore, eventually part of the fourth quadrant.

We believe that a market- and norms-based approach avoids First Amendment issues, is more responsive to the needs of users and Internet service providers, and is more likely to be implemented as a bottom-up approach that can be tailored to the specific preferences of different demographics interacting with different forms of online expression.

Other Research

http://www.pewinternet.org/Reports/2010/Reputation-Management.aspx

A May 2010 Pew Internet and American Life study suggests that privacy is highly valued by the next generation of social networking users. The report found that “Young adults, far from being indifferent about their digital footprints, are the most active online reputation managers in several dimensions. For example, more than two-thirds (71%) of social networking users ages 18-29 have changed the privacy settings on their profile to limit what they share with others online.” In addition, “When compared with older users, young adults are more likely to restrict what they share and whom they share it with.” Mary Madden, one of the coauthors, stated that “[c]ontrary to the popular perception that younger users embrace a laissez-faire attitude about their online reputations, young adults are often more vigilant than older adults when it comes to managing their online identities.”

A study by scholars at the University of California, Berkeley Center for Law & Technology, the Center for the Study of Law and Society, and the University of Pennsylvania’s Annenberg School for Communication found privacy norms that are consistent with our model. For instance, “Large proportions of all age groups have refused to provide information to a business for privacy reasons. They agree or agree strongly with the norm that a person should get permission before posting a photo of someone who is clearly recognizable to the internet, even if that photo was taken in public. They agree that there should be a law that gives people the right to know ‘everything that a website knows about them.’ And they agree that there should be a law that requires websites and advertising companies to delete ‘all stored information’ about an individual.” P. 11. Hoofnagle, Chris Jay, King, Jennifer, Li, Su and Turow, Joseph, How Different are Young Adults from Older Adults When it Comes to Information Privacy Attitudes and Policies? (April 14, 2010), available at SSRN: http://ssrn.com/abstract=1589864

Right to Delete.jpg[6]


Amy J. Schmitz, Drive-Thru Arbitration in the Digital Age: Empowering Consumers through Binding ODR, Baylor Law Review, Vol. 62, Issue 1 (2010), pp. 178-244.

Many articles discuss privacy and data management in the context of online commercial transactions. See, e.g., Milne, G. R. and Culnan, M. J. (2004), Strategies for Reducing Online Privacy Risks: Why Consumers Read (or Don’t Read) Online Privacy Notices, Journal of Interactive Marketing, 18: 15-29, doi: 10.1002/dir.20009 (concluding that reading is related to concern for privacy, positive perceptions about notice comprehension, and higher levels of trust in the notice); Eve M. Caudill and Patrick E. Murphy, Consumer Online Privacy: Legal and Ethical Issues, Journal of Public Policy & Marketing, Vol. 19, No. 1, Privacy and Ethical Issues in Database/Interactive Marketing and Public Policy (Spring 2000), pp. 7-19, http://www.jstor.org/stable/30000483 (reviewing ethical theories that apply to consumer privacy and offering specific suggestions for corporate ethical policy and public policy).

Proposed Mid-term Content Management System

Assumptions

Model

The Model.jpg

Explanation of the Model

Concerns and Responses to the Model

1. How do we prevent a deluge of expiration notifications and other notifications from the process?

Concern: The model is likely to generate a considerable amount of email notification for both the content originator and the gater. Online social media users generate content every day, so the flow will be constant. This flood of emails could discourage some walled gardens from implementing the model if they fear their users will abandon the network because their mailboxes are constantly flooded. And will users take objections seriously when they receive so many of them over short periods of time?

Response: As in every architectural design, both physical and technological, there are tradeoffs. The benefit of the model is that it facilitates interactions and keeps users constantly apprised of content they generate and content generated about them. The cost is the mirror image of the benefit: being constantly apprised means being constantly notified. Aside from the benefit of awareness, there are tools users could employ to avoid mailbox flooding.

Users could use their default settings to have content scheduled to expire do so automatically, without notification. They could select this as a default for all their content, or they could manage pictures, status updates, posts, etc. individually. The more tailored their control over the system, the more easily users could manage their data and notifications. Content generators seeking to avoid notification from those gated to their content could also benefit from default settings. Perhaps they could set their defaults such that if someone would like the content taken down, the system automatically defers to that user. The walled garden’s settings could also permit individuals to differentiate by user or group. For example, a default setting could look something like: I do not want to be notified if Family and Good Friends object to content being renewed, and I will defer to their requests, but I would like to be notified if Pete Campbell or someone from the High School group objects.
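Group-differentiated defaults of the kind in this example could be represented as a small rule table inside the walled garden. A minimal sketch, with hypothetical group names and fields:

```python
# Hypothetical per-group notification defaults, mirroring the example
# above: trusted groups are auto-deferred to silently, while others
# trigger a notification the content creator must act on.

DEFAULTS = {
    "Family":       {"notify_on_objection": False, "auto_defer": True},
    "Good Friends": {"notify_on_objection": False, "auto_defer": True},
    "High School":  {"notify_on_objection": True,  "auto_defer": False},
}
# Objectors outside any configured group fall back to notify-and-decide.
FALLBACK = {"notify_on_objection": True, "auto_defer": False}

def handle_objection(objector_group):
    """Return (notify_creator, defer_automatically) for an objection."""
    rule = DEFAULTS.get(objector_group, FALLBACK)
    return rule["notify_on_objection"], rule["auto_defer"]
```

A real implementation would also need per-item overrides, since the model lets users manage individual pictures and posts separately from their global defaults.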

2. Should there be a modified system for children? Should this system only exist for children? Should there only be one period of atonement for coming of age?

Concern: Minors are arguably the most likely to generate content that they later regret, that is embarrassing, or that could affect them in the future. They are also, on the whole, less mature than the rest of the population and should be further protected from the permanence of content generated about them online. Similar to how the criminal records of minors are often expunged in the legal system, should the model guarantee minors greater protection, or should the model exist only for minors?

Response: Individual walled gardens are best situated to determine the individual needs of their users. While the authors of this model maintain that it should apply universally, there is no doubt that minors are perhaps most in need of it. That said, individual sites adopting the model can best determine how they want to implement it and to which population of their users they would like to make it available. Perhaps there could be variations in the model for children, such as no renewal after valid complaints about the content, or more frequent content expiration, or expiration when the minor comes of age. This question, we believe, is better resolved by individual walled gardens.

3. Should expired content be deleted forever? Are there ways to preserve one’s content?

Concern: One of the primary reasons individuals post content online is to share and preserve it in digital form. The model favors the deletion of information and could cause content that would have otherwise remained available for storing or sharing to be removed. What if the individuals generating the content or featured in the content want to preserve it for themselves, friends, or future generations?

Response: There are alternative ways to preserve one’s content. Users could be allowed to keep both a public and a personal digital dossier. Expired content generated by individuals could be stored privately, whether online or offline. The most secure way to retain personal data would be off the cloud, stored on a personal hard drive, but perhaps a user would prefer to keep their content in a cloud service. Either way, there are currently multiple means by which users can store data that is not publicly available.

4. What happens if people circumvent the process and simply repost the material?

Concern: While in theory the model may work well, what is to stop users wanting to avoid the effort of the process from simply agreeing to delete their content and reposting it again or allowing everything to expire, only to repost it again in a few years, months, days, minutes, or seconds?

Response: As with any law or system, there will always be those who look for loopholes and for areas of the technological court that are poorly defended. The additional affirmative effort required to repost content is in itself a moderate discouragement. Furthermore, the establishment of the model and the social norms governing its usage would also discourage users from running a red light even when they think they will not get caught. It is undeniable, however, that for at least some online users, the additional effort and norms alone are not enough.

Walled gardens could develop software that detects when content is reposted, or that identifies newly posted content as identical to content posted previously. Other individuals featured in the reposted content could be allowed to file a complaint with the walled garden notifying it that the content had already expired or was somewhere in the expiration process.
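Exact-duplicate detection of this kind could be built on content hashing. The sketch below, using SHA-256, catches only byte-identical reposts; a real system would need perceptual hashing to catch re-encoded, resized, or cropped images:

```python
import hashlib

# Sketch of repost detection via content hashing. When content expires,
# the walled garden records its hash; a new upload matching a recorded
# hash is flagged as a repost of expired material.

expired_hashes = set()

def record_expired(content: bytes) -> None:
    """Remember the fingerprint of content that has expired."""
    expired_hashes.add(hashlib.sha256(content).hexdigest())

def is_repost(content: bytes) -> bool:
    """Check whether newly posted content matches expired content."""
    return hashlib.sha256(content).hexdigest() in expired_hashes
```

The complaint mechanism described above would remain necessary as a backstop, since hashing alone cannot catch every altered copy.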

5. Will this have a chilling effect on speech?

Concern: Individuals who do not want to be flooded with complaints or notifications might avoid posting content that is arguably beneficial to society, simply because they know some individuals might object and they do not want to deal with the burdensome process of responding to complaints.

Response: While the model in its current state might prevent content from being renewed, it should not deter individuals from originally posting the content. Individuals could set their default settings to avoid receiving notification of gating; they would thus only encounter resistance when renewing the content.

6. Who has standing to gate themselves to content?

Concern: If there are no minimum requirements for standing, then content generators could be flooded by complaints and notifications from individuals who have no direct reputational stake in the content. Content that is political or otherwise controversial could be deluged with gats in order to pressure the content generator to remove it.

Response: Standing should be limited to those individuals directly featured in the content. If it is a picture, the individual must appear in the picture; if it is text, the individual’s name must be mentioned in the text. Walled gardens should establish some mechanism for verifying that those gating themselves to content actually have standing to do so. In walled gardens such as Facebook this might be easier to accomplish, but perhaps facial recognition software could be employed, or some other mechanism that can verify who gaters are and whether they indeed have standing. Perhaps the process could be outsourced to Mechanical Turk workers?
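The standing rule described here reduces to a simple check. In this sketch, the content representation (a dict with "type", "body", and "tagged_ids") is a hypothetical stand-in for a walled garden’s real data model, where photo tags might be confirmed by facial recognition or human review:

```python
# Minimal sketch of the standing rule: a claimant may gate themselves
# to content only if they are directly featured in it.

def has_standing(claimant: dict, content: dict) -> bool:
    if content["type"] == "text":
        # Text content: the claimant's name must appear in the body.
        return claimant["name"] in content["body"]
    if content["type"] == "photo":
        # Photo content: the claimant must be among those pictured
        # (here modeled as a list of verified tag IDs).
        return claimant["id"] in content.get("tagged_ids", [])
    return False
```

Name matching in text is, of course, only a rough proxy; a deployed system would need disambiguation for common names, which is one reason human verification might still be needed.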

7. What about pictures that have social or even historical value, but where the originator of the material either does not recognize such value or does not want to put forth the additional effort to preserve it?

Concern: A model that favors the expiration and possible deletion of content online carries the risk of permanently deleting, or creating greater access barriers to, content that has social and historical value either currently or in the future.

Response: A model favoring the expiration and possible deletion of user-generated content could certainly result in the loss of some content with social and historical value. But this is not a new problem. Famous authors often die with manuscripts unpublished, sometimes leaving clear instructions that the material should never be published. Famous artwork is often kept in private hands and never made available for the public at large to see.

It has always been the job of news corporations and the media to cover events of historical significance and to obtain licensing or copyright releases from citizens who capture such events. It is arguably only in the last few years, through the Internet, that user-generated content has been elevated to its current importance and presence in the public media.

Furthermore, this model should not be applied to certain institutions. Traditional news media and even current semi-news media sites should not be subject to expiration dates. Archives of newspapers and important events and news should remain available indefinitely. Moreover, this model does not cause the deletion of material. It simply might discourage its prolonged existence online. So while the costs of acquiring content of social and historic value might have increased, they are not insurmountable.

8. Are we creating an environment that discourages responsibility?

Concern: If content we generated or that is generated about us is more easily purged from the public Internet, then people will be less accountable for their actions and perhaps even act more recklessly. Furthermore, they will continue to commit follies, because the consequences are not as permanent. People should take responsibility for their actions rather than rely on deletion.

Response: The above is a matter of preference. It is true that in at least some cases bad actors will make poor choices because their missteps are more readily forgiven under this model, but we would argue that the majority of bad actors are not considering the consequences of their actions in the moment at all. Perhaps getting burned by the current system could influence their behavior in the future, but the reputational damage suffered might be steeper than the lesson learned. Furthermore, it is not as if the model isolates the individual from the worst immediate effects of his follies; it only partially shields him from the longer-term effects.

9. What is something cool that could happen?

Concern: If the content generator and the gated individual do not come to a mutual agreement or solution, then what happens? Are we left back at square one?

Response: There are several things that could happen. First, nothing could happen. This could be a system based solely on the contact hypothesis and social psychology research suggesting that individuals are more likely to say yes to another person's individualized request for aid or help. In a large percentage of cases, the model could resolve the problem with no mechanism beyond the mere facilitation of interaction between two human beings.

Second, there could be a final arbitration or mediation facilitated by a third party, either in-house or outsourced. A good example of outsourcing to a third party is SquareTrade, which conducted dispute resolution for eBay a few years back and was quite successful in doing so. The benefit of using an outsourced third party is that if something had to be taken down, walled gardens could potentially continue asserting their shield under Section 230 of the Communications Decency Act by not directly removing or editing content on their sites. Another option would be for the walled garden to run its own dispute resolution process, either by hiring a panel specifically assigned to that task or by electing particular members of the site to vote (though voting might draw more unwanted attention to content the individual wants eradicated).

10. Should certain institutions be exempt from expiration dates?

Concern: There is certain content on the net whose expiration would hurt society, such as the sites and archives of news media organizations.

Response: We intend this model to be applied only by walled gardens of social media, and only to user-generated content.

Notes

  1. We would like to thank Viktor Mayer-Schönberger for all of his assistance and support for our project.
  2. Viktor Mayer-Schönberger, Delete: The Virtue of Forgetting in the Digital Age 49 (2009).
  3. Id. at 52.
  4. Id. at 184-192.
  5. Cass Sunstein, On Rumors 78-79 (2009).
  6. Chris Jay Hoofnagle, Jennifer King, Su Li & Joseph Turow, How Different Are Young Adults from Older Adults When It Comes to Information Privacy Attitudes and Policies? (Apr. 14, 2010), available at http://ssrn.com/abstract=1589864.