GROUP TWO
Introduction
Reputation bankruptcy deserves to be called a difficult problem. Eric Schmidt's comment in a Wall Street Journal interview that soon "every young person…will be entitled automatically to change his or her name on reaching adulthood in order to disown youthful hijinks stored on their friends' social media sites" [1] (even though he later asserted this was only a "joke") stirred a heated debate [2] and provided enough motivation for our team to take a closer look at the problem.
We start by providing evidence of the importance of online identity and reputation, particularly going forward. We then survey the recent scholarship and other proposed solutions, using Lessig's framework of Law, Code, Market, and Norms [3], before presenting our proposed solution: a law, supported by code and norms, which we call the Minor's Online Identity Protection Act (MOIPA). We are aware of the potential implications and concerns and address some of them in a later section.
Our solution is intended to benefit both children and adults. We seek to provide better protection for children, while still minors, from damaging content that either they or others post about them. Looking forward, we seek to protect adults from "youthful indiscretions" and other damaging records from their time as minors, analogous to juvenile record sealing and expungement.
Importance of Online Identity and Reputation
- "75% of US recruiters and human-resource professionals report that their companies require them to do online research about candidates, and many use a range of sites when scrutinizing applicants", recent Microsoft study [4]
- With quickly improving facial-recognition technology, there will be a stark challenge to "our expectation of anonymity in public" [5]
- Viral dynamics of the internet may amplify reputation damage (e.g. search engine rankings)
- COPPA protection ends at age 13 and has proven ineffective:
- 37% of children aged 10-12 have Facebook accounts, and there are 4.4 million US Facebook users under age 13, despite Facebook's COPPA-compliant policy (see video [6])
- "If there are no private pictures of you online, anything made up will seem made up. Treasure your privacy. The future money making on the web will not be social networking but information search and destroy." ("As Bullies Go Digital, Parents Play Catch-Up", New York Times [7])
The Recent Scholarship and Other Proposed Solutions
Law-Based
In thirty-six states, there is already a recognized tort for "public disclosure of private fact." Essentially, this tort bars dissemination of "non-newsworthy" personal information that a reasonable person would find highly offensive (Chander). Some state laws are more specific: for example, criminal laws forbid the publication of the names of rape victims (for a discussion opposing such laws, see [8]). Anupam Chander has argued, in a forthcoming article titled "Youthful Indiscretion in an Internet Age," that the tort for public disclosure of private fact should be strengthened (forthcoming in "The Offensive Internet," 2011 [9]).
Chander also recognizes two important legal hurdles to overcome in strengthening the public disclosure of private fact tort. The first is that the indiscretion at issue may be legitimately newsworthy, which raises serious First Amendment concerns. The second is that intermediaries are often protected from liability under the Communications Decency Act (CDA). Despite these obstacles, Chander persuasively argues that such protection is needed, particularly in the context of nude images, because society's fascination with embarrassing content will not abate. Moreover, he observes that an individual's humiliation "does not turn on whether some activity is out of the ordinary or freakish," but rather that common behavior can still cause significant personal damage.
- Law Restricting Employers
- Existing Children Online Protection Laws
- Children's Online Privacy Protection Act (COPPA) [12]
- Scope: "If you operate a commercial Web site or an online service directed to children under 13 that collects personal information from children or if you operate a general audience Web site and have actual knowledge that you are collecting personal information from children, you must comply with the Children's Online Privacy Protection Act." -- too narrow
- Cf. Children's Online Protection Act (COPA) [13]
- Restricts minors' access to potentially harmful material on the internet
- Dan Solove: Give individuals a legal right to sue Facebook friends who breach a confidence
- Peter Taylor: Constitutional right to privacy/“oblivion” allowing more anonymity online
- Cass Sunstein: DMCA Notice-and-Takedown Model
Code-Based
- Jonathan Zittrain: Rating systems that allow you to declare reputation bankruptcy in certain areas [14]
Market- and Norm-Based
- Private companies to defend reputation, e.g. Reputation Defender
- Educate the public, especially young people
- Societal norms adapting to new media: "Please don't tweet this" (via [17])
- Tim Berners-Lee: establish a market norm barring employers from accessing the Facebook data of prospective employees
- “The End of Forgetting” NY Times 7/25/10, Jeffrey Rosen (law professor at George Washington University) [18]
- Web takes away "second chances": "the worst thing you've done is often the first thing everyone knows about you"
- ReputationDefender a "promising short-term solution", but not enough given fast advances of "facial-recognition technology"
- "Freedom of Speech and Information Privacy: The Troubling Implications of a Right to Stop People From Speaking About You", Eugene Volokh [19]
- Scope: restrictions on communication
- "recognition of one free speech exception certainly does not mean the end of free speech generally"
- "possible unintended consequences of various justifications for information privacy speech restrictions [...] sufficiently troubling"
- "disclosure" tort - bars dissemination of "nonnewsworthy" personal information that most people would find highly private (some state laws more specific, e.g. "criminal laws forbidding the publication of the names of rape victims"
- (II)D. Contracts with Children: the "discussion of contracts presupposes that both parties are legally capable of entering into the contract and of accepting a disclaimer of any implied warranty of confidentiality. If a cyber-consumer is a child, then such an acceptance might not be valid" (Source: Children's Online Privacy Protection Act of 1998 [20], 15 U.S.C. §§ 6501 et seq.; Matlick, note 245 infra; Singleton, infra note 251, text accompanying nn.76-79.)
- "Reputation Bankruptcy" Blog Post, 9/7/2010, on The Future of the Internet and How to Stop it, Jonathan Zittrain [21]
- Eric Schmidt (the CEO of Google) made a statement to the effect that in the future "every young person... will be entitled automatically to change his or her name on reaching adulthood in order to disown youthful hijinks stored on the friends' social media sites" (even though he later asserted this was only a "joke").
- Suggests that intermediaries who demand an individual's identity on the internet "ought to consider making available a form of reputation bankruptcy. Like personal financial bankruptcy, or the way in which a state often seals a juvenile criminal record and gives a child a 'fresh start' as an adult, we ought to consider how to implement the idea of a second or third chance into our digital spaces."
- "Reputation in a Networked World," by David S. Ardia
- Argues that the focus should be on ensuring the reliability of reputational information rather than on imposing liability
- Advocates community governance: deploy communities’ assistance in resolving disputes (enforcing its norms and ensuring the reliability of reputational information)
- "Reputation is disaggregated; information is disaggregated; and liability is disaggregated”
Proposed Solution
Overview
We are proposing the creation of a personal legal right to control content depicting or identifying oneself as a minor, supported by best-practices norms and code for Content Intermediaries to watermark content depicting minors:
- Minor's Online Identity Protection Act (MOIPA):
- Scope: all content depicting or identifying minors (under the age of 18)
- Form: Notice-and-takedown system. Proof of scope: a digital watermark serves as prima facie evidence; absent a watermark or metadata, one could prove minor status with a government-issued ID or notarization based on the date of the photo and/or the content upload
- Requires affirmative action on the part of the individual. Nothing would be removed or deleted automatically (cf. Juvenile record sealing and expungement)
- Standardized simple notice form could be available to fill out and send online
- Supporting Norms and Code:
- Photos/Videos: meta data (Compare: [22])
- Text: RDFa tags for identity information (See: [23])
- Content could either be tagged automatically for accounts identified as belonging to minors, or an affirmative check box (or both) could be required when content depicts or identifies minors (similar to "I have accepted the terms of service" or "I have the right to post this content" boxes)
- Possible objective limitation similar to that seen in many torts, including the public disclosure of private fact tort, which provides a cause of action where the disclosure would be “highly offensive to a reasonable person."
- Consider simple or streamlined ways to implement such a standard to avoid prolonged legal battles. One option would be crowd-sourcing the question to a number of others (possibly a "jury" of 12) for their opinion on whether the content is objectively embarrassing or unfavorable. Alternatively, the determination could simply be whether it is "reasonable" for the individual to request that the identifying or depicting content be taken down. A simple majority finding that the content is embarrassing or unfavorable could suffice. Such participants would not be able to download the content and would be subject to an agreement not to share or disclose the content themselves.
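The metadata tagging sketched above can be illustrated in miniature. All field and tag names below (e.g. `moipa:depicts-minor`) are hypothetical: no such standard exists, and a real deployment would embed the tag in EXIF/XMP for images or RDFa for text.

```python
from dataclasses import dataclass, field

@dataclass
class Upload:
    """Hypothetical record an intermediary keeps for each piece of content."""
    uploader_is_minor: bool               # account identified as belonging to a minor
    depicts_minor_checkbox: bool = False  # affirmative check box at upload time
    tags: dict = field(default_factory=dict)

def apply_moipa_tag(upload: Upload) -> dict:
    """Attach the watermark tag that would serve as prima facie
    evidence of MOIPA scope under a notice-and-takedown request."""
    if upload.uploader_is_minor or upload.depicts_minor_checkbox:
        upload.tags["moipa:depicts-minor"] = "true"
    return upload.tags
```

A takedown processor could then treat the presence of the tag as prima facie proof of scope, falling back to ID- or notary-based proof when the tag is absent.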
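The crowd-sourced review above reduces to a simple-majority rule, which can be sketched as follows (the quorum of 12 follows the "jury" suggestion in the text and is purely illustrative):

```python
def jury_finds_embarrassing(votes: list, quorum: int = 12) -> bool:
    """Simple-majority determination of whether content is objectively
    embarrassing or unfavorable, given one boolean vote per juror."""
    if len(votes) < quorum:
        raise ValueError("not enough jurors for a determination")
    return sum(votes) > len(votes) / 2
```

A tie would fail the strict-majority test, so the content would stay up by default.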
Scope
In our opinion, the scope of MOIPA must cover all content that depicts or identifies minors. To address First Amendment concerns, minors who are of "legitimate public interest" (e.g., celebrities, performers/actors, and children of famous public figures) are excluded from the scope of our proposal.
- Identity-related content
Within the scope of our proposal, identity-related content can take various formats, ranging from pictures, to videos to text.
While a general discussion of the Star Wars Kid [24] is not necessarily harmful, content linking the Star Wars Kid's real name to the footage is potentially harmful to the person and adds very little value. While such content should not be removed or deleted automatically, the person should have the right to have the identifying information removed.
- Minors
In our opinion, minors present a particularly compelling case. The age of 18 is a cutoff already recognized by the government and drawn in numerous instances. As a reference point, one could mention provisions to wipe juvenile criminal records (e.g., expungement) [25].
With respect to the age barrier, four scenarios are possible, as illustrated in the chart below.
(1) Content about oneself, posted while still a minor (younger than 18)
(2) Content about another minor, posted while the poster is a minor (younger than 18)
(3) Content about a minor, posted when the poster is an adult
(4) Content about oneself, posted when the poster is an adult
While scenarios (1) through (3) are clearly covered by our proposal, (4) warrants further discussion. While it might be theoretically less compelling (see section below), it is, in our opinion, practically easier to implement and more efficient.
There are certain other scenarios when MOIPA would not be available, for example due to contracts that were entered regarding the copyright of the content in question. The chart below provides a framework for what is and what is not covered under MOIPA.
Involved Parties
- Individual Identified or Depicted
- Content Creator (Person who creates content i.e. takes a picture)
- Content Sharer (Person who uploads content to Content Storage/Distribution Platform)
- Content Intermediaries (Personal Blog vs Facebook)
- Search Engines
- Note: some of these parties could be the same person in a given scenario
Is a Legal Measure Necessary?
It is possible that the burgeoning problems of online identity and reputation will dissipate on their own in a natural fashion. Thus, it might not ultimately be necessary to respond with a law or other significant change, so we do not recommend the immediate adoption of our proposal. Instead, it may be worthwhile to observe the trends in hopes that less drastic yet effective solutions appear.
For example, some believe that society will adapt to the ramifications of the internet and either adjust its (online) behavior accordingly, or such transgressions will become so commonplace that any negative impact will be lost. Much of the information available now, however, suggests that will not be the case. The internet has been a household staple for many for well over a decade, but no change in behavior has occurred. If anything, young and older users alike have grown more comfortable with the internet and therefore more willing than ever to share their private information, personal photos, and more on the internet. Our sensibilities to transgressions, however, do not appear to have similarly evolved. Employers continue to conduct internet and social network site research on prospective (and current) employees, and then make hiring and firing decisions based on that information. The public has not grown to accept these circumstances as normal over time: embarrassing situations continue to be embarrassing and can carry severe consequences, seemingly no matter how universal they are or how many times similar conduct occurs (see, e.g. Chander’s discussion on whether “society will become inured to the problem”). Ideally, society would evolve and adapt to the changes naturally, but so far in this area there has been no significant change. Moreover, children and adolescents are probably the least likely to pick up on these social cues and to understand the full ramifications of their online behavior. Our primarily legal-based solution is thus a possible avenue for government to protect the online identity and reputation of minors absent any other meaningful changes.
Before adopting such a law, which carries some significant ramifications of its own, some time should be taken to see if other solutions will appear. But at the current time such solutions seem unlikely. For example, at present there is no real pressure on content-hosting platforms or search engines to adopt any code-based or norm-based solutions, and there is no reason to suspect that they will spend time, money, and other resources to voluntarily adopt any protective measures that are not required. To the extent that companies such as Reputation Defender exist to solve these problems, they are problematic in that they are (i) limited to only those who can afford them, and (ii) cannot offer full protection because they lack any real power or authority over the sources. Thus, absent any unexpected and meaningful changes, the MOIPA proposal is an effort to adequately protect the most vulnerable segment of the population.
Implications and Potential Concerns
First Amendment Concerns
- Could include an exception for minors who are of "legitimate public interest", e.g., celebrities and performers/actors (and possibly children of famous public figures)
- On the other hand, protection of minors is a category for which the Supreme Court has recognized a legitimate (and even compelling) state interest (see, e.g., NY v. Ferber; US v. American Library Association; FCC v. Pacifica Foundation)
- Also, only the identifying or depicting information or content would be removed (so, for example, one need only redact the minor's name from a discussion, or blur, crop, or cut the minor from a photo or video). Other content not related specifically to the minor's identity would remain untouched by MOIPA
Technical Limitations
- Easy to remove metadata?
- Abuse/overuse of the digital watermark/metadata
- International standards?
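The first limitation is real: stripping metadata from an image takes only a few lines of code. Below is a simplified sketch that removes APP1 segments (where EXIF/XMP metadata lives) from a JPEG byte stream; it ignores restart markers and the subtleties of entropy-coded data, so it is an illustration, not production code.

```python
def strip_app1(jpeg: bytes) -> bytes:
    """Remove APP1 (EXIF/XMP) segments from a JPEG byte stream.

    Simplified: walks marker segments from the SOI marker, drops any
    APP1 segment, and copies everything else through unchanged.
    """
    if jpeg[:2] != b"\xff\xd8":  # SOI marker
        raise ValueError("not a JPEG stream")
    out = bytearray(jpeg[:2])
    i = 2
    while i + 1 < len(jpeg):
        if jpeg[i] != 0xFF:          # entropy-coded data begins; copy the rest
            out += jpeg[i:]
            break
        marker = jpeg[i + 1]
        if marker == 0xD9:           # EOI marker has no length field
            out += jpeg[i:i + 2]
            i += 2
            continue
        length = int.from_bytes(jpeg[i + 2:i + 4], "big")
        if marker != 0xE1:           # keep every segment except APP1
            out += jpeg[i:i + 2 + length]
        i += 2 + length
    return bytes(out)
```

This ease of removal suggests the watermark cannot be the sole proof of scope, which is why the proposal also allows ID- or notary-based proof.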
Notice and Takedown
- Notice-and-takedown system should not require payment -- the right to control should be available to all
- The code-based measures above operate ex post. How about preventing harm, or making it harder to occur, in the first place? Perhaps we could require settings to be invisible-by-default for minor users
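The invisible-by-default idea can be stated as a tiny policy function. The setting names and values here are hypothetical, chosen only to illustrate the ex-ante approach.

```python
# Hypothetical default visibility settings for new accounts.
ADULT_DEFAULTS = {"profile": "public", "photos": "friends-only"}
MINOR_DEFAULTS = {"profile": "invisible", "photos": "invisible"}

def default_settings(age: int) -> dict:
    """Ex-ante protection: accounts of minors start invisible,
    rather than relying on ex-post takedown alone."""
    return dict(MINOR_DEFAULTS if age < 18 else ADULT_DEFAULTS)
```

The point of the sketch is the asymmetry: minors must opt in to visibility, while adults may opt out.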
Entanglement
- Entanglement: who "owns" what information about a person and thus what can be managed / deleted, i.e. reposts of images, comments, wall posts
Advantage for Content Intermediaries
- Public goodwill (parents might feel more comfortable with children using the site)
- Gain additional insights from enhanced user/content meta data
- Potential new market for content management systems: i.e. search, track, and delete material across the Internet
Disadvantage for Content Intermediaries
- Penalties for violating MOIPA?
- Could databases and archives include references to deleted data? i.e. write "removed"?
- What if aggregators include copies of photos that lack metadata?
- Differentiate liability of main intermediaries and third-party app makers?
- Who is responsible for what? Does an intermediary like Facebook have the same degree of liability as its third-party partners who create apps and make use of Facebook data?
Effects on Behavior of Minors
Implications for Adults
- Reluctance to post material about minors. Fewer class photos?
- First Amendment aside, if kids are too little to use this, can guardians avail themselves of it on their behalf?
- What about content posted by adults from when the adult was a minor?
Potential for Abuse
- Does refusal to take content down lead to a civil lawsuit? E.g., the girlfriend-breaks-up-with-boyfriend scenario shows potential for abuse; imposing costs on takedown requests might help here
Economics
- New market for material about minors?
Relevant Sources
- "Youthful Indiscretion in an Internet Age," Anupam Chander
- Supports a strengthening of the public disclosure tort
- Recognizes two principal legal hurdles:
- (1) the youth’s indiscretion may itself be of legitimate interest to the public (newsworthy)
- (2) intermediaries can escape demands to withdraw info posted by others because of special statutory immunity
- Society will not become accustomed to the problem; fascination has continued, and embarrassing behavior has not been accepted as normal over time. Embarrassment or humiliation does not turn on whether some activity is out of the ordinary or freakish (for example, bodily functions are often considered embarrassing)
- "Reputation Bankruptcy" Blog Post, 9/7/2010, on The Future of the Internet and How to Stop it, Jonathan Zittrain [29]
- Eric Schmidt (the CEO of Google ) made a statement to the effect that in the future "every young person... will be entitled automatically to change his or her name on reaching adulthood in order to disown youthful hijinks stored on the friends' social media sites" (even though he has later asserted this was only a "joke").
- Suggests that intermediaries who demand an individual's identity on the internet "ought to consider making available a form of reputation bankruptcy. Like personal financial bankruptcy, or the way in which a state often seals a juvenile criminal record and gives a child a 'fresh start' as an adult, we ought to consider how to implement the idea of a second or third chance into our digital spaces."
- "Reputation in a Networked World," by David S. Ardia
- Argues that the focus should be on ensuring the reliability of reputational information rather than on imposing liability
- Advocates community governance: deploy communities’ assistance in resolving disputes (enforcing its norms and ensuring the reliability of reputational information)
- "Reputation is disaggregated; information is disaggregated; and liability is disaggregated”