ONLINE IDENTITY AND REPUTATION BANKRUPTCY
Motivation to Explore Identity and Reputation
- "75% of US recruiters and human-resource professionals report that their companies require them to do online research about candidates, and many use a range of sites when scrutinizing applicants", recent Microsoft study [1]
- With quickly improving facial-recognition technology, there will be a stark challenge to "our expectation of anonymity in public" [2]
- Viral dynamics of the internet may amplify reputation damage (e.g. search engine rankings)
- COPPA protection ends at age 13 and has proven ineffective:
- 37% of children aged 10-12 have Facebook accounts, and there are 4.4 million Facebook users under age 13 in the US, despite Facebook's COPPA-compliant policy (see video [3])
OUR SOLUTION
- Creation of a personal legal right to control content depicting or identifying oneself as a minor, supported by norms and code
- Minor's Online Identity Protection Act (MOIPA)
- Scope: all content depicting or identifying minors (under the age of 18)
- Form: Notice-and-takedown system (proof of scope: a digital watermark as prima facie evidence; absent a watermark or metadata, one could prove minor status with a government-issued ID or notarization, based on the date of the photo and/or of the content upload)
- Requires affirmative action on the part of the individual; nothing would be removed or deleted automatically (cf. juvenile record sealing and expungement)
- A standardized, simple notice form could be made available to fill out and submit online
- Best-practices norm/code for Content Intermediaries to watermark content depicting minors (a minimal sketch of such tagging follows this list):
- Photos/Videos: metadata (compare: [4])
- Text: RDFa tags for identity information (see: [5])
- Could either tag content automatically for accounts identified as belonging to minors, or require an affirmative checkbox (or both) when content depicts or identifies minors (similar to "I have accepted the terms of service" or "I have the right to post this content" boxes)
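To make the watermarking norm concrete, here is a minimal sketch of how a Content Intermediary might tag uploads and read the tag back. It assumes Python with the Pillow imaging library and PNG files; the key name "moipa:depicts-minor" is hypothetical, since MOIPA would have to standardize the actual vocabulary.

    # Minimal sketch, assuming Pillow and PNG uploads; the metadata key is
    # hypothetical, not an existing standard.
    from PIL import Image
    from PIL.PngImagePlugin import PngInfo

    MINOR_FLAG_KEY = "moipa:depicts-minor"  # hypothetical standard key

    def tag_upload(in_path: str, out_path: str, depicts_minor: bool) -> None:
        """Embed the minor-status flag as a PNG text chunk at upload time."""
        img = Image.open(in_path)
        meta = PngInfo()
        for key, value in img.info.items():  # preserve existing text chunks
            if isinstance(value, str):
                meta.add_text(key, value)
        meta.add_text(MINOR_FLAG_KEY, "true" if depicts_minor else "false")
        img.save(out_path, pnginfo=meta)

    def has_minor_flag(path: str) -> bool:
        """Presence of the flag could serve as the prima facie evidence above."""
        return Image.open(path).info.get(MINOR_FLAG_KEY) == "true"

    # For text content (the RDFa item above), an analogous tag might look like:
    #   <span property="moipa:depictsMinor" content="true">[name]</span>

Note that text chunks like these are easy to strip (see the technical-limitations concerns below), so the flag is best treated as evidence when present, not as proof of absence.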
Scope
- Identity-related content: Pictures, Videos, Text
- All pieces of content that depict or identify minors (i.e. a general discussion of Star Wars Kid [6] is not necessarily harmful; however, pieces of content linking Star Wars Kid's real name to the footage are potentially harmful to the person and add very little value. While such content should not be removed or deleted automatically, the person should have the right to have the identifying information removed; see the redaction sketch after this list)
- Reference point: provision to wipe juvenile criminal record (e.g. expungement) [7]
- Minors present a particularly compelling case. The age of 18 is a cutoff the law already recognizes and draws in numerous instances
- Question: What about the following scenario: a 25-year-old uploads pictures of himself at 16? (Perhaps limit the right to content of you as a minor posted by another)
- Question: What about minors who are of "legitimate public interest," e.g. celebrities, performers/actors? (And what about children of famous public figures?)
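A minimal sketch of the selective remedy this section contemplates: redact only the identifying string named in a notice and leave the surrounding discussion intact. It assumes the notice specifies the exact name ("John Doe" below is a placeholder); real matching would have to handle variants and misspellings.

    # Minimal sketch: remove only the identifying information, not the post.
    import re

    def redact_name(text: str, real_name: str, placeholder: str = "[removed]") -> str:
        """Replace each occurrence of the identified minor's name, case-insensitively."""
        pattern = re.compile(re.escape(real_name), flags=re.IGNORECASE)
        return pattern.sub(placeholder, text)

    post = "The kid in the lightsaber video is John Doe from Quebec."
    print(redact_name(post, "John Doe"))
    # -> The kid in the lightsaber video is [removed] from Quebec.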
Involved Parties
- Individual Identified or Depicted
- Content Creator (person who creates content, e.g. takes a picture)
- Content Sharer (person who uploads content to a content storage/distribution platform)
- Content Intermediaries (personal blog vs. Facebook)
- Search Engines
- Note: some of these parties could be the same person in a given scenario
Recent Scholarship and Other Proposed Solutions
- Law-Based:
- Extending Public Disclosure of Private Fact Tort
- "Disclosure" tort - bars dissemination of "non-newsworthy" personal information that most people would find highly private (some state laws are more specific, e.g. "criminal laws forbidding the publication of the names of rape victims" (for discussion against see [8])
- Strengthened tort for public disclosure of private fact (Anupam Chander, forthcoming in "The Offensive Internet," 2011 [9])
- Laws Restricting Employers
- Existing Children Online Protection Laws
- Children's Online Privacy Protection Act (COPPA) [12]
- Scope: "If you operate a commercial Web site or an online service directed to children under 13 that collects personal information from children or if you operate a general audience Web site and have actual knowledge that you are collecting personal information from children, you must comply with the Children's Online Privacy Protection Act." -- too narrow
- Cf. Child Online Protection Act (COPA) [13]
- Restricting minors' access to potentially harmful material on the internet
- Dan Solove: Give a legal right to sue Facebook friends where a confidence has been breached
- Peter Taylor: Constitutional right to privacy/“oblivion” allowing more anonymity online
- Cass Sunstein: DMCA Notice-and-Takedown Model
- Code-Based:
- Jonathan Zittrain: Rating systems that allow you to declare reputation bankruptcy in certain areas [14]
- Market- and Norm-Based:
- Private companies to defend reputation, e.g. ReputationDefender
- Educate the public, especially young people
- Societal norms adapting to new media: "Please don't tweet this" (via [17])
- Tim Berners-Lee: establish market norm of employers barred from accessing Facebook data of prospective employees
Implications / Concerns / Discussion
- Entanglement: who "owns" what information about a person, and thus what can be managed/deleted, e.g. reposts of images, comments, wall posts
- How far to go on the identity "continuum"
- Authentication > pseudonymity > anonymity (Ardia)
- Total deletion or selective management?
- Reputations would be meaningless if they could be subject to a legal right to manipulate (Chander)
- First Amendment Concerns
- Could include an exception for minors who are of "legitimate public interest," e.g. celebrities, performers/actors (and what about children of famous public figures?)
- On the other hand, protection of minors is a category for which the Supreme Court has recognized a legitimate (and even compelling) state interest (see, e.g., NY v. Ferber; US v. American Library Association; FCC v. Pacifica Foundation)
- Also, only the identifying or depicting information or content would be removed (so, for example, need only redact the minor's name from a discussion; blur, crop or cut minor from photo or video). Other content not related specifically to the minor's identity would remain untouched by MOIPA
- Distribution of liability: who is responsible for what? Does an intermediary like Facebook have the same degree of liability as its third-party partners who create apps and make use of Facebook data?
- Technical limitations
- How easy is it to remove metadata?
- Abuse/overuse of the digital watermark/metadata
- International standards?
- Notice-and-takedown system: how would this work? How would it scale?
- Notice-and-takedown system should not require payment -- the right to control should be available to all
- Both code-based solutions operate ex post. What about preventing harm, or at least making it more difficult, in the first place? Perhaps require invisible-by-default settings for minor users (see the sketch after this list)?
- Advantage for Content Intermediaries:
- Public goodwill (parents might feel more comfortable with children using the site)
- Gain additional insights from enhanced user/content meta data
- Potential new market for content management systems: i.e. search, track, and delete material across the Internet
- Disadvantage for Content Intermediaries:
- Penalties for violating MOIPA?
- Could databases and archives include references to deleted data, e.g. a "removed" placeholder?
- What if aggregators include copies of photos that lack metadata?
- Differentiate liability of main intermediaries and third-party app makers?
- What about content posted by adults from when the adult was a minor?
- How might this change behavior of minors?
- New market for material about minors?
- How might this segregate the Net between minors and adults?
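A minimal sketch of the notice intake discussed in this section, under the outline's own assumptions: a free standardized form, the watermark/metadata flag as prima facie evidence, ID-based proof as the fallback, and invisible-by-default settings for minor accounts. All field and function names are hypothetical.

    # Minimal sketch of MOIPA notice handling; all names are hypothetical.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TakedownNotice:
        content_url: str                 # where the content lives
        what_identifies: str             # the name/face/segment to redact
        has_minor_watermark: bool        # metadata flag found on the content?
        id_proof: Optional[str] = None   # e.g. government-issued ID, notarized

    def evaluate_notice(notice: TakedownNotice) -> str:
        """Decide the intermediary's next step; no payment is ever required."""
        if notice.has_minor_watermark:
            # Prima facie case made: redact only the identifying portion
            # (name, face), leaving unrelated content untouched.
            return "redact-identifying-content"
        if notice.id_proof is not None:
            # No watermark: fall back to age proof tied to the content's date.
            return "verify-id-then-redact"
        return "request-proof-of-minor-status"

    def default_visibility(user_age: int) -> str:
        """Ex ante complement floated above: invisible-by-default for minors."""
        return "invisible" if user_age < 18 else "public"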
Sources
- "The End of Forgetting," The New York Times, July 25, 2010, by Jeffrey Rosen (law professor at George Washington University) [18]
- Web takes away "second chances": "the worst thing you've done is often the first thing everyone knows about you"
- ReputationDefender is a "promising short-term solution," but not enough given the fast advance of "facial-recognition technology"
- "Freedom of Speech and Information Privacy: The Troubling Implications of a Right to Stop People From Speaking About You", Eugene Volokh [19]
- Scope: restrictions on communication
- "recognition of one free speech exception certainly does not mean the end of free speech generally"
- "possible unintended consequences of various justifications for information privacy speech restrictions [...] sufficiently troubling"
- "disclosure" tort - bars dissemination of "nonnewsworthy" personal information that most people would find highly private (some state laws more specific, e.g. "criminal laws forbidding the publication of the names of rape victims"
- (II)D. Contracts with Children: the "discussion of contracts presupposes that both parties are legally capable of entering into the contract and of accepting a disclaimer of any implied warranty of confidentiality. If a cyber-consumer is a child, then such an acceptance might not be valid" (Source: Children's Online Privacy Protection Act of 1998 [20], 15 U.S.C. §§ 6501 et seq.; Matlick, note 245 infra; Singleton, infra note 251, text accompanying nn.76-79.)
- "Youthful Indiscretion in an Internet Age," Anupam Chander
- Supports a strengthening of the public disclosure tort
- Recognizes two principal legal hurdles:
- (1) the youth’s indiscretion may itself be of legitimate interest to the public (newsworthy)
- (2) intermediaries can escape demands to withdraw info posted by others because of special statutory immunity
- Society will not become accustomed to the problem; fascination has continued, and embarrassing behavior has not been accepted as normal over time. Embarrassment or humiliation does not turn on whether some activity is out of the ordinary or freakish (for example, bodily functions are often considered embarrassing)