Anonymity
== Crowdsourcing: Basic Background and Competing Definitions ==

''Definition'': Although crowdsourcing can have many meanings, we define it here to mean breaking down large tasks into small ones that can be performed asynchronously.
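To make this working definition concrete, here is a minimal Python sketch of the idea: one large task is split into small pieces that can be completed asynchronously and then reassembled. Every name in it is illustrative rather than part of any real crowdsourcing platform's API, and the "workers" are simulated locally.

<syntaxhighlight lang="python">
# Minimal sketch of the definition above: a large task (transcribing a long
# document) is broken into small microtasks completed asynchronously and then
# reassembled. All names are invented for illustration; on a real platform
# each chunk would be posted as a separate paid task.
from concurrent.futures import ThreadPoolExecutor

def do_microtask(chunk: str) -> str:
    """Stand-in for one worker completing one small task."""
    return chunk.upper()  # placeholder for real work

def crowdsource(document: str, chunk_size: int = 16) -> str:
    # Break the large task into small, independent pieces...
    chunks = [document[i:i + chunk_size]
              for i in range(0, len(document), chunk_size)]
    # ...let workers complete them asynchronously, in any order...
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(do_microtask, chunks))
    # ...then reassemble the completed pieces into the whole.
    return "".join(results)

print(crowdsource("many small tasks add up to one large one"))
</syntaxhighlight>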
* The Best Practices entry for crowdwork, developed last year and reposted on [[Class 3]], classifies crowdwork three ways:

''First, a large group of workers may do microtasks to complete a whole project; the best-known platform in this arena is Amazon Mechanical Turk. Second, companies may use cloudwork platforms to connect with individual workers, or a small group of workers, who then complete larger jobs (e.g., Elance and oDesk). Finally, a company may run “contests,” where numerous workers complete a task and only the speediest or best worker is paid (e.g., InnoCentive and Worth1000). In some contests, the company commits to picking at least one winner; in others, there is no such guarantee.''
*''General Information on Crowdsourcing.''
**For a quick overview by Jeff Howe, author of Crowdsourcing,[http://books.google.com/books?id=ge_0LBOcwWsC&printsec=frontcover&dq=crowdsourcing&hl=en&ei=QinGTKS9AcGAlAeHtLX-AQ&sa=X&oi=book_result&ct=result&resnum=1&ved=0CC8Q6AEwAA#v=onepage&q&f=false] take a look at this YouTube clip.[http://www.youtube.com/watch?v=F0-UtNg3ots]
**Northwestern University Professor Kris Hammond also explains crowdsourcing, but argues that its downsides lie in worker rewards and work quality.[http://www.youtube.com/watch?v=eX7RiV-wa_s&feature=related]
**Our very own Jonathan Zittrain discusses crowdsourcing in his talk, ''Minds for Sale''.[http://www.youtube.com/watch?v=Dw3h-rae3uo]
**Several individuals gathered to discuss crowdsourcing in a panel moderated by New York Times correspondent Brad Stone.[http://www.youtube.com/watch?v=lxyUaWSblaA]
*''In the News.''
**The New York Times recently ran an article on crowdsourcing featuring two crowdsourcing companies:[http://www.nytimes.com/2010/10/31/business/31digi.html?_r=1&ref=technology] Microtask[http://www.microtask.com/] and CloudCrowd.[http://www.cloudcrowd.com/]
**Notably, these companies are attempting to monetize crowdsourcing in exactly the way Howe says it cannot be monetized successfully.


== Research on Crowdsourcing ==
 
 
 
== A Framework For Analyzing Issues in Crowdsourcing ==


1. How do concerns of reputation and identity play into crowdsourced work quality?
*For some tasks, workers may want their reputations known; for others, they may prefer to remain unknown.
*A phone card/coupon-style system could decouple payment from identity; a sketch of this idea follows the list.
*Verification of workers is becoming a problem (the linked article can be accessed through the Harvard Library).[http://cacm.acm.org/magazines/2009/12/52830-crowdsourcing-and-the-question-of-expertise/fulltext]
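One way to read the two notes above: workers keep a pseudonymous reputation that is checked only when a task calls for it, and are paid with bearer coupon codes (the "phone card/coupon" idea), so payment need not reveal real-world identity. The following is a speculative Python sketch with every name invented for illustration; it reflects no actual platform.

<syntaxhighlight lang="python">
# Speculative sketch: pseudonymous reputation plus coupon-based payment.
# Illustrates that reputation can be shown when a task requires it while
# payment stays decoupled from real-world identity. Not a real API.
import secrets
from dataclasses import dataclass, field

@dataclass
class Worker:
    pseudonym: str            # stable handle; no real-world identity attached
    reputation: float = 0.0   # quality score accumulated under the pseudonym
    coupons: list = field(default_factory=list)

def may_take_task(w: Worker, needs_reputation: bool, min_rep: float = 3.0) -> bool:
    """Reputation is checked only for tasks that ask for it."""
    return w.reputation >= min_rep if needs_reputation else True

def pay(w: Worker) -> str:
    """Issue a bearer coupon code; redeeming it does not identify the worker."""
    code = secrets.token_hex(8)
    w.coupons.append(code)
    return code

anon = Worker("turker_7f3a")
print(may_take_task(anon, needs_reputation=True))   # False: no track record yet
print(may_take_task(anon, needs_reputation=False))  # True: anonymity is fine here
print(pay(anon))                                    # e.g. 'a1b2c3d4e5f60718'
</syntaxhighlight>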
2. Can we ensure work quality using (semi)automated mechanisms?
*Some have attempted to ensure quality on crowdsourced tasks by using the crowd itself to run cheat-detection mechanisms.[http://www3.informatik.uni-wuerzburg.de/TR/tr474.pdf] This can be done for both routine and complex tasks; a sketch of one such mechanism appears below.
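For illustration, here is a minimal sketch of one such mechanism, majority decision, in the spirit of the approaches discussed in the linked report: the same task goes to several workers, the majority answer is accepted, and dissenting workers are flagged for review. The code is an illustrative reduction, not the report's own implementation.

<syntaxhighlight lang="python">
# Minimal majority-decision cheat detection: give one task to several
# workers, accept the majority answer, and flag workers who disagree.
from collections import Counter

def majority_check(answers, min_agree=2):
    """answers maps worker id -> submitted answer for a single task."""
    majority, count = Counter(answers.values()).most_common(1)[0]
    if count < min_agree:        # no reliable majority; assign more workers
        return None, set()
    suspects = {w for w, a in answers.items() if a != majority}
    return majority, suspects

# Three workers label the same image; the lone dissenter is flagged.
print(majority_check({"w1": "cat", "w2": "cat", "w3": "dog"}))
# -> ('cat', {'w3'})
</syntaxhighlight>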


3. Can we enhance work quality using a targeting system?
*Possibilities include Amazon-style recommendations, eBay-style ratings (perhaps applied to Mechanical Turk), and differentiating tasks by worker skill; see the sketch below.
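A speculative sketch of the targeting idea: workers carry eBay-style ratings per task category, and a task is routed, recommendation-style, to the workers best rated for its category. The rating data model is invented purely for illustration.

<syntaxhighlight lang="python">
# Speculative targeting sketch: route a task to the workers best rated for
# its category. The rating data model is invented for illustration only.
def target(category, worker_ratings, top_n=3):
    """Return the top_n worker ids with the highest rating for this category."""
    ranked = sorted(worker_ratings,
                    key=lambda w: worker_ratings[w].get(category, 0.0),
                    reverse=True)
    return ranked[:top_n]

ratings = {
    "alice": {"translation": 4.9, "tagging": 3.1},
    "bob":   {"translation": 2.0, "tagging": 4.7},
    "carol": {"translation": 4.2},
}
print(target("translation", ratings, top_n=2))  # -> ['alice', 'carol']
</syntaxhighlight>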
