Crowdsourcing & Work Quality

  • Definition: Although crowdsourcing can have many meanings, we define it here to mean breaking down large tasks into small ones that can be performed asynchronously (a rough sketch of this decomposition appears after the overview below).
  • The Best Practices for crowdwork, developed last year and reposted on Class 3, classify crowdwork in three ways:

First, a large group of workers may do microtasks to complete a whole project; the best-known platform in this arena is Amazon Mechanical Turk. Second, companies may use cloudwork platforms to connect with individual workers, or a small group of workers, who then complete larger jobs (e.g., Elance and oDesk). Finally, a company may run “contests,” where numerous workers complete a task and only the speediest or best worker is paid (e.g., InnoCentive and Worth1000). In some contests, the company commits to picking at least one winner; in others, there is no such guarantee.

  • For a quick overview by Jeff Howe, author of Crowdsourcing [http://books.google.com/books?id=ge_0LBOcwWsC&printsec=frontcover&dq=crowdsourcing&hl=en&ei=QinGTKS9AcGAlAeHtLX-AQ&sa=X&oi=book_result&ct=result&resnum=1&ved=0CC8Q6AEwAA#v=onepage&q&f=false], take a look at this YouTube clip [http://www.youtube.com/watch?v=F0-UtNg3ots].
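
To make the definition above concrete, here is a minimal sketch of breaking a large job into microtasks, running them asynchronously, and reassembling the results. It is written in Python; the function names and the pretend labeling job are illustrative assumptions, not any real platform's API.

    # Hypothetical sketch: decompose a large job into microtasks, run them
    # asynchronously, and reassemble the results. Names are illustrative only;
    # no real crowdwork platform API is used.
    import asyncio

    async def do_microtask(worker_id: int, item: str) -> str:
        """Stand-in for one small unit of work done by one worker."""
        await asyncio.sleep(0)          # simulates work finishing at its own pace
        return f"label-for-{item}"      # e.g., a tag, transcription, or rating

    async def crowdsource(job: list[str]) -> dict[str, str]:
        """Split the job into one microtask per item and gather the results."""
        tasks = [do_microtask(i, item) for i, item in enumerate(job)]
        results = await asyncio.gather(*tasks)   # workers finish in any order
        return dict(zip(job, results))

    if __name__ == "__main__":
        job = ["photo1.jpg", "photo2.jpg", "photo3.jpg"]
        print(asyncio.run(crowdsource(job)))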

We are focusing on the quality of crowdsourced work.

1. How do concerns of reputation and identity play into crowdsourced work quality?

  • For some tasks, workers want their reputation to be known; for others, they would rather remain anonymous.
  • A phone-card/coupon-style system might let workers carry credit or a track record without revealing their identity (a rough sketch follows this list).
  • Verification of workers is becoming a problem (the linked article can be accessed through the Harvard Library).[3]
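
One way to read the phone-card/coupon idea is reputation attached to a pseudonymous token rather than to a verified legal identity. The Python sketch below is an illustrative assumption about how such a scheme could work, not a description of any existing system; the class and field names are made up.

    # Illustrative sketch: reputation tied to a pseudonymous token, so a worker
    # can build a track record without revealing who they are.
    import secrets
    from dataclasses import dataclass, field

    @dataclass
    class PseudonymousWorker:
        token: str = field(default_factory=lambda: secrets.token_hex(8))
        completed: int = 0
        accepted: int = 0

        def record(self, was_accepted: bool) -> None:
            """Log one finished task and whether the requester accepted it."""
            self.completed += 1
            if was_accepted:
                self.accepted += 1

        @property
        def reputation(self) -> float:
            """Acceptance rate; 0.0 until the worker has any track record."""
            return self.accepted / self.completed if self.completed else 0.0

    w = PseudonymousWorker()
    w.record(True)
    w.record(False)
    print(w.token, round(w.reputation, 2))   # identity never leaves the token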

2. Can we ensure work quality using (semi)automated mechanisms?

  • Some have attempted to use crowdsourcing itself, combined with cheat-detection mechanisms, to check the quality of crowdsourced work.[4] This can be done for both routine and complex tasks (a minimal sketch of one such mechanism follows).
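
One common family of cheat-detection techniques (the cited paper may use a different one) seeds known-answer, or "gold," items into each batch and flags workers whose accuracy on them is too low. In the Python sketch below, the threshold and the sample answers are made-up assumptions.

    # Illustrative cheat-detection sketch: mix known-answer ("gold") items into
    # a batch and flag workers whose accuracy on those items is suspiciously low.
    GOLD = {"q7": "cat", "q12": "dog"}   # items whose correct answers are known
    THRESHOLD = 0.75                     # assumed cutoff, chosen arbitrarily

    def gold_accuracy(answers: dict[str, str]) -> float:
        """Fraction of gold items this worker answered correctly."""
        hits = sum(1 for q, a in GOLD.items() if answers.get(q) == a)
        return hits / len(GOLD)

    def flag_suspects(submissions: dict[str, dict[str, str]]) -> list[str]:
        """Worker ids whose gold accuracy falls below the threshold."""
        return [w for w, ans in submissions.items() if gold_accuracy(ans) < THRESHOLD]

    submissions = {
        "worker_a": {"q7": "cat", "q12": "dog", "q3": "tree"},
        "worker_b": {"q7": "dog", "q12": "dog", "q3": "tree"},  # misses a gold item
    }
    print(flag_suspects(submissions))   # ['worker_b']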

3. Can we enhance work quality using a targeting system?

  • Possible models: Amazon-style recommendations? eBay-style ratings? building on Mechanical Turk? differentiating tasks by worker skill or reputation? (A rough sketch of task targeting follows.)
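
As an illustration of what a targeting system could look like, the Python sketch below routes a task to workers whose skills and rating meet its requirements, loosely in the spirit of recommendation or eBay-style rating systems. The worker data, skill tags, and rating threshold are all invented for the example.

    # Illustrative targeting sketch: route a task to workers whose skills and
    # rating meet its requirements (all data and thresholds are made up).
    from dataclasses import dataclass

    @dataclass
    class Worker:
        name: str
        skills: set[str]
        rating: float            # e.g., an eBay-style average feedback score

    @dataclass
    class Task:
        title: str
        required_skill: str
        min_rating: float

    def eligible_workers(task: Task, workers: list[Worker]) -> list[Worker]:
        """Workers with the required skill and a high enough rating, best first."""
        matches = [w for w in workers
                   if task.required_skill in w.skills and w.rating >= task.min_rating]
        return sorted(matches, key=lambda w: w.rating, reverse=True)

    workers = [Worker("alice", {"translation", "tagging"}, 4.8),
               Worker("bob", {"tagging"}, 3.9)]
    task = Task("Translate a paragraph", "translation", 4.0)
    print([w.name for w in eligible_workers(task, workers)])   # ['alice']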