Definition: Although crowdsourcing can have many meanings, we define it here to mean breaking down large tasks into small ones that can be performed asynchronously.
- The Best Practices entry for crowdwork, developed last year and reposted for Class 3, classifies crowdwork in three ways:
First, a large group of workers may do microtasks to complete a whole project; the best-known platform in this arena is Amazon Mechanical Turk. Second, companies may use cloudwork platforms to connect with individual workers, or a small group of workers, who then complete larger jobs (e.g., Elance and oDesk). Finally, a company may run “contests,” where numerous workers complete a task and only the speediest or best worker is paid (e.g., InnoCentive and Worth1000). In some contests, the company commits to picking at least one winner; in others, there is no such guarantee.
- General Information on Crowdsourcing.
- For a quick overview by Jeff Howe, author of Crowdsourcing, take a look at this YouTube clip.
- Northwestern University Professor Kris Hammond also explains crowdsourcing, but argues that its chief downsides are low worker rewards and uneven work quality.
- Our very own Jonathan Zittrain discusses crowdsourcing in his talk, Minds for Sale.
- Several individuals gathered to discuss crowdsourcing in a panel moderated by New York Times correspondent Brad Stone.
- In the News.
A Framework For Analyzing Issues in Crowdsourcing
1. How do concerns of reputation and identity play into crowdsourced work quality?
- For some tasks, workers want their reputations to be known; for others, they do not.
- phone card/coupon system
- Verification of workers is becoming a problem (can access the linked article through Harvard Library).
2. Can we ensure work quality using (semi)automated mechanisms?
- Some have attempted to use crowdsourcing itself to ensure quality on crowdsourced tasks, using cheat-detection mechanisms. This can be done for both routine and complex tasks.
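One common cheat-detection mechanism of this kind seeds a batch of tasks with "gold standard" questions whose answers are already known, flags workers who miss them, and aggregates the remaining answers by majority vote. The sketch below illustrates the idea; the question names, gold answers, and thresholds are all invented for illustration, not drawn from any particular platform.

```python
from collections import Counter

# Hypothetical gold-standard questions with known answers (assumed data).
GOLD = {"q1": "cat", "q2": "dog"}

def is_suspect(answers, gold=GOLD, max_misses=0):
    """Flag a worker whose gold-question answers miss more than max_misses."""
    misses = sum(1 for q, a in gold.items() if answers.get(q) != a)
    return misses > max_misses

def majority_vote(worker_answers, question):
    """Aggregate one question's label by simple plurality across workers."""
    votes = Counter(w[question] for w in worker_answers if question in w)
    return votes.most_common(1)[0][0]

workers = [
    {"q1": "cat", "q2": "dog", "q3": "bird"},    # passes gold checks
    {"q1": "cat", "q2": "dog", "q3": "bird"},    # passes gold checks
    {"q1": "fish", "q2": "fish", "q3": "fish"},  # spammer: fails gold checks
]

# Drop suspect workers, then take the majority label for the unknown question.
trusted = [w for w in workers if not is_suspect(w)]
print(majority_vote(trusted, "q3"))  # bird
```

Gold-standard filtering works best for routine tasks with objectively checkable answers; for complex tasks, platforms typically fall back on peer review or redundant grading rather than automated checks.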
3. Can we enhance work quality using a targeting system?
- Possible models include Amazon-style recommendations and eBay-style ratings; could Mechanical Turk adopt something similar, or differentiate tasks by worker skill?
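One way such a targeting system could work is to route harder tasks only to workers whose eBay-style rating clears a difficulty-scaled threshold. The sketch below is a minimal illustration of that idea; the worker names, ratings, and threshold formula are assumptions, not a description of any real platform.

```python
# Hypothetical reputation-based task targeting.
# Ratings are on a 0-5 scale, eBay-style (invented example data).
workers = {"alice": 4.9, "bob": 3.2, "carol": 4.5}

def eligible_workers(task_difficulty, workers, threshold_per_level=1.5):
    """Return workers whose rating clears a difficulty-scaled cutoff.

    A task of difficulty 3 requires a rating of at least 3 * 1.5 = 4.5.
    """
    cutoff = task_difficulty * threshold_per_level
    return sorted(name for name, rating in workers.items() if rating >= cutoff)

print(eligible_workers(3, workers))  # ['alice', 'carol']
print(eligible_workers(1, workers))  # every worker clears the low bar
```

Differentiating tasks this way trades throughput for quality: fewer workers qualify for hard tasks, so those tasks take longer to complete but are less likely to need rework.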