Definition: Although crowdsourcing can have many meanings, we define it here to mean breaking down large tasks into small ones that can be performed asynchronously.
- The Best Practices for crowdwork, developed last year and reposted on Class 3, classify crowdwork three ways:
1. Microtasking: a large group of workers completes microtasks that together make up a whole project; the best-known platform in this arena is Amazon Mechanical Turk.
2. Cloudwork: companies use cloudwork platforms to connect with individual workers, or a small group of workers, who then complete larger jobs (e.g., Elance and oDesk).
3. Contests: numerous workers complete a task and only the speediest or best worker is paid (e.g., InnoCentive and Worth1000). In some contests, the company commits to picking at least one winner; in others, there is no such guarantee.
- For a quick overview by Jeff Howe, author of Crowdsourcing, take a look at this YouTube clip.
- Northwestern University Professor Kris Hammond also explains crowdsourcing, but argues its downsides are worker rewards and quality.
- In the News. The New York Times recently ran an article on crowdsourcing featuring two crowdsourcing companies: Microtask and CloudCrowd.
- Notably, these companies are attempting to monetize crowdsourcing in exactly the way Howe says it cannot be monetized successfully.
A Framework For Analyzing Issues in Crowdsourcing
1. How do concerns of reputation and identity play into crowdsourced work quality?
- Some tasks call for workers' reputations to be known; others work better when workers remain anonymous.
- phone card/coupon system
- Verification of workers is becoming a problem (the linked article is accessible through the Harvard Library).
2. Can we ensure work quality using (semi)automated mechanisms?
- Some have attempted to use crowdsourcing itself to ensure quality on crowdsourced tasks, using cheat-detection mechanisms. This can be done for both routine and complex tasks.
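One common cheat-detection mechanism (a sketch under assumptions, not a method named in the notes) is to seed a batch of tasks with "gold" questions whose answers are already known, then flag workers whose accuracy on the gold questions falls below a threshold. The function name, data layout, and threshold below are illustrative assumptions:

```python
def flag_cheaters(responses, gold_answers, min_accuracy=0.8):
    """Flag workers who score below min_accuracy on seeded gold tasks.

    responses:    {worker_id: {task_id: answer}} -- all submitted answers.
    gold_answers: {task_id: correct_answer} -- only the seeded tasks.
    """
    flagged = []
    for worker, answers in responses.items():
        # Grade only the gold tasks this worker actually answered.
        graded = [answers[t] == a for t, a in gold_answers.items() if t in answers]
        if graded and sum(graded) / len(graded) < min_accuracy:
            flagged.append(worker)
    return flagged

responses = {
    "w1": {"g1": "cat", "g2": "dog", "t1": "bird"},  # 2/2 gold correct
    "w2": {"g1": "cat", "g2": "cat", "t1": "fish"},  # 1/2 gold correct
}
gold = {"g1": "cat", "g2": "dog"}
print(flag_cheaters(responses, gold))  # → ['w2']
```

Because grading is automatic once the gold answers exist, this scales to routine tasks; for complex tasks, the "gold" step is often replaced by having other workers review the output.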
3. Can we enhance work quality using a targeting system?
- Possible models: Amazon-style recommendations, eBay-style ratings. Could Mechanical Turk adopt these, or differentiate among tasks?
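A minimal sketch of what such a targeting system could look like, assuming an eBay-style track record per task category (the data layout and function name are illustrative, not drawn from any platform's actual API): route each new task to the workers with the best accuracy in that task's category.

```python
def best_workers(history, category, top_n=2):
    """Return the top_n workers by past accuracy in a task category.

    history: {worker_id: {category: (correct, total)}} -- per-category
    counts of correctly completed vs. total completed tasks.
    """
    scores = []
    for worker, cats in history.items():
        correct, total = cats.get(category, (0, 0))
        if total:  # skip workers with no record in this category
            scores.append((correct / total, worker))
    scores.sort(reverse=True)
    return [worker for _, worker in scores[:top_n]]

history = {
    "w1": {"translation": (45, 50)},  # 90% accurate
    "w2": {"translation": (30, 50)},  # 60% accurate
    "w3": {"translation": (48, 50)},  # 96% accurate
}
print(best_workers(history, "translation"))  # → ['w3', 'w1']
```

The same scores could instead be shown to requesters as ratings (the eBay model) rather than used to route tasks automatically; either way, differentiating tasks by category keeps a worker's strong record in one skill from masking a weak record in another.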