- Definition: Although crowdsourcing can have many meanings, we define it here to mean breaking down large tasks into small ones that can be performed asynchronously.
- The Best Practices for crowdwork, developed last year and reposted for Class 3, classify crowdwork in three ways:
First, a large group of workers may do microtasks to complete a whole project; the best-known platform in this arena is Amazon Mechanical Turk. Second, companies may use cloudwork platforms to connect with individual workers, or a small group of workers, who then complete larger jobs (e.g., Elance and oDesk). Finally, a company may run “contests,” where numerous workers complete a task and only the speediest or best worker is paid (e.g., InnoCentive and Worth1000). In some contests, the company commits to picking at least one winner; in others, there is no such guarantee.
We are focusing on the quality of crowdsourced work.
1. How do concerns of reputation and identity play into crowdsourced work quality?
- For some tasks, requesters want workers' reputations to be known; for others, anonymity may be acceptable or even preferred (see the sketch after this list).
- Phone-card/coupon payment systems.
- Verification of workers is becoming a problem (the linked article is accessible through Harvard Library).
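To make the reputation discussion concrete, here is a minimal sketch of a worker-reputation score, assuming the platform logs each worker's approved/rejected task outcomes. The class, the prior, and the smoothing weight are hypothetical illustrations, not any platform's actual API.

```python
# Hypothetical sketch: a smoothed reputation score built from a worker's
# approval history. All names and parameters are illustrative assumptions.
from collections import defaultdict

class ReputationTracker:
    def __init__(self):
        # worker_id -> [approved_count, total_count]
        self.history = defaultdict(lambda: [0, 0])

    def record(self, worker_id: str, approved: bool) -> None:
        """Log one completed task outcome for a worker."""
        self.history[worker_id][1] += 1
        if approved:
            self.history[worker_id][0] += 1

    def score(self, worker_id: str, prior: float = 0.5, weight: int = 5) -> float:
        """Approval rate smoothed toward a prior, so workers with little
        history are neither fully trusted nor heavily penalized."""
        approved, total = self.history[worker_id]
        return (approved + prior * weight) / (total + weight)

tracker = ReputationTracker()
tracker.record("worker_42", approved=True)
tracker.record("worker_42", approved=False)
print(tracker.score("worker_42"))  # 0.5: little history, so close to the prior
```

The smoothing also hints at why identity matters: if workers can cheaply create new accounts, a bad reputation can simply be discarded, which is one reason verification keeps coming up.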
2. Can we ensure work quality using (semi)automated mechanisms?
- Some have attempted to use crowdsourcing itself to ensure quality on crowdsourced tasks, via cheat-detection mechanisms; this can be done for both routine and complex tasks (see the sketch below).
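As a concrete illustration, here is a minimal sketch of two standard (semi)automated checks: seeding "gold" questions with known answers to flag cheaters, and majority voting over redundant judgments. The task names, answers, and flagging threshold are illustrative assumptions, not drawn from any particular platform.

```python
# Hypothetical sketch of two common (semi)automated quality mechanisms:
# (1) gold-standard questions with known answers, used to flag cheaters, and
# (2) majority voting across redundant judgments of the same task.
from collections import Counter

GOLD_ANSWERS = {"task_7": "cat", "task_19": "dog"}  # seeded known-answer tasks

def gold_accuracy(worker_answers: dict) -> float:
    """Fraction of the seeded gold tasks this worker answered correctly."""
    golds = [t for t in worker_answers if t in GOLD_ANSWERS]
    if not golds:
        return 1.0  # no gold evidence either way
    correct = sum(worker_answers[t] == GOLD_ANSWERS[t] for t in golds)
    return correct / len(golds)

def majority_vote(judgments: list) -> str:
    """Aggregate redundant labels for one task by plurality."""
    return Counter(judgments).most_common(1)[0][0]

answers = {"task_7": "cat", "task_19": "cow", "task_3": "bird"}
if gold_accuracy(answers) < 0.7:  # threshold is an assumption
    print("flag worker for review")
print(majority_vote(["cat", "cat", "dog"]))  # -> "cat"
```

Gold questions work well for routine tasks with objective answers; for complex or creative tasks, redundancy plus peer review tends to substitute for exact-match checking.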
3. Can we enhance work quality using a targeting system?
- Possible models: Amazon-style recommendations, eBay-style ratings; could Mechanical Turk adopt something similar, or differentiate tasks by worker skill? (See the sketch below.)
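A rough sketch of what such a targeting system might look like: route each task type to the workers with the best track record on that type, loosely in the spirit of Amazon-style recommendations. All worker names, task types, and accuracy numbers below are made up for illustration.

```python
# Hypothetical sketch of a targeting system: rank workers for a task type
# by their past accuracy on that type. All data here is illustrative.

# worker -> {task_type: past accuracy on that type}
SKILL = {
    "alice": {"translation": 0.95, "image_label": 0.60},
    "bob":   {"translation": 0.70, "image_label": 0.90},
}

def target_workers(task_type: str, k: int = 1, default: float = 0.5) -> list:
    """Return the top-k workers for this task type; workers with no
    history on the type fall back to a neutral prior."""
    ranked = sorted(SKILL, key=lambda w: SKILL[w].get(task_type, default),
                    reverse=True)
    return ranked[:k]

print(target_workers("translation"))  # -> ['alice']
print(target_workers("image_label"))  # -> ['bob']
```

The design choice mirrors the recommendation analogy: rather than rating workers globally, the system differentiates by task type, so a strong translator is not automatically favored for image labeling.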