Latest revision as of 11:36, 24 November 2010
== Crowdsourcing: Basic Background and Competing Definitions ==
Definition: Although crowdsourcing can have many meanings, we define it here to mean breaking down large tasks into small ones that can be performed asynchronously.
- The Best Practices entry for crowdwork, developed last year and reposted on Class 3, classifies crowdwork three ways:
First, a large group of workers may do microtasks to complete a whole project; the best-known platform in this arena is Amazon Mechanical Turk. Second, companies may use cloudwork platforms to connect with individual workers, or a small group of workers, who then complete larger jobs (e.g., Elance and oDesk). Finally, a company may run “contests,” where numerous workers complete a task and only the speediest or best worker is paid (e.g., InnoCentive and Worth1000). In some contests, the company commits to picking at least one winner; in others, there is no such guarantee.
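The first model above — a large job decomposed into microtasks that workers complete asynchronously — can be sketched in a few lines of Python. This is an illustrative toy, not any real platform's API: the "workers" are simulated with a thread pool, and the transcription task is a placeholder.

```python
# Hypothetical sketch of the microtask model: a large job is split into
# small fragments, each completed asynchronously by a different "worker"
# (simulated here with a thread pool), and the results are reassembled.
from concurrent.futures import ThreadPoolExecutor

def transcribe_fragment(fragment: str) -> str:
    """Stand-in for a microtask, e.g. transcribing one snippet of text."""
    return fragment.upper()  # trivial placeholder "work"

document = "the quick brown fox jumps over the lazy dog"
fragments = document.split()  # break the large task into small ones

# Workers pick up fragments independently and in no guaranteed order...
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(transcribe_fragment, fragments))

# ...but the requester reassembles them into the completed job.
completed = " ".join(results)
print(completed)  # THE QUICK BROWN FOX JUMPS OVER THE LAZY DOG
```

Note that `pool.map` returns results in the order the fragments were submitted, so reassembly is trivial here; a real platform must also handle stragglers and unreturned tasks.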
- General Information on Crowdsourcing.
- For a quick overview by Jeff Howe, author of Crowdsourcing,[1] take a look at this YouTube clip.[2]
- Northwestern University Professor Kris Hammond also explains crowdsourcing, but argues that its downsides lie in worker rewards and work quality.[3]
- Our very own Jonathan Zittrain discusses crowdsourcing in his talk, Minds for Sale.[4]
- Several individuals gathered to discuss crowdsourcing in a panel moderated by New York Times correspondent Brad Stone.[5]
- In the News.
  - The New York Times recently ran an article on crowdsourcing featuring two crowdsourcing companies:[http://www.nytimes.com/2010/10/31/business/31digi.html?_r=1&ref=technology] Microtask[http://www.microtask.com/] and CloudCrowd.[http://www.cloudcrowd.com/]
  - It's interesting to note that these companies are attempting to monetize crowdsourcing in exactly the way in which Howe says it cannot be monetized successfully.
== Research on Crowdsourcing ==
== A Framework For Analyzing Issues in Crowdsourcing ==
1. How do concerns of reputation and identity play into crowdsourced work quality?
- For some tasks, workers may want their reputations known; for others, not.
- phone card/coupon system
- Verification of workers is becoming a problem (can access the linked article through Harvard Library).[9]
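The notes above contrast tasks where workers' reputations should be visible with tasks where they should stay hidden. A toy sketch of that distinction, with all names and thresholds invented for illustration: the platform tracks reputation for every task, but only exposes it to requesters for task types flagged as reputation-sensitive.

```python
# Toy model: reputation is always tracked internally, but only revealed
# to requesters for task types that opt in. All names here are assumptions.
REPUTATION_SENSITIVE = {"content_moderation", "legal_review"}  # assumed set

class Worker:
    def __init__(self, name):
        self.name = name
        self._reputation = 0.0  # tracked internally for every task

    def record_result(self, approved: bool):
        # simple running adjustment; a real platform would be subtler
        self._reputation += 1.0 if approved else -1.0

    def profile_for(self, task_type: str):
        # expose reputation only where the task type calls for it
        if task_type in REPUTATION_SENSITIVE:
            return {"name": self.name, "reputation": self._reputation}
        return {"name": self.name}

w = Worker("w123")
w.record_result(True)
print(w.profile_for("image_tagging"))  # {'name': 'w123'}
print(w.profile_for("legal_review"))   # {'name': 'w123', 'reputation': 1.0}
```

The design question the notes raise — which tasks belong in the reputation-sensitive set, and who decides — is exactly what such a mechanism leaves open.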
2. Can we ensure work quality using (semi)automated mechanisms?
- Some have attempted to use crowdsourcing to ensure quality on crowdsourced tasks using cheat detection mechanisms.[10] This can be done for both routine and complex tasks.
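One common (semi)automated quality mechanism of the kind gestured at above can be sketched briefly: seed known-answer "gold" questions among the real tasks, discard workers who miss them, and aggregate the remaining answers by majority vote. The names, data, and threshold below are illustrative assumptions, not the cited paper's method.

```python
# Sketch of gold-question cheat detection plus majority voting.
# GOLD questions have known answers; workers who miss them are dropped.
from collections import Counter

GOLD = {"q_gold_1": "cat", "q_gold_2": "dog"}  # known-answer check questions

def reliable(worker_answers: dict, max_misses: int = 0) -> bool:
    misses = sum(1 for q, a in GOLD.items() if worker_answers.get(q) != a)
    return misses <= max_misses

def majority_answer(all_answers: list, question: str) -> str:
    votes = [w[question] for w in all_answers
             if reliable(w) and question in w]
    return Counter(votes).most_common(1)[0][0]

workers = [
    {"q_gold_1": "cat", "q_gold_2": "dog", "q1": "blue"},  # honest
    {"q_gold_1": "cat", "q_gold_2": "dog", "q1": "blue"},  # honest
    {"q_gold_1": "dog", "q_gold_2": "cat", "q1": "red"},   # random clicker
]
print(majority_answer(workers, "q1"))  # blue
```

The random clicker fails both gold questions and is excluded, so the honest majority's answer wins; this works for routine tasks, while complex tasks need subtler aggregation than a flat vote.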
3. Can we enhance work quality using a targeting system?
- Possibilities include Amazon-style recommendations and eBay-style ratings; could Mechanical Turk differentiate among tasks?
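The targeting idea above can be made concrete with a minimal sketch: route each task to workers whose skills match and whose eBay-style rating clears a per-task bar, best-rated first. All field names and thresholds here are invented for illustration.

```python
# Toy targeting system: filter workers by skill match and minimum rating,
# then rank by rating so the task is offered to the best matches first.
def eligible_workers(task, workers):
    return sorted(
        (w for w in workers
         if task["skill"] in w["skills"] and w["rating"] >= task["min_rating"]),
        key=lambda w: w["rating"],
        reverse=True,  # best-rated matches first
    )

workers = [
    {"name": "alice", "skills": {"translation"}, "rating": 4.9},
    {"name": "bob",   "skills": {"translation", "tagging"}, "rating": 3.2},
    {"name": "carol", "skills": {"tagging"}, "rating": 4.5},
]
task = {"skill": "translation", "min_rating": 4.0}
print([w["name"] for w in eligible_workers(task, workers)])  # ['alice']
```

Bob matches the skill but falls below the rating bar, so only Alice is targeted; a fuller system would also differentiate tasks by difficulty rather than using a single threshold.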