Crowdsourcing: "a neologistic compound of Crowd and Outsourcing for the act of taking tasks traditionally performed by an employee or contractor, and outsourcing them to a group of people or community, through an 'open call' to a large group of people (a crowd) asking for contributions." – Wikipedia
The concept of crowdsourcing has attracted a lot of press, but over the last few months its visibility seems to have waned slightly. Two factors appear to be involved: first, there hasn't been much in the way of "hot" news about crowdsourcing; second, for all but simple problems, most crowdsourcing attempts have been disappointing.
While defining the work to be done has been a major issue with what I'll call crowdsourcing 1.0, the bigger problem has been the quality of the results.
Consider Amazon's Mechanical Turk (MTurk). There are really only two controls that you have over quality. One is that you can require workers to have an "approval rate" equal to or greater than some value (the approval rate is the percentage of work units that have not been rejected). This is problematic because the approval rate isn't necessarily accurate: those assigning the work may not bother to reject or give feedback on problematic responses.
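To make the mechanism concrete, here's a minimal sketch of attaching an approval-rate requirement to a HIT. It uses Amazon's current boto3 MTurk client, which postdates this article, but the underlying qualification mechanism is the same; the title, reward, and threshold values are illustrative:

```python
# Minimal sketch: require a >= 95% approval rate on an MTurk HIT (boto3).
import boto3

mturk = boto3.client("mturk", region_name="us-east-1")

# "000000000000000000L0" is MTurk's built-in
# Worker_PercentAssignmentsApproved qualification type.
approval_requirement = {
    "QualificationTypeId": "000000000000000000L0",
    "Comparator": "GreaterThanOrEqualTo",
    "IntegerValues": [95],  # only workers with >= 95% approval may accept
}

hit = mturk.create_hit(
    Title="Categorize an image",            # illustrative task
    Description="Label the image with one category",
    Reward="0.02",                          # dollars per assignment
    MaxAssignments=3,                       # redundancy for cross-comparison
    LifetimeInSeconds=86400,
    AssignmentDurationInSeconds=600,
    QualificationRequirements=[approval_requirement],
    Question=open("question.xml").read(),   # QuestionForm/ExternalQuestion XML
)
```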
The other control is to assign multiple workers to the same work unit and then automatically cross-compare their results to detect conflicting answers. The problem here is that the comparison can only be done for relatively simple cases; many types of response aren't amenable to string or numeric comparison.
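A sketch of that cross-comparison idea, and of why it breaks down, follows (the function names here are illustrative, not any platform's API):

```python
# Majority-vote consensus across redundant assignments for one work unit.
from collections import Counter

def consensus(answers, min_agreement=2):
    """Return the majority answer, or None if workers conflict."""
    if not answers:
        return None
    answer, votes = Counter(answers).most_common(1)[0]
    return answer if votes >= min_agreement else None

# Works for exact-match responses like category labels...
print(consensus(["cat", "cat", "dog"]))  # -> "cat"

# ...but free-text answers rarely match exactly, which is the
# limitation described above.
print(consensus(["A nice photo", "Nice photo!", "a photo"]))  # -> None
```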
In what CloudCrowd describes as crowdsourcing 2.0, the company uses a crowd of vetted workers who are given a credibility rating based not only on customer response to their output but also on assessment by CloudCrowd employees and peer review by other workers.
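CloudCrowd hasn't published how those three signals are combined, so the following is a purely hypothetical sketch of a blended credibility score; the weights and signal names are invented for illustration:

```python
# Hypothetical blend of customer, staff, and peer signals into one rating.
def credibility(customer_score, staff_score, peer_score,
                weights=(0.5, 0.3, 0.2)):
    """Each input in [0, 1]; returns a weighted blend in [0, 1]."""
    w_cust, w_staff, w_peer = weights
    return w_cust * customer_score + w_staff * staff_score + w_peer * peer_score

print(credibility(0.9, 0.8, 0.95))  # -> 0.88
```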
What CloudCrowd brings to the business is consultation to determine and define the problem to be solved, followed by management of the project's execution. The big gain is that the customer no longer has to wrestle with expressing their problem in terms of a system of pre-packaged solutions, and turnaround is faster because experts implement and manage the workflow.
Think of this not as workload outsourcing but as business process outsourcing. For example, a copy-editing process with CloudCrowd can result in as much as a 90% reduction in turnaround time.
CloudCrowd charges a setup fee to establish the problem and define the solution; pricing is based on the required level of worker skill and the scale of the problem.
CloudCrowd workers, who currently number over 20,000 worldwide and interface with the company via a Facebook application, are paid according to their specific skills and the difficulty of the problem. Thus, identifying pornographic images might be priced at $0.02 per work unit, while writing a short script for an advertising video might be worth $5. This equates to hourly rates from $3 or $4 per hour to start, up to $10 or more for more skilled workers.
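A quick back-of-the-envelope check shows how per-unit prices map to those hourly rates; the throughput figure here is an assumption, not a CloudCrowd number:

```python
# Per-unit price x units completed per hour = effective hourly rate.
units_per_hour = 175    # assumed pace for quick image checks
price_per_unit = 0.02   # $0.02 per work unit, as above
print(f"${units_per_hour * price_per_unit:.2f}/hour")  # -> $3.50/hour
```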
CloudCrowd, founded last year, already claims to have executed over 500,000 tasks for a wide range of clients.
In the crowdsourcing 2.0 world there are only two significant players so far, CloudCrowd and CrowdFlower, but it's pretty certain that more competitors will appear quickly. This is definitely a hot market.