Crowdsourced Data: When to Use Curated Crowds vs. Crowdsourcing

“When should I use crowdsourcing and when should I use a curated crowd?” This is the question anyone staffing human annotation tasks should be asking, but many don’t, because they don’t know there are two different options for crowdsourced data. So let’s start there, by defining the options. Assuming you need human annotation, for example for search relevance evaluation, there are two ways to gather the humans who will do that work:

1. Crowdsourcing, where the task is made available to a large crowd with no training or management beyond a very limited set of task instructions and possibly a simple screening test; or

2. Curated crowds, where a smaller group is selected and managed to complete the task accurately according to quality guidelines.

The power of crowdsourcing is in its numbers. You can accomplish a lot quickly because many hands make light work: a hundred thousand people can do quite a bit more than a hundred can. The cost is lower because crowdsourcing typically pays only a few pennies per task. Most members of the crowd aren’t trying to make a living; they’re just trying to make a few extra bucks in their spare time. There’s also usually little overhead involved, because the crowd looks after itself. You put the task out there, and if it’s interesting enough and pays enough, the crowd will get it done.

With these advantages come a few limitations. First, quality control is extremely limited. Without it, you must rely on clear instructions, automated understanding checks, and high overlap to get data you can trust. Overlap matters because there will always be noise in the crowd, so you’ll likely pay for at least five members of the crowd to review each item. Some members of the crowd will also try to game the system, even using bots to do their work for them, so you’ll need screening tests on top of that high overlap.

The second limitation is your lack of control over task completion.
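To make the overlap idea concrete, here is a minimal sketch of how redundant judgments are typically aggregated. The function name, labels, and agreement threshold are hypothetical choices for illustration, not a fixed industry standard:

```python
from collections import Counter

def aggregate_judgments(judgments, min_agreement=3):
    """Majority-vote a list of crowd judgments for a single item.

    Returns (label, is_trusted): the winning label and whether it
    cleared the agreement threshold. A threshold of 3 out of an
    overlap of 5 is an assumption for this example.
    """
    counts = Counter(judgments)
    label, votes = counts.most_common(1)[0]
    return label, votes >= min_agreement

# With an overlap of five, three matching judgments form a usable majority
# even when two noisy workers disagree.
label, trusted = aggregate_judgments(
    ["relevant", "relevant", "not relevant", "relevant", "not relevant"]
)
# label == "relevant", trusted == True
```

Note that you pay for all five judgments to obtain that one trusted label, which is where crowdsourcing’s low per-task price gets partially eaten back.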
The crowd can get a lot done quickly, but it will only get your task done quickly if it wants to. If your task is more difficult or less exciting than the shiny new task another team is offering, you’ll have to incentivize the crowd with higher payment. The crowd has made no commitment to you and may not be motivated to meet a deadline. And if you’re looking for data in smaller markets, you’re out of luck: crowdsourcing is huge in the U.S., but the same doesn’t hold true globally.

Curated crowds, on the other hand, are all about quality. With this solution you have a group of people who are dedicated, if not specifically to your task, then to similar tasks. People in curated crowds become experts in search relevance evaluation, social media evaluation, or whatever type of human annotation they work on. This is not simply a matter of counting on accumulated experience to ensure quality, although that experience does play a large role. The key to quality is constant checks and balances: members of curated crowds are held to quality metrics, receive quality feedback, and are removed from your task if they don’t deliver the required quality. That means you can use very little overlap, paying for each judgment only one to three times instead of five or more, because you can trust the data each person delivers.

Curated crowd providers also monitor productivity and throughput, ensuring the crowd meets its weekly, daily, and hourly commitments so that you have the data you need when you need it. And if you’ve chosen a good vendor, the crowd’s manager will be an invaluable resource, leveraging years of experience to partner with you in building out tasks and guidelines based on your needs.

With this higher level of quality and productivity management comes a cost. Curated crowds cost more than crowdsourcing because this work is typically a primary source of income for the annotators.
You also pay for the quality oversight that you don’t get with crowdsourcing. Keep in mind, though, that lower overlap mitigates these costs, because you aren’t paying for each collected data point multiple times.

Apart from the financial cost, curated crowds also require more commitment from you in exchange for the greater commitment you get. They will be happiest, and will keep their skills sharpest, when you provide work for them consistently. That said, if the natural ebb and flow of your need for human annotation requires more flexibility, there are alternatives, such as sharing a flexible curated crowd with other teams running similar tasks.

So when should you use crowdsourcing and when should you use a curated crowd? Crowdsourcing is great for simple tasks that can be adequately explained in two or three sentences. You’ll get a lot done quickly, but be prepared to raise the pay rate if you have a tight deadline and the crowd doesn’t find your task sexy enough. On the other hand, if you have a more complex task, particularly a longer-term or ongoing one that dedicated people can build expertise on over time, then a curated crowd is for you. Either way, be sure you fully understand your options so that you can make the best choice for your business.
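The overlap point is worth working through with numbers. The per-judgment rates below are hypothetical, chosen only to show how lower overlap can largely offset a higher per-judgment price:

```python
def cost_per_trusted_label(rate_per_judgment, overlap):
    """Effective cost of one aggregated, trusted data point.

    rate_per_judgment: what you pay one worker for one judgment
    overlap: how many workers judge each item
    Both inputs here are illustrative assumptions, not real vendor pricing.
    """
    return rate_per_judgment * overlap

# Crowdsourcing: pennies per judgment, but high overlap to absorb noise.
crowdsourced = cost_per_trusted_label(rate_per_judgment=0.05, overlap=5)   # ~$0.25

# Curated crowd: higher rate per judgment, but overlap of only 1-3.
curated = cost_per_trusted_label(rate_per_judgment=0.15, overlap=2)        # ~$0.30
```

Under these assumed rates, the curated crowd is only modestly more expensive per trusted label, even though its per-judgment rate is three times higher, because you buy each data point twice instead of five times.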