Abstract
This article explores the viability of online crowdsourcing for creating matched-comparison groups. This exploratory study compares survey results from a randomized control group with survey results from a matched-comparison group recruited through Amazon's Mechanical Turk (MTurk) crowdsourcing service to assess whether the two groups yield comparable results. Study findings indicate that online crowdsourcing, a process that recruits large numbers of online participants to complete specific tasks, is a potentially viable resource for evaluation designs in which access to comparison groups, large budgets, or time is limited. The article highlights the strengths and limitations of the online crowdsourcing approach and describes ways it could be used in evaluation practice.
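The abstract does not specify how the matched-comparison group was constructed, but a common way to match a pool of crowdsourced respondents to a program group is propensity-score matching. The sketch below is illustrative only, not the article's method: it assumes two hypothetical data frames of numeric survey covariates, `treated` (the program group) and `pool` (MTurk respondents), and performs greedy 1:1 nearest-neighbor matching on an estimated propensity score.

```python
"""Illustrative sketch (not from the article): forming a matched-comparison
group from crowdsourced respondents via propensity-score matching.
`treated` and `pool` are hypothetical DataFrames sharing numeric covariates."""
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def match_comparison_group(treated: pd.DataFrame, pool: pd.DataFrame) -> pd.DataFrame:
    """Return one pool respondent per treated case, matched 1:1 without
    replacement on the estimated propensity score."""
    X = pd.concat([treated, pool], ignore_index=True)
    y = np.r_[np.ones(len(treated)), np.zeros(len(pool))]  # 1 = program group
    # Propensity score: estimated probability of program membership
    # given the observed covariates.
    ps = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
    ps_treated, ps_pool = ps[:len(treated)], ps[len(treated):]
    matched, available = [], list(range(len(pool)))
    for score in ps_treated:
        # Greedy nearest neighbor among still-unmatched pool members.
        j = min(available, key=lambda i: abs(ps_pool[i] - score))
        available.remove(j)
        matched.append(j)
    return pool.iloc[matched]
```

In a design like the one the abstract describes, the surveyed outcomes of the matched pool members would then be compared against those of the randomized control group to gauge comparability.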
