Today I read a paper titled “Crowdsourcing for Usability Testing”.
The abstract is:
While usability evaluation is critical to designing usable websites, traditional usability testing can be both expensive and time-consuming.
The advent of crowdsourcing platforms such as Amazon Mechanical Turk and CrowdFlower offers an intriguing new avenue for performing remote usability testing with potentially many users, quick turn-around, and significant cost savings.
To investigate the potential of such crowdsourced usability testing, we conducted two similar (though not completely parallel) usability studies which evaluated a graduate school’s website: one via a traditional usability lab setting, and the other using crowdsourcing.
While we find crowdsourcing exhibits some notable limitations in comparison to the traditional lab environment, its applicability and value for usability testing are clearly evidenced.
We discuss both methodological differences for crowdsourced usability testing and empirical contrasts to results from more traditional, face-to-face usability testing.
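
To make the crowdsourced side of such a study concrete, here is a minimal sketch of how a usability task might be posted to Amazon Mechanical Turk using the boto3 MTurk client. The task URL, title, reward, and worker counts below are hypothetical placeholders for illustration, not values from the paper.

```python
# Minimal sketch: posting a usability-testing task (a HIT) to Amazon
# Mechanical Turk with boto3. All task details are hypothetical
# placeholders, not the parameters used in the paper's study.
import boto3

client = boto3.client(
    "mturk",
    region_name="us-east-1",
    # The sandbox endpoint lets you try HITs without paying real workers.
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# An ExternalQuestion points workers at a task hosted elsewhere, e.g. a
# page that walks them through find-the-information tasks on the site
# under test and then asks follow-up questions.
question_xml = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.com/usability-task</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>
"""

hit = client.create_hit(
    Title="Find information on a graduate school's website",
    Description="Complete 3 short tasks on a website, then answer a brief questionnaire.",
    Keywords="usability, website, survey",
    Reward="0.50",                      # USD per assignment (hypothetical)
    MaxAssignments=30,                  # number of distinct workers (hypothetical)
    LifetimeInSeconds=3 * 24 * 3600,    # how long the HIT stays listed
    AssignmentDurationInSeconds=30 * 60,
    Question=question_xml,
)
print("HIT created:", hit["HIT"]["HITId"])
```

This is part of what gives crowdsourcing its speed and cost advantage over a lab study: once the task page exists, recruiting dozens of participants is a single API call rather than weeks of scheduling.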