One Shot Crowdtesting: Approaching the Extremes of Crowdsourced Subjective Quality Testing

Michael Seufert, Tobias Hoßfeld


Crowdsourcing studies for subjective quality testing have become a particularly useful tool for Quality of Experience researchers. Typically, crowdsourcing studies are conducted by many unsupervised workers, who rate the perceived quality of several test conditions during one session (mixed within-subject test design). However, such studies often prove to be very sensitive, for example, to test instructions, test design, and the filtering of unreliable participants. Moreover, exposing a single worker to several test conditions potentially leads to an implicit training and anchoring of ratings. Therefore, this work investigates the extreme case of presenting only a single test condition to each worker (completely between-subjects test design). The results are compared to a typical crowdsourcing study design with multiple test conditions in order to discuss training effects in crowdsourcing studies. Thus, this work investigates whether it is possible to use a simple 'one shot' design, with only one rating from each of a large number of workers, instead of sophisticated (mixed or within-subject) test designs in crowdsourcing.


DOI: 10.21437/PQS.2016-26

Cite as: Seufert, M., Hoßfeld, T. (2016) One Shot Crowdtesting: Approaching the Extremes of Crowdsourced Subjective Quality Testing. Proc. 5th ISCA/DEGA Workshop on Perceptual Quality of Systems (PQS 2016), 122-126, DOI: 10.21437/PQS.2016-26.


@inproceedings{Seufert2016,
  author={Michael Seufert and Tobias Hoßfeld},
  title={One Shot Crowdtesting: Approaching the Extremes of Crowdsourced Subjective Quality Testing},
  year=2016,
  booktitle={Proc. 5th ISCA/DEGA Workshop on Perceptual Quality of Systems (PQS 2016)},
  pages={122--126},
  doi={10.21437/PQS.2016-26},
  url={http://dx.doi.org/10.21437/PQS.2016-26}
}